garethhall Posted March 5, 2010

Hi guys, I have a folder full of files (large ones, 100 MB and up). All the relevant file details are stored in a DB, so when the user clicks the download button it downloads the relevant file. However, I can't seem to download files in rapid succession: I have to wait for the first download to finish before I can start the next. I don't like this; I think someone should be able to download more than one file at a time.

I have tried everything I can think of to find a workaround to make this possible. I have now written five download scripts, all using different methods to get the file. I have tried PHP FTP, fpassthru(), print(fread()), an AJAX call to the download scripts, calling the download script directly (with href), downloading through an iframe, and lastly using JavaScript to dynamically create an iframe on the fly and passing the variables to that frame. All of the mentioned methods work, but still only one at a time: as soon as you click the second link it just times out or does nothing.

For the reason mentioned above I can't post all my code, as it would not really make sense on its own, but I will include one of the download scripts. Does anyone know the solution? PLEASE HELP

<?php
require_once("../includes/conn.php");
require_once("../includes/shared.php");

$folder = comp($_GET['compID'], 'compFolder');

/*** Get file information from DB ***/
function theFile($fID, $col){
    $sel = "SELECT fileName, fileOrigName, fileSize FROM files WHERE fileID = ".cv($fID)." LIMIT 1";
    $rs  = mysql_query($sel);
    $rw  = mysql_fetch_assoc($rs);
    mysql_free_result($rs); // release the result set before returning the requested column
    return $rw[$col];
}

$fname = basename(theFile($_GET['fileID'], 'fileName'));
//$filename = $folder.'/'.$fname;

send_file(comp($_GET['compID'], 'compFolder'),
          theFile($_GET['fileID'], 'fileName'),
          theFile($_GET['fileID'], 'fileSize'));

function send_file($path, $file, $size){
    # Make sure the file exists before sending headers
    #-------------------------------------------------
    $mainpath = "../ql_uploads/".$path.'/'.$file;
    if(!$fdl = fopen($mainpath, 'rb')){
        print "<p><center>ERROR - Invalid Request (Downloadable file Missing or Unreadable)</center><br><br>".$mainpath;
        die;
    }else{
        set_time_limit(0);
        # Send the headers then send the file
        #------------------------------------
        header("Cache-Control: ");  # leave blank to avoid IE errors
        header("Pragma: ");         # leave blank to avoid IE errors
        header("Content-type: application/octet-stream");
        header("Content-Disposition: attachment; filename=\"".$file."\"");
        header("Content-length: ".(string)$size);
        sleep(1);
        fpassthru($fdl);
    }
    return;
}
?>
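To give an idea of one of the attempts, the dynamic-iframe version boils down to something like this (simplified sketch only; download.php and downloadFile are placeholder names, but compID and fileID are the real parameters the script above reads):

function downloadFile(compID, fileID){
    // create a hidden iframe and point it at the download script,
    // so the browser starts the download without leaving the page
    var frame = document.createElement('iframe');
    frame.style.display = 'none';
    frame.src = 'download.php?compID=' + compID + '&fileID=' + fileID;
    document.body.appendChild(frame);
}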
teamatomic Posted March 5, 2010

The only way I know of is that when a client starts a download, a new window needs to be used to download another file simultaneously. So if you start the download and give them a link to download another file that opens in a new window, you should be able to do it. After that, I would think the limit would be the number of connections allowed by the server.

HTH
teamatomic
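Something along these lines, just as a sketch (the table and column names come from the query in your script, and download.php is an assumed name for it); the target="_blank" part is what matters:

<?php
// List every file as a link that opens its download in a new window/tab.
$rs = mysql_query("SELECT fileID, fileOrigName FROM files");
while($rw = mysql_fetch_assoc($rs)){
    echo '<a href="download.php?compID='.(int)$_GET['compID']
        .'&fileID='.(int)$rw['fileID'].'" target="_blank">'
        .htmlspecialchars($rw['fileOrigName']).'</a><br>';
}
mysql_free_result($rs);
?>

While one download is running, clicking another such link should open a second window, so the server sees the two downloads as separate requests.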
garethhall Posted March 5, 2010 Author

My scripts already work in that fashion too, and no, it doesn't work for me.
teamatomic Posted March 6, 2010

Are you running mod_bandwidth or using iptables?

HTH
teamatomic
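For example, a per-IP connection limit like the following iptables rule (purely an illustration, not something taken from your setup) could cause exactly what you describe: while one download connection is open, a second connection attempt from the same IP is silently dropped and the browser eventually times out.

iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 1 -j DROP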
Archived
This topic is now archived and is closed to further replies.