altgamer Posted December 11, 2007

Hi, I run a site that hosts gaming files larger than 200MB (mostly UT3 patches and the like), and for the past few days I've had an oddball issue I haven't been able to figure out. I recently put in a custom PHP download system that checks whether the user is allowed to download the file, since so many people were hotlinking. If the check passes, it starts the download while keeping the actual file location hidden.

The problem is that every file now downloads at exactly 150KB/sec. Before I used PHP, downloads would reach well over 10Mbps instead of 1.5Mbps. So, first of all: does PHP impose some kind of throttling that forces 150KB/sec? I know it's not the server or my home line, because I can load up four different browsers and they all sit rock steady at 149-150KB/sec.

The other issue is that the download always halts at around 43.1MB downloaded, or very close to it. Firefox displays a "filename.zip could not be saved, because the source file could not be read. Try again later, or contact the server administrator." error; IE just says the file can't be read, and so on.

The script I made just takes the path to the file, checks its size, sends a few headers, and then calls readfile() on it. (I have also tried modifying it using other approaches, such as http://www.phpfreaks.com/forums/index.php/topic,108464.0.html, but the issue remains.)

```php
$file = "C:/path/to/files/filename.zip";
$fname = "File.zip";
set_time_limit(0);
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Content-Type: application/force-download");
header("Content-Disposition: attachment; filename=\"$fname\"");
header("Accept-Ranges: bytes");
header("Content-Length: ".filesize($file));
header("Content-Description: File Transfer");
@readfile($file);
```

I have also tried header("Content-Type: application/zip"), x-zip, x-zip-compressed, binary transfer encoding, etc.

Does anybody know why this is occurring, and how I can fix it? Or does anybody know of a very simple download script? I can code all of the verification myself; I'm just trying to figure out how to host zip files and use PHP to send them without a speed throttle. I don't understand why these two issues are occurring; I see other sites doing it easily all the time. :-/

I'm using PHP 5.2.3 on a dedicated server running Windows Server 2003 R2. I've configured PHP the best way I know how, including raising the timeout and size limits, and I've made double sure nothing in IIS is configured for throttling or timing out either. There are no other issues I've had, or can tell of, that could cause this.

So, if anybody can shed some light on this, it would be appreciated and a great help. Thanks.

-alt
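For reference, here's a quick sketch of how I've been checking the PHP settings that seemed most likely to matter (just a diagnostic dump; I'm not sure any of these are the actual culprit):

```php
<?php
// Sketch: dump the ini settings most often involved in slow or
// aborted PHP downloads. All of these are standard PHP directives.
$settings = array(
    'output_buffering',         // buffers script output before sending
    'zlib.output_compression',  // compresses (and buffers) output
    'memory_limit',             // per-script memory cap
    'max_execution_time',       // per-script time cap
);
foreach ($settings as $name) {
    echo $name . ' = ' . var_export(ini_get($name), true) . "\n";
}
?>
```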
MadTechie Posted December 11, 2007

1. Welcome to the board. The problem with readfile() is that it reads the whole file into memory, which can be a problem with large files.

2. Try this updated code:

```php
<?php
$file = "C:/path/to/files/filename.zip";
$fname = "File.zip";
set_time_limit(0);

header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
#header("Content-Type: application/force-download"); // removed
#header("Content-Type: application/x-download");     // try this one
header('Content-Type: application/octet-stream');    // I used this
header("Content-Transfer-Encoding: binary");         // added
header("Content-Disposition: attachment; filename=\"$fname\"");
header("Accept-Ranges: bytes");
header("Content-Length: ".filesize($file));
header("Content-Description: File Transfer");

//@readfile($file);
readfile_chunked($file); // updated

// added: stream the file in fixed-size chunks instead of loading it all at once
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk (1MB)
    $cnt = 0;

    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered, like readfile() does
    }
    return $status;
}
?>
```
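One thing worth trying alongside the chunked function (a sketch, not something I've tested on your setup): clear any active output buffers before you start streaming. If a buffer is active, PHP can still accumulate the whole file in memory, and ob_flush() raises a notice when there is no buffer to flush.

```php
<?php
// Sketch: run this before sending headers / streaming the file.
// Closes any active output buffers so chunks go straight to the client.
while (ob_get_level() > 0) {
    ob_end_clean();
}
// If zlib compression is on, PHP buffers output in order to compress it;
// turning it off is only possible before any output has been sent.
@ini_set('zlib.output_compression', 'Off');
?>
```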
altgamer Posted December 11, 2007 (Author)

Thanks for the reply. I tried the code, including some other alterations, and I've kept looking around here and on other sites for answers, but so far the script keeps doing the same thing. It's as if there's some type of limit on PHP or something.

I can't figure out why the downloads are so slow (150KB/s) compared to downloading the files directly, and I'm wondering if that has something to do with why this keeps happening. The files still error out around 40-45MB, 42MB on average. ???

The server has 4GB of ECC RAM with about 3GB free at all times, so I doubt it's running out. IIS and everything else is set correctly as far as I can tell. This is one of the first real problems I have ever come across, and the icing on the cake is that it's just flat-out odd.

-alt
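One thing worth noting: PHP's memory_limit is separate from the server's physical RAM, so free system memory doesn't rule out the script itself hitting a cap. Here's a rough sketch (not my exact script) of how the chunked loop could log its own memory use to see whether something is accumulating the file:

```php
<?php
// Sketch: log PHP's memory use once per chunk to see whether something
// (e.g. an output buffer) is holding the file in memory as it streams.
// error_log() writes to the log destination configured in php.ini.
$handle = fopen("C:/path/to/files/filename.zip", 'rb');
$sent = 0;
while (!feof($handle)) {
    $buffer = fread($handle, 1024 * 1024);
    echo $buffer;
    flush();
    $sent += strlen($buffer);
    error_log(sprintf("sent %d MB, php memory %d MB",
        $sent / 1048576, memory_get_usage() / 1048576));
}
fclose($handle);
?>
```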