Sayre Posted July 22, 2011
Hey everyone. The issue I'm having is that I set up my website to force-download .avi files, but when I click a file that is large (200MB) it says it cannot be found. If I click an .avi file that's 50MB in size, it works. I've changed the php.ini max upload size to 500M and it still doesn't work. I've been searching and searching the web and cannot find a solution. I'm running the latest version of MAMP. Any suggestions would be greatly appreciated; if you need the code I can post it up this evening. Thank you!
dcro2 Posted July 22, 2011
I'm sorry, are you talking about upload or download? Are you uploading files to your website, or is your PHP script downloading a file?
Sayre (Author) Posted July 22, 2011
Sorry about that, this is for downloading files from my site. I have it set up so that when a user clicks a file link, it prompts a download window to save the file.
dcro2 Posted July 22, 2011
OK, so I'm assuming you wrote a PHP script that sets the header to force a download and then sends the file to the browser? Can you post your code?
dcro2 Posted July 22, 2011
It's probably because you're using readfile() and PHP is exceeding its memory limit. If you're using Apache with mod_xsendfile, or lighttpd, you can just set the X-Sendfile header and let the web server stream the file itself.
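Something like this, roughly, assuming the module is installed and enabled (the path and filename here are just placeholders for wherever your files live):

<?php
// Hand the file off to the web server instead of streaming it from PHP.
$filePath = '/absolute/path/to/files/video.avi';   // placeholder path
header('X-Sendfile: ' . $filePath);
header('Content-Type: video/x-msvideo');
header('Content-Disposition: attachment; filename="' . basename($filePath) . '"');
exit;
?>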
premiso Posted July 22, 2011
If you don't want to mess with Apache and enabling another module, take a look here: http://www.php.net/manual/en/function.readfile.php#88549

<?php
function readfile_chunked ($filename, $type = 'array') {
    $chunk_array = array();
    $chunksize = 1*(1024*1024); // how many bytes per chunk
    $buffer = '';
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        switch ($type) {
            case 'array':
                // Returns Lines Array like file()
                $lines[] = fgets($handle, $chunksize);
                break;
            case 'string':
                // Returns Lines String like file_get_contents()
                $lines = fread($handle, $chunksize);
                break;
        }
    }
    fclose($handle);
    return $lines;
}
?>

Reading it out in chunks should solve your memory issues. I don't know how this holds up performance-wise, so you'll just have to test that part.
Sayre (Author) Posted July 22, 2011
Thanks dcro2, I don't quite understand what that is, but I will research it. This is the PHP script I am using:

<?php
// this is a relative path from this file to the
// directory where the download files are stored.
$path = 'files';

// first, we'll build an array of files that are legal to download
chdir($path);
$files = glob('*.*');

// next we'll build an array of commonly used content types
$mime_types = array();
$mime_types['ai']   = 'application/postscript';
$mime_types['asx']  = 'video/x-ms-asf';
$mime_types['au']   = 'audio/basic';
$mime_types['avi']  = 'video/x-msvideo';
$mime_types['bmp']  = 'image/bmp';
$mime_types['css']  = 'text/css';
$mime_types['doc']  = 'application/msword';
$mime_types['eps']  = 'application/postscript';
$mime_types['exe']  = 'application/octet-stream';
$mime_types['gif']  = 'image/gif';
$mime_types['htm']  = 'text/html';
$mime_types['html'] = 'text/html';
$mime_types['ico']  = 'image/x-icon';
$mime_types['jpe']  = 'image/jpeg';
$mime_types['jpeg'] = 'image/jpeg';
$mime_types['jpg']  = 'image/jpeg';
$mime_types['js']   = 'application/x-javascript';
$mime_types['mid']  = 'audio/mid';
$mime_types['mov']  = 'video/quicktime';
$mime_types['mp3']  = 'audio/mpeg';
$mime_types['mpeg'] = 'video/mpeg';
$mime_types['mpg']  = 'video/mpeg';
$mime_types['pdf']  = 'application/pdf';
$mime_types['pps']  = 'application/vnd.ms-powerpoint';
$mime_types['ppt']  = 'application/vnd.ms-powerpoint';
$mime_types['ps']   = 'application/postscript';
$mime_types['pub']  = 'application/x-mspublisher';
$mime_types['qt']   = 'video/quicktime';
$mime_types['rtf']  = 'application/rtf';
$mime_types['svg']  = 'image/svg+xml';
$mime_types['swf']  = 'application/x-shockwave-flash';
$mime_types['tif']  = 'image/tiff';
$mime_types['tiff'] = 'image/tiff';
$mime_types['txt']  = 'text/plain';
$mime_types['wav']  = 'audio/x-wav';
$mime_types['wmf']  = 'application/x-msmetafile';
$mime_types['xls']  = 'application/vnd.ms-excel';
$mime_types['zip']  = 'application/zip';

// did we get a parameter telling us what file to download?
if (!$_GET['file']) {
    // if not, create an error message
    $error = 'No file specified to download';
} elseif (!in_array($_GET['file'], $files)) {
    // if the file requested is not in our array of legal
    // downloads, create an error for that
    $error = 'Requested file is not available';
} else {
    // otherwise, get the file name and its extension
    $file = $_GET['file'];
    $ext = strtolower(substr(strrchr($file, '.'), 1));
}

// did we get the extension and is it in our array of content types?
if ($ext && array_key_exists($ext, $mime_types)) {
    // if so, grab the content type
    $mime = $mime_types[$ext];
} else {
    // otherwise, create an error for that
    $error = $error ? $error : "Invalid MIME type";
}

// if we didn't get any errors above
if (!$error) {
    // if the file exists
    if (file_exists("$file")) {
        // and the file is readable
        if (is_readable("$file")) {
            // get the file size
            $size = filesize("$file");
            // open the file for reading
            if ($fp = @fopen("$file", 'r')) {
                // send the headers
                header("Content-type: $mime");
                // header("Content-Length: $size");
                header("Content-Disposition: attachment; filename=\"$file\"");
                // send the file content
                fpassthru($fp);
                // close the file
                fclose($fp);
                // and quit
                exit;
            }
        } else {
            // file is not readable
            $error = 'Cannot read file';
        }
    } else {
        // the file does not exist
        $error = 'File not found';
    }
}

// if all went well, the exit above will prevent anything below from showing
// otherwise, we'll display an error message we created above
?>
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<title>Download</title>
</head>
<body>
<h1>Download Failed</h1>
<?php if ($error) print "<p>The error message is: $error</p>\n"; ?>
</body>
</html>

Premiso, thank you for your response; I will take a look at this as well when I get home. Thanks for the replies, I appreciate the help big time!
xyph Posted July 22, 2011
@premiso I don't know what that function's doing in the manual, and correct me if I'm wrong, but line 19:

$lines = fread($handle, $chunksize);

seems to overwrite the previous content before anything is returned, so in theory the function will only ever return the last chunk. Even aside from that, the function still appears to use as much memory as readfile(): you're storing the results in a variable and then returning the variable once the entire file has been read. You'd be much better off echoing each chunk, then using ob_flush() and flush() to force the content out to the browser before moving on to the next chunk. Unless something is stopping the content from being pushed through, the memory should clear up. I've used this method before, with success.
dcro2 Posted July 22, 2011
I was actually trying a function like that I found, but on lighttpd it still seems to use a lot of memory if the file is big (probably because of php-fastcgi), and it never releases it. On Apache it seems to work fine though, so here:

function readfile_chunked ($filename) {
    $chunksize = 1*(1024*1024); // how many bytes per chunk
    $buffer = '';
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        print $buffer;
        ob_flush();
        flush();
    }
    return fclose($handle);
}

Try using that instead of fpassthru().
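For context, it would slot into your script roughly like this (just a sketch; $file, $mime and $size are the variables from your code above, and I've left Content-Length commented out the way you had it):

// in place of the fopen()/fpassthru() block in your script
header("Content-type: $mime");
// header("Content-Length: $size");
header("Content-Disposition: attachment; filename=\"$file\"");
readfile_chunked($file);   // streams the file out in 1MB chunks
exit;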
xyph Posted July 22, 2011
Yeah, it can use a lot of memory regardless. Compression methods can force an entire page to be buffered before any output is sent, overriding the flush() and ob_flush() calls. If there's a way to turn compression off for certain pages or directories, you may be able to avoid this without disabling it site-wide. PHP's zlib extension is also known to cause issues. It's summed up nicely in the user comments here: http://php.net/flush
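If it helps, these are the kinds of overrides those comments suggest trying at the top of the download script. No guarantees, since whether they work depends on how your server is configured:

<?php
// Ask Apache's mod_deflate not to gzip this response (only available under mod_php)
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');
}
// Turn off PHP's own zlib output compression before any output is sent
@ini_set('zlib.output_compression', 'Off');
?>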
Sayre (Author) Posted July 24, 2011
Hi guys, I appreciate all the help. I've tried adding the chunked-read PHP function you gave me, but it doesn't seem to change anything. I added it to my existing code above, yet it still downloads a file that comes out at less than a MB. I had added a link for you guys to look at and see what happens, but it looks like my post got deleted. Thanks anyway, I appreciate it. I will figure it out eventually!
Sayre (Author) Posted August 4, 2011
Sorry to resurrect this thread, but I am still trying to get this working. I have finally figured out and installed mod_xsendfile; however, now I've hit another roadblock. I have found this script to use:

header("X-Sendfile: $filePath");
header("Content-Disposition: attachment; file=$fileName");

I have the directory of files in the same directory as the script above (download.php). What I can't figure out now is what to put in the link to the file, and what I need to add to the script above (download.php) to get the correct file name from that link. I hope that was clear enough. Thanks again for your help.
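To be concrete, here's roughly what I'm imagining download.php would look like, reusing the whitelist idea from my earlier script. This is just my guess, I haven't got it working, so please correct anything that's off:

<?php
// download.php -- sketch only. Links would point here, e.g.:
// <a href="download.php?file=movie.avi">movie.avi</a>
$path = 'files';                    // folder with the downloadable files
chdir($path);
$files = glob('*.*');               // whitelist of files that may be downloaded

$file = isset($_GET['file']) ? basename($_GET['file']) : '';
if ($file === '' || !in_array($file, $files)) {
    header('HTTP/1.1 404 Not Found');
    exit('Requested file is not available');
}

header('X-Sendfile: ' . realpath($file));   // absolute path for mod_xsendfile
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
exit;
?>

Is that roughly the right idea?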