
Fopen + Fgets + File Sizes ???


rckehoe


I have a script that transfers a single file from a remote server to my local server and saves it. The files can be quite large. I have successfully backed up around 200 MB, but anything over that seems to fail with a strange error. I was hoping for a little guidance; here is my code and the error/warning that I get:

 

PHP Code:

function remote_capture($tmp_url, $filename)
{
    // Open the remote file for binary reading and the local copy for binary writing
    $r_handle = fopen($tmp_url, "rb");
    $d_handle = fopen($filename, 'wb');

    if($r_handle && $d_handle)
    {
        // Copy the stream line by line until EOF
        while(($buffer = fgets($r_handle)) !== false)
        {
            fputs($d_handle, $buffer);
        }

        fclose($r_handle);
        fclose($d_handle);

        return true;
    }
    else
    {
        return false;
    }
}

 

Warning Message:

Warning: file_get_contents(URL_OF_MY_SCRIPT_HAS_BEEN_REMOVED) [function.file-get-contents]: failed to open stream: HTTP request failed! in /path/on/my/local/server/to/script on line 298


Hi rckehoe,

If this is only happening with large files, it could be a timeout issue: file_get_contents may not have time to read the whole file before the timeout limit is exceeded. You could try increasing the timeout value in your php.ini file, if you have access to it.
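One way to do that at runtime rather than editing php.ini (a sketch, assuming the limit being hit is default_socket_timeout, which governs reads on HTTP streams opened with fopen() and file_get_contents(); the 600-second value is an arbitrary choice):

// Raise the socket read timeout for the duration of the script
ini_set('default_socket_timeout', 600); // seconds; adjust to the expected transfer time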

 

You could also check if it is a timeout error by only reading a small amount of the file. I'm not sure if this will work with binary data though.

 

$url = 'http://somesite.com/myfile.txt';
$file = fopen($url, "r");
$contents = fgets($file, 100000); // limits the read to roughly 100 KB
fclose($file);

 

If that gets part of the file then I reckon it is a timeout error.

 

Cheers,

Fergal

I appreciate the suggestion, but I am already using set_time_limit(0); and ini_set('memory_limit', '-1');. I should have mentioned that. My script works fantastically for long periods of time and downloads large files. The problem is that I need to download even larger files, hopefully in the 1 GB range at least.
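For reference, those two calls only lift the execution-time and memory caps; the socket read timeout mentioned above is a separate setting (a sketch, with the 600-second value being an assumption):

// Already in use per the post above
set_time_limit(0);                      // no script execution time limit
ini_set('memory_limit', '-1');          // no memory limit
// Separate knob, not covered by the two calls above
ini_set('default_socket_timeout', 600); // read timeout on the HTTP stream, in seconds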

 

I tried your code to get a portion of the file and it did not work; I still got the same error. Any thoughts?
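For anyone hitting the same wall: fgets reads up to the next newline, so on a large binary file a single call can try to buffer a very long "line". Below is an untested sketch of the same function using fixed-size fread chunks and a stream-context timeout; the 8 KB chunk size and 600-second timeout are assumed values, not from the thread:

function remote_capture_chunked($tmp_url, $filename)
{
    // Give the HTTP stream its own read timeout, independent of set_time_limit()
    $context = stream_context_create(array('http' => array('timeout' => 600)));

    $r_handle = fopen($tmp_url, 'rb', false, $context);
    $d_handle = fopen($filename, 'wb');

    if(!$r_handle || !$d_handle)
    {
        return false;
    }

    // Copy in fixed 8 KB chunks so memory use stays flat regardless of file size
    while(!feof($r_handle))
    {
        $buffer = fread($r_handle, 8192);
        if($buffer === false)
        {
            break;
        }
        fputs($d_handle, $buffer);
    }

    fclose($r_handle);
    fclose($d_handle);

    return true;
}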

Archived

This topic is now archived and is closed to further replies.
