

This topic is now archived and is closed to further replies.


Question on files


I am using PHP to build a website related to biology. The site uses PHP to communicate with external programs (not written in PHP, but biological packages). These programs write their output to temporary files, which I then read using fread() or file_get_contents().
My problem is that these functions seem to have a limit on the size of file they can read. Is there another way to read big files (around 20-30 MB, for example)?
Please note that there is no other way for me to do this: these programs, by default, write their output to a text file, so I can't use PHP's system commands and read the output on the fly. The output file is created first, and then I read it into a string and parse it as needed.
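A minimal sketch of one workaround, assuming the output is parsed line by line: stream the file in fixed-size chunks with fread() instead of slurping it whole with file_get_contents(), so memory use stays flat regardless of file size. The file name 'demo_output.txt' and the sample contents are placeholders, not the poster's actual setup.

```php
<?php
// Create a small stand-in for the temp file the external program writes.
// "ACGT\n" is 5 bytes, repeated 4000 times = 20,000 bytes, 4000 lines.
file_put_contents('demo_output.txt', str_repeat("ACGT\n", 4000));

$handle = fopen('demo_output.txt', 'rb');
$bytes  = 0;
$lines  = 0;
while (!feof($handle)) {
    $chunk = fread($handle, 8192);        // 8 KB per read keeps memory flat
    if ($chunk === false) {
        break;
    }
    $bytes += strlen($chunk);
    $lines += substr_count($chunk, "\n"); // parse each chunk as it arrives
}
fclose($handle);
unlink('demo_output.txt');

echo "$bytes bytes, $lines lines\n";      // prints "20000 bytes, 4000 lines"
```

This only works if the parsing can be done incrementally; if the whole file really must sit in one string, raising the memory limit (as suggested below in the thread) is the alternative.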

Not 100% certain, as I haven't tested it, but you might look at these threads:

http://www.trap17.com/index.php/how-read-large-files-php_t28289.html

http://archives.postgresql.org/pgsql-sql/2003-06/msg00213.php
QUOTE:
"Generally you're gonna hit a limit with the max memory size set in the php.ini file, which defaults to 8 meg max process size before abort. You can crank this up so high your machine starts to swap out.

I'd guess PHP is likely limited to some number internally, but I've messed around with a hundred or so megs before."
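A short sketch of what the quoted advice looks like in code: raising PHP's per-script memory cap at runtime before reading the big file. The '128M' value is an arbitrary illustration chosen to comfortably exceed the 20-30 MB files from the question; the real php.ini default was 8M on older PHP versions, and some hosts disable runtime overrides.

```php
<?php
// Raise the per-script memory cap before slurping a large file.
// '128M' is an illustrative value, not a recommendation from the thread.
ini_set('memory_limit', '128M');

echo ini_get('memory_limit'), "\n";   // prints "128M"

// With the higher limit, file_get_contents() can hold a ~20-30 MB file
// in a single string, provided the limit also covers parsing overhead
// (string copies during parsing can multiply peak usage).
```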



