

This topic is now archived and is closed to further replies.


Fatal Error: Allowed memory size...exhausted

Recommended Posts

I've written a script that processes multiple large files one by one for insertion into a MySQL database.  After  processing a number of files, I get this error:

[code]Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 35 bytes)[/code]

Now, I know the typical solution to this is to change the PHP memory limit, but as you can see, the limit is already at 32MB, and this script could potentially deal with directories whose contents reach into the gigabytes.  The key here, though, is that each individual file is quite small (under a megabyte).  I think it is simply the accumulation of all of them that pushes it over the limit, as no single file approaches 32MB.

I've been looking into unset(), but it doesn't seem to be having much of an effect, though perhaps I am just using it in the wrong places.

Thanks for any suggestions!

Please post some of your code. There may be some optimizations that aren't apparent that can be made.


Is PHP running in CGI mode, versus running as an Apache module? Is this a Windows server? There are also a number of things you can do in terms of optimization. A sample of your code would help.

Thanks for the responses, gentlemen:

I think I've hit on the problem; simply said, I spoke too soon.  One or two of the files are as large as 4.7MB, just a couple out of hundreds.  I read their contents into a multi-dimensional array (which php does not have native support for), which balloons (I'm conjecturing here) the array's size to many times the filesize.

That is, a multi-dimensional array of a file's contents takes up many times more space than the file itself (should be obvious I suppose).

Of course, PHP memory management is smart enough not to keep arrays around that have already been used.  It was one massive array that bombed the whole thing.
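To make the overhead concrete, here is a rough sketch (not the script from this thread; the sample data and sizes are made up) that compares the footprint of a raw string against the nested array built from it, and shows unset() reclaiming the memory:

```php
<?php
// Rough sketch: compare the memory footprint of a raw string against the
// nested array built from it, then hand the memory back with unset().
$base = memory_get_usage();
$str = str_repeat("1 2 3 4 5 6 7\r", 10000);    // roughly 140 KB of raw text
$after_str = memory_get_usage();

$nested = array();
foreach (explode("\r", rtrim($str)) as $row) {
    $nested[] = explode(" ", $row);             // one sub-array per line
}
$after_arr = memory_get_usage();

printf("string: %d bytes, nested array: %d bytes\n",
       $after_str - $base, $after_arr - $after_str);

unset($nested);  // releases the array's memory back to PHP
printf("after unset: %d bytes above the string alone\n",
       memory_get_usage() - $after_str);
```

Each PHP array element carries bookkeeping on top of the data itself, so the nested array comes out many times larger than the 140 KB string it was built from, which matches the ballooning described above.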

Of course, the best way to do it would be to not read in the entire file at once.  This is possible, and in fact I wrote a different script that did just that.  As I am searching for certain minimum and maximum values within the files, I simply did so on a line-by-line basis.

I ran into some problems.  First, the file is delimited by carriage returns, so I was forced to use stream_get_line(), which, as of PHP 5.0.4 at least, does not work as documented.  So then I was forced to use fgetc().  If you can imagine, that took so long that simply reading one file took over 30 seconds.
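For what it's worth, one way around both stream_get_line() and per-character fgetc() is to read the file in larger chunks and split on "\r" manually.  This is only a sketch under my own assumptions (the function name and chunk size are made up), not the script from this thread:

```php
<?php
// Sketch: iterate over the "\r"-delimited lines of a file without reading
// the whole thing into memory, and without one fgetc() call per byte.
function for_each_cr_line($path, $callback) {
    $fs = fopen($path, "rb");
    $buf = "";
    while (!feof($fs)) {
        $buf .= fread($fs, 8192);        // one read per 8 KB, not per byte
        $parts = explode("\r", $buf);
        $buf = array_pop($parts);        // keep the unfinished tail for later
        foreach ($parts as $line) {
            call_user_func($callback, $line);
        }
    }
    if ($buf !== "") {                   // last line with no trailing \r
        call_user_func($callback, $buf);
    }
    fclose($fs);
}
```

Memory stays bounded by one chunk plus the current line, so the min/max scan could run inside the callback instead of building any big array.  (On PHP 5.3+ the callback can be a closure; on the 5.0.x in this thread it would have to be a named function.)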

I'll post the entirety of the code below (both options).  If anyone has suggestions, I'd be glad to hear them, though I have temporarily got over the hump by bumping the memory_limit up to a whopping 64MB.
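As an aside, the bump doesn't have to live in php.ini; a per-script override works where the host permits it.  A minimal sketch:

```php
<?php
// Raise memory_limit for this script only; this works only if the host
// has not locked the setting down (e.g. via a Suhosin/hard limit).
$old = ini_get('memory_limit');
ini_set('memory_limit', '64M');
echo "memory_limit: $old -> " . ini_get('memory_limit') . "\n";
```

The change lasts only for the current request, so other scripts on the server keep the stricter default.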

Here is what I am currently using:
[code]function getfile_whole() {
    //Grab the entire file in a single string
    $file_str = stream_get_contents($this->fs);
    if(strlen($file_str) == 0)
        throw new Exception("$this->fname: filesize of zero");
    //Strip the trailing carriage return (to avoid a trailing empty element)
    $file_str = rtrim($file_str);
    //Split on the carriage returns into an array of rows
    $full_file_arr = explode("\r", $file_str);
    unset($file_str);  //free memory
    if(sizeof($full_file_arr) == 0)
        throw new Exception("$this->fname: data parse failed");

    //re-explode into a nested array
    $i = 0;
    $success = false;
    $file_arr = array();
    foreach($full_file_arr as $row) {
        //Omit rows containing undecipherable (non-printable) data
        if(!eregi('[^[:graph:] ]', $row)) {
            $temp = explode(" ", $row);
            //Omit rows where the lat or long is zero
            if($temp[1] != 0 && $temp[2] != 0) {
                $file_arr[$i++] = $temp;
                $success = true;  //at least one element has been added
            }
        }
    }
    if(!$success)
        throw new Exception("$this->fname: no usable rows");
    return $file_arr;
}[/code]
Here is a variant of what I probably SHOULD be using (with some replacement to halt on carriage returns):
[code]function getfile_line() {
    $first = true;                //allow for first-loop-only events
    while(!feof($this->fs)) {
        $line = "";               //define $line as empty
        //retrieve a single line, character by character
        while(($c = fgetc($this->fs)) !== "\r" && $c !== false) {
            $line .= $c;
        }
        if($line === "")          //skip blank and trailing lines
            continue;
        //Parse into an array
        $line = explode(" ", $line);
        //Check to make sure data line matches accepted format
        if(sizeof($line) !== 7) {
            throw new Exception("File $this->fname contains unknown data.");
        }
        //Assign starting values to variables
        if($first) {
            //Match exploded array values to their appropriate variables
            $start_time = $line[0];
            $finish_time = $line[0];
            $start_lat = $line[1];
            $finish_lat = $line[1];
            $start_long = $line[2];
            $finish_long = $line[2];
            $first = false;
        }
        //Find the minimum and maximum variable values
        if($line[0] < $start_time) { $start_time = $line[0]; }
        if($line[0] > $finish_time) { $finish_time = $line[0]; }
        if($line[1] < $start_lat) { $start_lat = $line[1]; }
        if($line[1] > $finish_lat) { $finish_lat = $line[1]; }
        if($line[2] < $start_long) { $start_long = $line[2]; }
        if($line[2] > $finish_long) { $finish_long = $line[2]; }
    }
    return array($start_time, $finish_time, $start_lat,
                 $finish_lat, $start_long, $finish_long);
}[/code]


Glad to satisfy anyone's curiosity.

Whoops, didn't even answer the questions: Apache server on Unix.

What do you mean by this:[quote]a multi-dimensional array (which php does not have native support for)[/quote]
PHP can do multidimensional arrays fine. I use them all the time.


Apologies, you are absolutely right, PHP does support multidimensional arrays (I too use them). 

I think they use somewhat more memory than regular arrays, though I am not sure, so they could possibly make the memory issue worse.  I don't know what it means or whether it is significant; I just read it off of this bug report (the only reference I could find) [url=http://bugs.php.net/bug.php?id=13598]http://bugs.php.net/bug.php?id=13598[/url], though it may be quite outdated.


