sammermpc Posted July 27, 2006

I've written a script that processes multiple large files one by one for insertion into a MySQL database. After processing a number of files, I get this error:

[code]Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 35 bytes)[/code]

Now, I know the typical solution to this is to change the PHP memory limit, but as you can see, the limit is already at 32 MB, and this script could potentially deal with directories whose contents reach into the gigabytes. The key here, though, is that each individual file is quite small (under a megabyte); no single file approaches 32 MB, so I think it is simply the sum of all of them that pushes it over the limit.

I've been looking into unset(), but it doesn't seem to be having much of an effect, though perhaps I am just using it in the wrong places.

Thanks for any suggestions!
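As a minimal sketch of the situation described above (assuming the host allows memory_limit to be changed at runtime; the file read here is just a stand-in for one of the data files):

```php
<?php
// Sketch: raise the 32 MB ceiling at runtime instead of editing php.ini.
// The 33554432 bytes in the error message is exactly 32 * 1024 * 1024.
ini_set('memory_limit', '64M');

// memory_get_usage() shows where allocation grows between files, which
// helps tell gradual accumulation apart from one oversized file.
$before = memory_get_usage();
$data   = file_get_contents(__FILE__); // stand-in for one data file
$after  = memory_get_usage();
printf("reading this file consumed roughly %d bytes\n", $after - $before);

unset($data); // only frees memory if nothing else still references the value
```

Note that unset() cannot release memory held by a value that is still referenced elsewhere (for example, an element copied into another array), which may be why it seems to have little effect here.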
kenrbnsn Posted July 27, 2006

Please post some of your code. There may be some optimizations that can be made that aren't apparent yet.

Ken
Caesar Posted July 27, 2006

Is PHP running in CGI mode, versus running as an Apache module? Is this a Windows server? There are also a number of things you can do in terms of optimization. A sample of your code would help.
sammermpc Posted July 28, 2006 (Author)

Thanks for the responses, gentlemen. I think I've hit on the problem: simply said, I spoke too soon. One or two of the files are as large as 4.7 MB, just a couple out of hundreds. I read their contents into a multi-dimensional array (which PHP does not have native support for), which balloons (I'm conjecturing here) the array's size to many times the file size. That is, a multi-dimensional array of a file's contents takes up many times more space than the file itself (which should be obvious, I suppose). Of course, PHP memory management is smart enough not to keep arrays around that have already been used. It was one massive array that bombed the whole thing.

Of course, the best way to do it would be to not read in the entire file at once. This is possible, and in fact I wrote a different script that did just that. As I am searching for certain minimum and maximum values within the files, I simply did so on a line-by-line basis.

I ran into some problems, though. First, the files are delimited by carriage returns, so I was forced to use stream_get_line(), which, as of PHP 5.0.4 at least, does not work as documented. So then I was forced to use fgetc(). If you can imagine, that was so slow that simply reading one file took over 30 seconds.

I'll post both versions of the code below. If anyone has suggestions, I'd be glad to hear them, though I have temporarily got over the hump by bumping memory_limit up to a whopping 64 MB.

Here is what I am currently using:

[code]function getfile_whole() {
    // Grab the entire file in a single string
    $file_str = stream_get_contents($this->fs);
    if (strlen($file_str) == 0)
        throw new Exception("$this->fname: filesize of zero");

    // Strip the trailing carriage return (to avoid a trailing empty element)
    $file_str = rtrim($file_str);

    // Replace all carriage returns with delimiting commas
    $file_str = strtr($file_str, "\r", ",");

    // Split on the delimiting commas into an array
    $full_file_arr = explode(",", $file_str);
    unset($file_str); // free memory
    if (sizeof($full_file_arr) == 0)
        throw new Exception("$this->fname: data parse failed");

    // Re-explode into a nested array
    $i = 0;
    $success = false;
    foreach ($full_file_arr as $row) {
        // Skip rows with no decipherable (printable) data
        if (eregi('[[:graph:]]', $row)) {
            $temp = explode(" ", $row);
            // Omit rows where the lat or long is zero
            if ($temp[1] != 0 && $temp[2] != 0) {
                $file_arr[$i] = $temp;
                $i++;
                $success = true; // ensure at least one element has been added
            }
        }
    }
}[/code]

Here is a variant of what I probably SHOULD be using (with some replacement to halt on carriage returns):

[code]function getfile_line() {
    $first = true; // allow for first-loop-only events
    while (!feof($this->fs)) {
        $line = ""; // define $line as empty

        // Retrieve a single line, one character at a time
        while (($c = fgetc($this->fs)) !== "\r" && $c !== false) {
            $line .= $c;
        }

        // Parse into an array
        $line = explode(" ", $line);

        // Check that the data line matches the accepted format
        if (sizeof($line) !== 7) {
            throw new Exception("File $this->fname contains unknown data.");
        }

        // Assign starting values on the first iteration
        if ($first) {
            // Match exploded array values to their appropriate variables
            $start_time  = $line[0];
            $finish_time = $line[0];
            $start_lat   = $line[1];
            $finish_lat  = $line[1];
            $start_long  = $line[2];
            $finish_long = $line[2];
            $first = false;
        }

        // Track the minimum and maximum values
        if ($line[0] < $start_time)  { $start_time  = $line[0]; }
        if ($line[0] > $finish_time) { $finish_time = $line[0]; }
        if ($line[1] < $start_lat)   { $start_lat   = $line[1]; }
        if ($line[1] > $finish_lat)  { $finish_lat  = $line[1]; }
        if ($line[2] < $start_long)  { $start_long  = $line[2]; }
        if ($line[2] > $finish_long) { $finish_long = $line[2]; }
    }
}[/code]

Glad to satisfy anyone's curiosity.

Whoops, didn't even answer the questions: Apache server/Unix.
kenrbnsn Posted July 28, 2006 Share Posted July 28, 2006 What do you mean by this:[quote]a multi-dimensional array (which php does not have native support for)[/quote]PHP can do multidementional arrays fine. I use the all the time.Ken Quote Link to comment https://forums.phpfreaks.com/topic/15842-fatal-error-allowed-memory-sizeexhausted/#findComment-64917 Share on other sites More sharing options...
sammermpc Posted July 28, 2006 (Author)

Apologies, you are absolutely right: PHP does support multidimensional arrays (I use them too). I think they use somewhat more memory than regular arrays, though I am not sure, so they could possibly make the memory issue worse. I don't know what it means or whether it is significant; I just read it off of this bug report (the only reference I could find), [url=http://bugs.php.net/bug.php?id=13598]http://bugs.php.net/bug.php?id=13598[/url], though it may be quite outdated.
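To put a rough number on that overhead, here is a small sketch (hypothetical sample data; exact figures vary by PHP version and build) comparing a raw string against the same data parsed into a nested array:

```php
<?php
// Each array element carries per-value and hash-table bucket overhead,
// so a parsed nested array can take several times the on-disk size of
// the file it came from.
$raw = str_repeat("123 45.6 78.9\r", 1000); // ~14 KB of fake rows

$before = memory_get_usage();
$rows = array();
foreach (explode("\r", rtrim($raw, "\r")) as $line) {
    $rows[] = explode(' ', $line); // one sub-array of strings per row
}
$after = memory_get_usage();

printf("raw: %d bytes, as nested array: roughly %d bytes (%.1fx)\n",
       strlen($raw), $after - $before,
       ($after - $before) / strlen($raw));
```

The multiplier this prints explains how a 4.7 MB file could plausibly push a script toward a 32 MB limit once fully parsed.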