nik_jain Posted August 17, 2014 (edited)

In the following code the memory usage balloons up to 500MB:

```php
while ($priceRow = $pricestimes->fetch_assoc()) {
    // if we've moved on to another product
    if ($last_product_id != $priceRow['product_id']) {
        // flush every 100 products
        if ($count % 100 == 0) {
            echo 'MEM AFTER 100 products- ' . (memory_get_usage(true) / 1024) . PHP_EOL; // KB
            //add_price_array_to_db($pricesArr);
            $pricesArr = array();
        }
        $count++;
        $last_product_id = $priceRow['product_id'];
    }
    $pricesArr[$priceRow['product_id']][$priceRow['insertion_timestamp']] = $priceRow['price'];
}
```

The 'MEM AFTER 100 products-' echo shows that the script's memory usage keeps increasing. Any ideas why there is a memory leak here? $pricestimes is a huge mysqli result set with 23 million rows, so I suspect that may be the issue, unless there is something wrong with the script logic.

By the way, add_price_array_to_db($pricesArr) is uncommented in the real code, but the memory leak is there even with that line commented out.

Thanks

Edited August 17, 2014 by nik_jain
kicken Posted August 17, 2014

You could add a call to gc_collect_cycles after resetting your $pricesArr variable. Unless you're running into problems because of the memory usage, though, I wouldn't worry about it that much. Processing a huge result set is going to require some memory. PHP does not always run garbage collection immediately; it waits until it deems it necessary.
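For illustration, a minimal sketch of that suggestion (reusing the flush step from the OP's loop; add_price_array_to_db is the OP's own function):

```php
if ($count % 100 == 0) {
    add_price_array_to_db($pricesArr);
    $pricesArr = array();    // drop references to the flushed data
    gc_collect_cycles();     // force a collection run now instead of waiting
                             // for PHP to decide one is necessary
}
```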
CroNiX Posted August 17, 2014

You can also try processing in batches of 1000 records at a time instead of all at once, and free the result after each batch.
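A rough sketch of that approach (the table name and SELECT here are guesses based on the OP's code; substitute whatever query originally produced $pricestimes):

```php
$batchSize = 1000;
$offset    = 0;

do {
    // Hypothetical query; use the real one that built $pricestimes.
    $result = $mysqli->query(
        "SELECT product_id, insertion_timestamp, price
         FROM prices
         ORDER BY product_id, insertion_timestamp
         LIMIT $batchSize OFFSET $offset"
    );

    $fetched = 0;
    while ($row = $result->fetch_assoc()) {
        // ... process $row as in the original loop ...
        $fetched++;
    }

    $result->free();        // release this batch's buffer before the next query
    $offset += $batchSize;
} while ($fetched == $batchSize);
```

One caveat: with 23 million rows, large OFFSET values get slow; keyset pagination (a WHERE clause on the last seen key instead of OFFSET) would scale better.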
mogosselin Posted August 17, 2014

Why do you need to process so many rows at the same time? To me it looks like the wrong solution, and it doesn't seem to scale very well. Do you need to process the rows one by one for a report or something like that?
Solution nik_jain Posted August 18, 2014 (edited)

I FIGURED IT OUT!! I stumbled upon this article: http://php.net/manual/en/mysqlinfo.concepts.buffering.php , which was exactly what I needed.

Basically, when the $pricestimes result set is created, I had to make sure to use MYSQLI_USE_RESULT (unbuffered mode in mysqli). This makes fetch_assoc() NOT keep the full result set in memory and keeps the memory use down to 50MB or so.

The downside is that you cannot data_seek() or get a count of the rows. But as I didn't require either, this solution is perfect!

Edited August 18, 2014 by nik_jain
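For reference, a minimal sketch of unbuffered mode (the connection details and SELECT are placeholders; MYSQLI_USE_RESULT is the second argument to mysqli::query):

```php
$mysqli = new mysqli('localhost', 'user', 'pass', 'db'); // hypothetical credentials

// MYSQLI_USE_RESULT streams rows from the server one at a time instead of
// buffering the entire 23-million-row result set in PHP's memory.
$pricestimes = $mysqli->query(
    "SELECT product_id, insertion_timestamp, price FROM prices", // placeholder query
    MYSQLI_USE_RESULT
);

while ($priceRow = $pricestimes->fetch_assoc()) {
    // ... the original processing loop ...
}

$pricestimes->free(); // the connection can't run another query until this is freed
```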