
Memory use keeps increasing in loop


nik_jain


In the following code, the memory usage balloons up to 500MB:

    while ($priceRow = $pricestimes->fetch_assoc()) {

        // a new product starts here
        if ($last_product_id != $priceRow['product_id']) {
            // every 100 products, flush the collected prices
            if ($count % 100 == 0) {
                echo 'MEM AFTER 100 products- ' . (memory_get_usage(true) / 1024) . PHP_EOL; // value is in KB
                //add_price_array_to_db($pricesArr);
                $pricesArr = array();
            }
            $count++;
            $last_product_id = $priceRow['product_id'];
        }
        $pricesArr[$priceRow['product_id']][$priceRow['insertion_timestamp']] = $priceRow['price'];
    }

The 'MEM AFTER 100 products-' echo line shows that the script's memory usage keeps increasing.

Any ideas why there is a memory leak here? $pricestimes is a huge mysqli result set with 23 million rows, so I suspect that may be the issue, unless there is something wrong with the script logic.

 

By the way, add_price_array_to_db($pricesArr) is uncommented in the real code, but even with that line commented out the memory growth is there.

 

Thanks


You could add a call to gc_collect_cycles() after resetting your $pricesArr variable. Unless you're running into problems because of the memory usage, though, I wouldn't worry about it that much. Processing a huge result set is going to require some memory, and PHP does not always run garbage collection immediately; it waits until it deems it necessary.
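
For example, a minimal sketch of where such a call could go, assuming the same $count/$pricesArr batching as the loop above (add_price_array_to_db() is the poster's own function):

    // inside the loop, at the point where a batch of 100 products is flushed
    if ($count % 100 == 0) {
        add_price_array_to_db($pricesArr);  // write the batch out
        $pricesArr = array();               // drop references to the batch
        gc_collect_cycles();                // ask PHP to run cycle collection right away
        echo 'MEM: ' . (memory_get_usage(true) / 1024) . ' KB' . PHP_EOL;
    }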

I FIGURED IT OUT!!

 

I stumbled upon this article: http://php.net/manual/en/mysqlinfo.concepts.buffering.php, which was exactly what I needed.

 

Basically, when the $pricestimes result set is created, I had to make sure to use MYSQLI_USE_RESULT (unbuffered mode in mysqli). This makes fetch_assoc() NOT keep the whole result set in memory and keeps the memory use down to 50MB or so.
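
For reference, a minimal sketch of what that change looks like, assuming $mysqli is the existing connection and $sql is the query that produces $pricestimes (both names are placeholders):

    // pass MYSQLI_USE_RESULT as the second argument to get an unbuffered result set
    $pricestimes = $mysqli->query($sql, MYSQLI_USE_RESULT);

    while ($priceRow = $pricestimes->fetch_assoc()) {
        // rows are streamed from the server one at a time instead of being
        // buffered in PHP's memory up front
    }

    // the result must be fully read and freed before another query is run
    // on the same connection
    $pricestimes->free();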

 

The downside of this is that you cannot use data_seek() or get a count of the rows on the result. But as I didn't need either, this solution is perfect!
