
How to reduce virtual memory by optimising my PHP code?


iCeR


My current code (see below) uses 147MB of virtual memory! My provider allocates 100MB by default, so the process is killed as soon as it runs, causing an internal server error. The code uses curl_multi and must be able to loop through more than 150 iterations while keeping virtual memory to a minimum. The code below is capped at 150 iterations and still triggers the internal server error; at 90 iterations the issue does not occur.

 

How can I adjust my code to lower its resource/virtual memory usage while still keeping the requests to the URL and the retrieval of the results as fast as possible?

 

Alternatively, is there an example of how to make overlapping HTTPS requests (rather than the approach below) and process the results as they arrive, perhaps in a language that supports threads?

 

Thanks!

 

<?php

    // Formats a timestamp like date(), but substitutes milliseconds for the 'u' placeholder
    function udate($format, $utimestamp = null) {
      if ($utimestamp === null)
        $utimestamp = microtime(true);
      $timestamp = floor($utimestamp);
      $milliseconds = round(($utimestamp - $timestamp) * 1000);
      return date(preg_replace('`(?<!\\\\)u`', $milliseconds, $format), $timestamp);
    }

$url = 'https://www.testdomain.com/';
$curl_arr = array();
$master = curl_multi_init();

for($i=0; $i<150; $i++)
{
    $curl_arr[$i] = curl_init();
    curl_setopt($curl_arr[$i], CURLOPT_URL, $url);
    curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl_arr[$i], CURLOPT_SSL_VERIFYHOST, FALSE);
    curl_setopt($curl_arr[$i], CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_multi_add_handle($master, $curl_arr[$i]);
}

do {
    // keep driving all 150 transfers until every handle has finished
    curl_multi_exec($master,$running);
} while($running > 0);

for($i=0; $i<150; $i++)
{
    $results = curl_multi_getcontent ($curl_arr[$i]);
    $results = explode("<br>", $results);
      echo $results[0];
      echo "<br>";
      echo $results[1];
      echo "<br>";
      echo udate('H:i:s:u');
      echo "<br><br>";
      usleep(100000);
}

?>

 

 


I tried this out.

 

You never close the curl connection.

It's requesting the exact same URL over and over and displaying the entire page numerous times.

Whatever your reason for doing it this way, I'm not sure what you're trying to accomplish here, but the page's links won't work when they start with ./ or ../, since the link would then be relative to your own site.

 

Look into http://simplehtmldom.sourceforge.net/ for the parsing, but you can still use curl to resolve the URL locations and for error handling.
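For reference, a minimal simple_html_dom sketch; it assumes the library file simple_html_dom.php has been downloaded next to the script and that $pageSource already holds HTML fetched with curl_exec():

<?php
include 'simple_html_dom.php';               // from the simplehtmldom project above

$html = str_get_html($pageSource);           // $pageSource = HTML string returned by curl_exec()
foreach ($html->find('a') as $link) {
    echo $link->href . "<br>";               // links starting with ./ or ../ still need resolving against the source URL
}
$html->clear();                              // free the DOM to keep memory down
?>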

 

I use single curl requests and save URLs to text files: I grab the top URL, delete it from the text file, curl to the site, grab the information if it's alive, then do a meta refresh and move on to the next URL in the list (roughly the pattern sketched below).

 

Place this at the end of the PHP code; it will execute when the script completes:

echo('<meta http-equiv="refresh" content="1">');
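For illustration, a minimal sketch of that queue pattern; the file name urls.txt and what you do with the fetched page are placeholders rather than code from a real project:

<?php
// Sketch: process one URL per page load from a plain-text queue file ("urls.txt" is a placeholder).
$queueFile = 'urls.txt';
$urls = file($queueFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

if (!empty($urls)) {
    $current = array_shift($urls);                        // grab the top URL
    file_put_contents($queueFile, implode("\n", $urls));  // delete it from the text file

    $ch = curl_init($current);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $page = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);                                      // free the handle straight away

    if ($page !== false && $code == 200) {
        // ... grab whatever information you need from $page here ...
    }

    // reload after 1 second and move on to the next URL in the list
    echo('<meta http-equiv="refresh" content="1">');
}
?>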


Thanks QuickOldCar. Even closing curl for $curl_arr[$i] at the end of the second 'for' loop makes no difference.

Basically I want to:

 

  • Only use 1 URL, but multiple times (reason being it does continuous checks and responds back with data; the URL is for an API using https://)
  • Minimize the time between requests (using keepalive/parallel threads to load the URL and output within 100ms of the next request)
  • Minimize memory usage

 

 

Do you have any example code that will fit into the code in my original post?

Any help would be greatly appreciated! Thanks


(reason being it does continuous checks and responds back with data, the URL is for an API using https://)

 

What type of checks: whether it's alive, the response code, maybe something different in the data?

And what does it need to send back to you?

 

I just see no point in displaying the output of entire pages a hundred times, and that's your main memory hog.


The URL is connected to a domain API and it checks the availability of ONE domain :)

I need it to continuously check whether the domain is available, and once it is available it will do something else.

Just trying to sort out minimizing memory usage on the availability checks first.

 

Each query will output 3 lines: available, not available, and whois failure.

It isn't actually displaying a full website.

 

Hope this makes things a little clearer.


In this section,

<?php

    $results = curl_multi_getcontent ($curl_arr[$i]);
    $results = explode("<br>", $results);
      echo $results[0];
      echo "<br>";
      echo $results[1];
      echo "<br>";
      echo udate('H:i:s:u');
      echo "<br><br>";
      usleep(100000);
?>

Try this instead

<?php

    $list($first,$second) = explode('<br>',curl_multi_getcontent ($curl_arr[$i]));
      echo $first;
      echo "<br>";
      echo $second;
      echo "<br>";
      echo udate('H:i:s:u');
      echo "<br><br>";
      usleep(100000);
      unset $first, $second;
?>

 

Ken



 

Thanks Ken. It comes up with this error:

 

Fatal error: Can't use function return value in write context in....

 

How can I resolve?

Much appreciated!


OK, just a few things I had to sort out with the code, marked in the comments.

However, it is still exceeding the 100MB virtual memory limit, causing an internal error and stopping the script before it runs any queries.

Any other ideas? Thank you.

 

     list($first,$second) = explode('<br>', curl_multi_getcontent($curl_arr[$i])); // list is a language construct, not a variable
      echo $first;
      echo "<br>";
      echo $second;
      echo "<br>";
      echo udate('H:i:s:u');
      echo "<br><br>";
      usleep(100000);
      unset($first, $second); // unset takes its arguments in parentheses
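For what it's worth, unsetting those two short strings frees very little; most of the memory is probably held by the 150 curl handles and their buffered responses. A minimal, untested sketch of releasing each handle as soon as its content has been read (same $master and $curl_arr as in the original post):

<?php
for ($i = 0; $i < 150; $i++) {
    list($first, $second) = explode('<br>', curl_multi_getcontent($curl_arr[$i]));
    echo $first . "<br>" . $second . "<br>" . udate('H:i:s:u') . "<br><br>";

    curl_multi_remove_handle($master, $curl_arr[$i]); // detach from the multi handle
    curl_close($curl_arr[$i]);                        // release the handle and its response buffer
    unset($curl_arr[$i]);

    usleep(100000);
}
curl_multi_close($master);
?>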


I think you have to rethink your approach; PHP isn't well suited to the way you're trying to do this.

 

A better way would be a separate script run as a cron job. That script would update a database record or a file; because the cron job does a single run each time, there's no worry about using too much memory. The script that displays the results would then just read the database or file.
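A rough sketch of that split, purely for illustration (the file name status.txt, the cron schedule, and the output format are placeholders, not a definitive implementation):

<?php
// check_domain.php - run from cron, e.g. once a minute:
//   * * * * * php /path/to/check_domain.php
// Makes ONE request per run and stores the latest result.
$url = 'https://www.testdomain.com/';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
$response = curl_exec($ch);
curl_close($ch);

if ($response !== false) {
    // status.txt is a placeholder; a database record would work the same way
    file_put_contents('status.txt', date('H:i:s') . ' ' . $response . "\n");
}
?>

The display script then just reads the stored result instead of hitting the API itself:

<?php
// display.php - no curl here, so no memory problem
echo nl2br(file_get_contents('status.txt'));
?>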

 

Ken


Thanks Ken. I can set up a cron job and a database, but what do you mean about not using PHP and outputting to an external file?

If you have clearer step-by-step instructions I can go ahead and be on my way :)

Very very much appreciated!!

 

Thanks

