iCeR Posted December 31, 2010

My current code (see below) uses 147MB of virtual memory! My provider allocates 100MB by default, so the process is killed as soon as it runs, causing an internal server error. The code uses curl multi and must be able to loop through more than 150 iterations while still minimizing virtual memory. The code below is set at only 150 iterations and still causes the internal server error; at 90 iterations the issue does not occur.

How can I adjust my code to lower the resource use / virtual memory while still maintaining lightning-fast speed for executing the URL and receiving the results? Alternatively, is there an example of doing overlapping HTTPS requests (rather than the below) while getting the results as they arrive, or a language which supports threads to do this? Thanks!

<?php
function udate($format, $utimestamp = null) {
    if ($utimestamp === null)
        $utimestamp = microtime(true);
    $timestamp = floor($utimestamp);
    $milliseconds = round(($utimestamp - $timestamp) * 1000);
    return date(preg_replace('`(?<!\\\\)u`', $milliseconds, $format), $timestamp);
}

$url = 'https://www.testdomain.com/';

$curl_arr = array();
$master = curl_multi_init();

for ($i = 0; $i < 150; $i++) {
    $curl_arr[$i] = curl_init();
    curl_setopt($curl_arr[$i], CURLOPT_URL, $url);
    curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl_arr[$i], CURLOPT_SSL_VERIFYHOST, FALSE);
    curl_setopt($curl_arr[$i], CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_multi_add_handle($master, $curl_arr[$i]);
}

do {
    curl_multi_exec($master, $running);
} while ($running > 0);

for ($i = 0; $i < 150; $i++) {
    $results = curl_multi_getcontent($curl_arr[$i]);
    $results = explode("<br>", $results);
    echo $results[0];
    echo "<br>";
    echo $results[1];
    echo "<br>";
    echo udate('H:i:s:u');
    echo "<br><br>";
    usleep(100000);
}
?>
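[Editor's note: one pattern that should lower the peak memory here, sketched as an assumption rather than a tested fix. Instead of holding all 150 response buffers until every transfer is done, drain each handle the moment it completes and close it, so only in-flight buffers stay in memory. This also replaces the busy do/while, which burns CPU while waiting, with curl_multi_select so the loop blocks until there is activity. The URL and option set are reused from the post above; the select timeout is arbitrary.]

<?php
// Sketch: free each response as soon as its transfer completes,
// rather than keeping all 150 result buffers alive until the end.
$url = 'https://www.testdomain.com/';
$master = curl_multi_init();

for ($i = 0; $i < 150; $i++) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    curl_multi_add_handle($master, $ch);
}

do {
    curl_multi_exec($master, $running);
    curl_multi_select($master, 1.0); // sleep until there is activity instead of spinning

    // Process whichever transfers have finished since the last pass.
    while ($done = curl_multi_info_read($master)) {
        $ch = $done['handle'];
        $parts = explode('<br>', curl_multi_getcontent($ch));
        echo $parts[0], '<br>';
        echo isset($parts[1]) ? $parts[1] : '', '<br>';

        curl_multi_remove_handle($master, $ch);
        curl_close($ch); // releases this handle's response buffer immediately
    }
} while ($running > 0);

curl_multi_close($master);
?>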
QuickOldCar Posted December 31, 2010

I tried this out. You never close the curl connection, and it's requesting the exact same URL and displaying the entire page numerous times. Whatever your reason for doing it this way, I'm not sure what you're trying to accomplish here, but links that start with ./ or ../ won't work, since they'd resolve relative to your own site.

Look into http://simplehtmldom.sourceforge.net/ ; you can still use curl for resolving the URL locations and for error handling.

What I do is use single curl requests and save URLs to a text file: grab the top URL, delete it from the text file, curl to the site, grab the information if it's alive, then do a meta refresh and move on to the next URL in the list. Place this at the end of the PHP code; it'll execute when the script completes:

echo('<meta http-equiv="refresh" content="1">');
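[Editor's note: a minimal sketch of the queue-and-refresh loop QuickOldCar describes. The queue file name and the liveness test are assumptions for illustration, not his actual script.]

<?php
// Sketch of a "pop one URL per page load" queue; queue.txt is a
// hypothetical file holding one URL per line.
$queue = file('queue.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

if (!empty($queue)) {
    $url = array_shift($queue);                             // grab the top URL
    file_put_contents('queue.txt', implode("\n", $queue));  // delete it from the file

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $body  = curl_exec($ch);
    $alive = (curl_getinfo($ch, CURLINFO_HTTP_CODE) == 200); // assumed liveness check
    curl_close($ch);

    if ($alive) {
        // ... grab whatever information is needed from $body ...
    }

    // Reload the page after one second to process the next URL in the list.
    echo '<meta http-equiv="refresh" content="1">';
}
?>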
iCeR Posted December 31, 2010 (Author)

Thanks QuickOldCar. Even closing curl for $curl_arr[$i] at the end of the 2nd 'for' loop makes no difference. Basically I want to:

- use only 1 URL, but multiple times (it does continuous checks and responds back with data; the URL is for an API over https://);
- minimize the time between requests (using keep-alive/parallel threads so each URL load and its output happen within 100ms of the next);
- minimize memory usage.

Do you have any example code which will fit into the code in my original post? Any help would be greatly appreciated! Thanks
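[Editor's note: neither poster supplied this, but a common way to meet all three goals is a fixed-size rolling window: keep a small number of requests in flight, and each time one finishes, read it, close it, and start a replacement. Memory stays bounded by the window size rather than the total number of checks, and since libcurl caches connections inside the multi handle, replacement requests to the same host can reuse an existing keep-alive connection. A sketch, where the window size of 10 is an arbitrary assumption.]

<?php
// Hypothetical sketch: keep at most $window requests in flight at once
// and recycle slots as they complete, instead of queueing 150 up front.
$url    = 'https://www.testdomain.com/';
$window = 10;   // assumed concurrency limit
$total  = 150;  // total checks to perform
$master = curl_multi_init();

function make_handle($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
    return $ch;
}

$started = 0;
for (; $started < $window && $started < $total; $started++) {
    curl_multi_add_handle($master, make_handle($url));
}

$finished = 0;
do {
    curl_multi_exec($master, $running);
    curl_multi_select($master, 1.0);

    while ($done = curl_multi_info_read($master)) {
        $ch = $done['handle'];
        // Appending '<br>' guarantees explode() yields at least two parts.
        list($first, $second) = explode('<br>', curl_multi_getcontent($ch) . '<br>');
        echo $first, '<br>', $second, '<br>';

        curl_multi_remove_handle($master, $ch);
        curl_close($ch);
        $finished++;

        if ($started < $total) {   // refill the window with a fresh request
            curl_multi_add_handle($master, make_handle($url));
            $started++;
        }
    }
} while ($finished < $total);

curl_multi_close($master);
?>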
QuickOldCar Posted December 31, 2010

(reason being it does continuous checks and responds back with data, the URL is for an API using https://)

What type of checks: whether it's alive, the response code, maybe something different in the data? And what does it need to send back to you? I just see no point in displaying the output of entire pages a hundred times; that's your main memory hog.
iCeR Posted December 31, 2010 (Author)

The URL is connected to a domain API and it checks the availability of ONE domain. I need it to continuously check until the domain is available, and once it is, it will do something else. I'm just trying to sort out minimizing memory usage on the availability checks first. Each query outputs one of 3 lines: available, not available, or whois failure. It isn't actually displaying a full website. Hope this makes things a little clearer.
kenrbnsn Posted December 31, 2010

In this section:

<?php
$results = curl_multi_getcontent($curl_arr[$i]);
$results = explode("<br>", $results);
echo $results[0];
echo "<br>";
echo $results[1];
echo "<br>";
echo udate('H:i:s:u');
echo "<br><br>";
usleep(100000);
?>

try this instead:

<?php
$list($first,$second) = explode('<br>', curl_multi_getcontent($curl_arr[$i]));
echo $first;
echo "<br>";
echo $second;
echo "<br>";
echo udate('H:i:s:u');
echo "<br><br>";
usleep(100000);
unset $first, $second;
?>

Ken
iCeR Posted January 1, 2011 (Author)

Try this instead

<?php
$list($first,$second) = explode('<br>', curl_multi_getcontent($curl_arr[$i]));
echo $first;
echo "<br>";
echo $second;
echo "<br>";
echo udate('H:i:s:u');
echo "<br><br>";
usleep(100000);
unset $first, $second;
?>

Ken

Thanks Ken. It comes up with this error:

Fatal error: Can't use function return value in write context in....

How can I resolve it? Much appreciated!
iCeR Posted January 1, 2011 (Author)

OK, just a few things I had to sort out in the code, marked in comments. However, it is still exceeding the 100MB virtual memory limit, causing an internal error and the script to stop before running any queries. Any other ideas? Thank you.

list($first,$second) = explode('<br>', curl_multi_getcontent($curl_arr[$i])); //list not a variable
echo $first;
echo "<br>";
echo $second;
echo "<br>";
echo udate('H:i:s:u');
echo "<br><br>";
usleep(100000);
unset($first, $second); //arguments in brackets
kenrbnsn Posted January 1, 2011

I think you have to rethink how you're doing this; PHP isn't well suited to it. A better way would be a separate script run as a cron job. That script would update a database record or a file. The cron job would run one check each time it fires, so there's no worry about using too much memory. The script that displays the results would just read the database or file.

Ken
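[Editor's note: to make this concrete, a hypothetical pair of scripts in the spirit of Ken's suggestion; the file names, the status file format, and the cron schedule are all assumptions, not his actual design. Note that standard cron fires at most once a minute, so sub-second checking would require the cron-run script to loop internally.]

<?php
// check.php -- run from cron, e.g.:  * * * * * php /path/to/check.php
// Performs ONE availability check and appends the result to a file.
$ch = curl_init('https://www.testdomain.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
$result = curl_exec($ch);
curl_close($ch);

// "status.txt" is a placeholder name; keep only the first line of the reply.
file_put_contents('status.txt',
    date('H:i:s') . ' ' . strtok((string)$result, "\n") . "\n",
    FILE_APPEND);
?>

<?php
// display.php -- the page the browser hits; it does no curl work at all,
// it just shows whatever the cron job last wrote.
echo nl2br(file_get_contents('status.txt'));
?>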
iCeR Posted January 2, 2011 (Author)

Thanks Ken. I can set up a cron and a DB, but what do you mean about not using PHP and outputting to an external file? If you have clearer step-by-step instructions I can go ahead and be on my way. Very, very much appreciated!! Thanks