moola Posted February 9, 2007

What's the best way to run a memory-intensive PHP script? I have a long script that will take approximately 2 minutes to execute. Is there a way to embed PHP code in JavaScript, or something that will keep the webpage up while it does the background work and waits for the result? Thanks.
Psycho Posted February 9, 2007

Use a combination of Javascript and PHP: AJAX.
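To make that concrete, here is a minimal sketch of the server side such an AJAX setup could poll. The endpoint name (progress.php) and the progress file path are made up for illustration: the idea is that the long-running script writes its counter somewhere as it works, and a tiny second script returns that value so the page's JavaScript can fetch and display it on a timer.

    <?php
    // progress.php -- hypothetical polling endpoint (file name and path are assumptions).
    // The long-running worker would update the counter as it goes, for example:
    //     file_put_contents('/tmp/url_check_progress.txt', $x);
    $file = '/tmp/url_check_progress.txt';
    echo is_file($file) ? (int) file_get_contents($file) : 0;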
Attilitus Posted February 9, 2007

Also be aware of PHP timeouts, which are normally set to around 30 seconds.
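For reference, a rough sketch of raising those limits for a single script, assuming the host allows it (the values are only examples):

    <?php
    set_time_limit(0);               // 0 = no PHP execution time limit for this request
    ini_set('memory_limit', '256M'); // give a memory-intensive job more room
    // Note: this only affects PHP itself; web server and browser timeouts are separate.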
hvle Posted February 9, 2007

You should optimize it so it takes 2 seconds instead.
artacus Posted February 9, 2007

"You should optimize it so it takes 2 seconds instead."

LOL. Good, I've got some work for you.

I guess it depends on who your audience is. If it's just a maintenance script that you'll be running, then just do it. If it's something other people will need to run, you should make a progress indicator so they can see it's working and don't keep hitting the submit button or refreshing the page. For really long scripts, I'll break them up into manageable chunks, and when one page completes, it calls the next.
moola Posted February 9, 2007

I've tried the settings in php.ini and set_time_limit(0 or more)... The script still times out in the browser. (I'm not sure if a timeout is the real problem, but that's what I'm calling it.)

The code needs to check over 1000 URLs. I have a progress meter embedded in the script, which I do not show here, but the problem is that the progress meter only shows up after the script is done executing. In other words, after you submit data to the page, it looks as if the page is still loading. After about 30 seconds the script finishes and shows the progress at about 20% completed. It never runs to full completion. Does anyone have a clue how to embed this in AJAX, or make it run to completion?

    function remote_file_size($url) {
        $head = "";
        $url_p = parse_url($url);
        $host = $url_p["host"];
        if (!preg_match("/[0-9]*\.[0-9]*\.[0-9]*\.[0-9]*/", $host)) {
            // A domain name was given, not an IP
            $ip = gethostbyname($host);
            if (!preg_match("/[0-9]*\.[0-9]*\.[0-9]*\.[0-9]*/", $ip)) {
                // Domain could not be resolved
                return -1;
            }
        }
        $port = intval($url_p["port"]);
        if (!$port) $port = 80;
        $path = $url_p["path"];
        //echo "Getting " . $host . ":" . $port . $path . " ...";
        $fp = fsockopen($host, $port, $errno, $errstr, 20);
        if (!$fp) {
            return false;
        } else {
            fputs($fp, "HEAD " . $url . " HTTP/1.1\r\n");
            fputs($fp, "HOST: " . $host . "\r\n");
            fputs($fp, "User-Agent: http://www.example.com/my_application\r\n");
            fputs($fp, "Connection: close\r\n\r\n");
            $headers = "";
            while (!feof($fp)) {
                $headers .= fgets($fp, 128);
            }
        }
        fclose($fp);
        //echo $errno . ": " . $errstr . "<br />";
        $return = -2;
        $status = "";
        $size   = "";
        $newurl = "";
        $arr_headers = explode("\n", $headers);
        // echo "HTTP headers for <a href='" . $url . "'>..." . substr($url, strlen($url) - 20) . "</a>:";
        // echo "<div class='http_headers'>";
        foreach ($arr_headers as $header) {
            // if (trim($header)) echo trim($header) . "<br />";
            $s1 = "HTTP/1.1";
            $s2 = "Content-Length: ";
            $s3 = "Location: ";
            if (substr(strtolower($header), 0, strlen($s1)) == strtolower($s1)) $status = substr($header, strlen($s1));
            if (substr(strtolower($header), 0, strlen($s2)) == strtolower($s2)) $size   = substr($header, strlen($s2));
            if (substr(strtolower($header), 0, strlen($s3)) == strtolower($s3)) $newurl = substr($header, strlen($s3));
        }
        // echo "</div>";
        if (intval($size) > 0) {
            $return = intval($size);
        } else {
            $return = $status;
        }
        // echo intval($status) . ": [" . $newurl . "]<br />";
        if (intval($status) == 302 && strlen($newurl) > 0) {
            // 302 redirect: get HTTP HEAD of new URL
            $return = remote_file_size(trim($newurl));
        }
        return $return;
    }

    for ($x = 0; $x < 200; $x++) {
        $url = '...'; // a URL link to a video file
        $filesize = remote_file_size($url);
    }

Note: I've tried two different remote_file_size($url) functions, and I still get the same problem.
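One thing worth checking about the progress meter: PHP and the web server usually buffer output, so anything the script echoes may only reach the browser once the script ends unless the buffers are flushed. A minimal sketch, assuming the meter is drawn with plain echo calls (which isn't shown above); server-side compression or proxies can still hold the output back:

    <?php
    // Flush any active output buffers, then push each update to the client as it happens.
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    ob_implicit_flush(true);

    for ($x = 0; $x < 200; $x++) {
        // ... check one URL here ...
        echo "Checked " . ($x + 1) . " of 200<br />\n";
        flush(); // hand buffered output to the web server / browser
    }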
artacus Posted February 9, 2007

A progress meter that only shows up at 100% is not much use, is it? Do like I said and check 100 at a time (or whatever number ends up working), just like when you're paging through DB results. Use offset and limit so the next call checks the next 100 URLs.
Jessica Posted February 9, 2007

I think artacus's way is best. You can have the script redirect to itself passing the next number to start at, so it's constantly refreshing with progress.
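A rough sketch of that chunked, redirect-to-itself approach. The parameter name, chunk size, and the per-URL helper are all made up for illustration; check_one_url() stands in for whatever the real work is (e.g. the remote_file_size() call above):

    <?php
    set_time_limit(60); // each chunk only has to survive its own short run

    $chunkSize = 100;
    $total     = 1000; // total number of URLs to check
    $offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

    // Check just this chunk of URLs.
    for ($x = $offset; $x < min($offset + $chunkSize, $total); $x++) {
        // check_one_url($x); // hypothetical helper doing the per-URL work
    }

    $next = $offset + $chunkSize;
    if ($next < $total) {
        // Show progress, then reload the page pointing at the next chunk.
        echo "Checked $next of $total URLs...";
        echo '<meta http-equiv="refresh" content="1;url=?offset=' . $next . '">';
    } else {
        echo "Done: checked $total URLs.";
    }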
moola Posted February 9, 2007

Hope this helps?

    for ($x = 0; $x < 200; $x++) {
        $progress->tickUpdate("Tick: $x");
        $servers = array("$url$x", "$url2$x", "$url3$x", "$url4$x", "$url5$x");
        foreach ($servers as $candidate) {
            $filesize = remote_file_size($candidate);
            if ($filesize > 0) {
                // do this
                break; // stop checking the remaining servers for this $x
            }
        }
        // ..............
    }
    // Shutdown progress and display output
    $progress->stop();

I actually want to see my progress meter in action. The point is it never shows up while the script is executing; it only shows up after the script is done. AND the script always ends prematurely.
artacus Posted February 9, 2007

"I think artacus's way is best."

Finally! A minion to help in my schemes for total World domination! Oh, sorry, were you talking about the PHP?... never mind.
moola Posted February 9, 2007

huh?
hvle Posted February 9, 2007

"huh?"

Don't mind him, he likes to talk to himself.
moola Posted February 9, 2007

lol ok. But... how about my question??? hahaha
hvle Posted February 9, 2007

(quoting moola's earlier post and code in full)

Could you explain what you're trying to achieve with this code? Save me some reading.