ShivaGupta Posted August 17, 2013

I have 100 PHP files that each collect some information from different sites and then print a one-line result. Each file takes about 5 minutes to finish, and I want to run all 100 of them at the same time. The way I found is to make a master PHP file and include them all:

// Example of how to include more than 10 files
include('file1.php');
include('file2.php');
include('file3.php');
include('file4.php');
include('file5.php');
include('file6.php');
include('file7.php');
include('file8.php');
include('file9.php');
include('file10.php');
include('file11.php');
include('file12.php');
include('file13.php');
include('file14.php');

So here is my question: how long will the master PHP file take to run? 5 minutes, or 5 x 100 = 500 minutes? Please help me sort this question out. And lastly, sorry for my bad English; I really don't know how to ask a question properly.
davidannis Posted August 17, 2013

file2.php will not execute until file1.php is done executing. You could probably spin off a background process in file1, but then getting the output back and echoing it would be hard. This may help point you in the right direction: http://braincrafted.com/php-background-processes/
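For example, a minimal sketch of spinning off a background process from PHP (this assumes a Linux host where exec() is allowed; worker.php is just a placeholder name):

<?php
// Launch worker.php in the background and return immediately.
// Output is discarded; the trailing "&" detaches the process from this request.
$cmd = 'php -f worker.php';
exec($cmd . ' > /dev/null 2>&1 &');
echo "worker started\n";

The launching script does not wait for the worker, which is exactly why getting the result back and echoing it is the hard part.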
kicken Posted August 17, 2013

What part of the process takes so long? If it is downloading the data, then you can set up a system using cURL to download from several sites in parallel. If it is the actual processing of the results, then you may have to step into the multi-process realm to kick off several child processes.

The first step before any of that, though, is to make sure the code that does the processing is as efficient as it can be. Make sure you're not doing something silly like queries within loops, or unnecessary nested loops. Use string functions where possible rather than regex, minimize IO time, etc.
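For example, a rough sketch of fetching several sites in parallel with curl_multi (the URLs below are placeholders for the real sites):

<?php
// Download several URLs in parallel instead of one after another.
$urls = array('http://example.com/a', 'http://example.com/b', 'http://example.com/c');

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Run all transfers until every one of them has finished.
do {
    curl_multi_exec($mh, $running);
    if (curl_multi_select($mh) === -1) {
        usleep(100000); // select failed; back off briefly and try again
    }
} while ($running > 0);

// Collect the responses and clean up.
foreach ($handles as $url => $ch) {
    $html = curl_multi_getcontent($ch);
    echo $url . ': ' . strlen($html) . " bytes\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

With this, downloads that take minutes each mostly overlap instead of running back to back; the processing of each page still happens in a single PHP process afterwards.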
ShivaGupta (Author) Posted August 24, 2013

(Replying to davidannis above.) I don't want to get the output back and echo it. I just want the PHP files to execute as background processes at the same time. Please help me out.
ShivaGupta (Author) Posted August 24, 2013

(Replying to kicken above.) OK sir, I found this approach, but is it recommended? Put an invisible image somewhere on the page pointing to the URL that needs to run in the background, like this:

<img src="run-in-background.php" border="0" alt="" width="1" height="1" />
davidannis Posted August 26, 2013

The solution:

<img src="run-in-background.php" border="0" alt="" width="1" height="1" />

will run your one script in the background. To run 100 scripts, you'd need 100 images. You would also leave yourself open to running the server out of resources: 100 scripts start on each page load, with no check to see whether the user already set off 100 previously and is simply reloading the page. I would recommend following the link I provided in post #2 and using PHP to kick off background processes, after making sure a user doesn't already have 100 processes running.
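A rough sketch of that "check before you launch" idea, assuming each background script writes its PID into a pids/ directory (the directory name is an assumption for this example, and posix_kill() needs the POSIX extension, so this is Linux-only):

<?php
// Refuse to start a new batch while PID files from a previous batch
// still point at live processes.
$alive = 0;
foreach (glob('pids/*.pid') as $pidFile) {
    $pid = (int) trim(file_get_contents($pidFile));
    if ($pid > 0 && posix_kill($pid, 0)) { // signal 0 only checks "is it running?"
        $alive++;
    } else {
        unlink($pidFile);                  // clean up stale PID files
    }
}
if ($alive > 0) {
    exit("A previous batch is still running ($alive processes). Try again later.\n");
}
// ...otherwise it is safe to kick off the background scripts here.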
ShivaGupta (Author) Posted August 27, 2013

(Replying to davidannis above.)

// Launch $cmd in the background, redirect its output to $outputfile,
// and append the new process id to $pidfile.
exec(sprintf("%s > %s 2>&1 & echo $! >> %s", $cmd, $outputfile, $pidfile));

This launches the command $cmd, redirects the command output to $outputfile, and writes the process id to $pidfile. That lets you easily monitor what the process is doing and whether it's still running.

// Returns true if the process with the given PID is still running.
function isRunning($pid) {
    try {
        $result = shell_exec(sprintf("ps %d", $pid));
        if (count(preg_split("/\n/", $result)) > 2) {
            return true;
        }
    } catch (Exception $e) {}
    return false;
}
ShivaGupta (Author) Posted August 27, 2013

So here I need help again: how do I work with the above code?
davidannis Posted August 29, 2013

To modify the code that I pointed to so that it executes 100 files, per your original post, try something like:

namespace Bc\BackgroundProcess;

class BackgroundProcess
{
    private $command;
    private $pid;

    public function __construct($command)
    {
        $this->command = $command;
    }

    public function run($outputFile = '/dev/null')
    {
        $this->pid = shell_exec(sprintf(
            '%s > %s 2>&1 & echo $!',
            $this->command,
            $outputFile
        ));
    }

    public function isRunning()
    {
        try {
            $result = shell_exec(sprintf('ps %d', $this->pid));
            if (count(preg_split("/\n/", $result)) > 2) {
                return true;
            }
        } catch (Exception $e) {}
        return false;
    }

    public function getPid()
    {
        return $this->pid;
    }
}

// It's now relatively easy to execute a command in a background process:
use Bc\BackgroundProcess\BackgroundProcess;

for ($x = 1; $x < 101; $x++) {
    $c = 'php -f file' . $x . '.php';
    $process = new BackgroundProcess('sleep 5');
    $process->run();
}

Of course, this example does nothing with the output from those 100 PHP files.
davidannis Posted August 29, 2013

Oops, sleep deprived and working fast.

$c = 'php -f file' . $x . '.php';
$process = new BackgroundProcess('sleep 5');

should be:

$c = 'php -f file' . $x . '.php';
$process = new BackgroundProcess($c);
ShivaGupta (Author) Posted September 2, 2013

(Replying to davidannis above.) OK sir, thank you for the helpful correction. I am now going with this code:

<?php
namespace Bc\BackgroundProcess;

class BackgroundProcess
{
    private $command;
    private $pid;

    public function __construct($command)
    {
        $this->command = $command;
    }

    public function run($outputFile = '/dev/null')
    {
        $this->pid = shell_exec(sprintf(
            '%s > %s 2>&1 & echo $!',
            $this->command,
            $outputFile
        ));
    }

    public function isRunning()
    {
        try {
            $result = shell_exec(sprintf('ps %d', $this->pid));
            if (count(preg_split("/\n/", $result)) > 2) {
                return true;
            }
        } catch (Exception $e) {}
        return false;
    }

    public function getPid()
    {
        return $this->pid;
    }
}

// It's now relatively easy to execute a command in a background process:
use Bc\BackgroundProcess\BackgroundProcess;

for ($x = 1; $x < 101; $x++) {
    $c = 'php -f file' . $x . '.php';
    $process = new BackgroundProcess($c);
    $process->run();
}

// to display a countdown in seconds
include_once('countdown.php');
?>

But here I have some doubts and need help again to clear them up:

1. Can I run this code on Windows Server 2003 with a Bitnami WAMP server?
2. Are there any special ini settings required on my localhost for running this kind of long background process, e.g. set_time_limit(0); ob_implicit_flush(true); ignore_user_abort(true);?
kicken Posted September 2, 2013

"Can I run this code on Windows Server 2003 with a Bitnami WAMP server?"

No, it relies on Linux system commands and conventions.

"Are there any special ini settings required for running this kind of long background process?"

If you are trying to run it via the CLI, just set_time_limit(0). If you want to run it through your web server, you'd probably want ignore_user_abort(true) as well, and some kind of output. Trying to execute a long-running script through the web server is generally not a good idea to start with.
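For illustration, this is roughly what those settings look like at the top of a long-running launcher script (a sketch only, not part of the class above):

<?php
set_time_limit(0);          // no execution time limit for this script
ignore_user_abort(true);    // keep running even if the browser disconnects
// ob_implicit_flush(true); // only useful if the script echoes progress output
// ... kick off the background processes here ...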
ShivaGupta (Author) Posted September 7, 2013

(Replying to kicken above.) OK, thank you sir. I tried the code on Linux hosting, but none of the PHP files execute. I think I am missing something in the code; I have been working with it for the last 30 days and still have no luck. None of the PHP files execute. Maybe something is wrong with this line, or with the path:

$c = 'php -f file' . $x . '.php';

I am getting tired of this background-process subject, so please, I need help again.
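(One common cause of "nothing executes" on shared hosting is paths: the background shell may not start in the directory that holds the file*.php scripts, and the bare "php" command may not be on its PATH. A sketch of a more explicit command line, where /usr/bin/php is only an assumption about where the CLI binary lives:)

$c = '/usr/bin/php -f ' . __DIR__ . '/file' . $x . '.php';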
vinny42 Posted September 7, 2013

What do the scripts do? What makes you think that your server can handle running 100 scripts at the same time? It sounds like a very bad idea.
ShivaGupta (Author) Posted September 8, 2013

(Replying to vinny42 above.) Yes, I agree sir, it's a very bad idea. So please help me with only 5 or 6 background processes.
vinny42 Posted September 8, 2013

What do the scripts do?