
Want to run 100 PHP files at the same time


ShivaGupta


I have 100 PHP files that each collect some information from different sites and then print a one-line result. Each file takes about 5 minutes to produce its result.

I want to run all 100 of these PHP files at the same time.

 

 

So I found this way to run them all: make a master PHP file and include all of the other files.

//Example of including more than 10 files
include('file1.php');
include('file2.php');
include('file3.php');
include('file4.php');
include('file5.php');
include('file6.php');
include('file7.php');
include('file8.php');
include('file9.php');
include('file10.php');
include('file11.php');
include('file12.php');
include('file13.php');
include('file14.php');

So here is my question: how long will the master PHP script take to run? 5 minutes, or 5 x 100 = 500 minutes?

Please help me sort this question out.

And lastly, sorry for my bad English; I am not comfortable with English and don't really know how to phrase the question properly.

Edited by ShivaGupta

file2.php will not execute until file1.php is done executing. You could probably spin off a background process in file1, but then getting the output back and echoing it would be hard.
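For illustration, a minimal sketch (with hypothetical file names) of why the total time adds up when the files are simply included:

<?php
// Minimal sketch (hypothetical file names): include runs each file to
// completion before moving on, so the master script's total time is
// roughly the sum of the individual run times.
$start = microtime(true);

include 'file1.php';   // takes ~5 minutes on its own
include 'file2.php';   // does not even start until file1.php has finished

printf("Total elapsed: %.1f seconds\n", microtime(true) - $start);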

 

This may help point you in the right direction:

http://braincrafted.com/php-background-processes/

Edited by davidannis

What part of the processes takes so long? If it is downloading the data, then you can set up a system using cURL to download from several sites in parallel. If it is the actual processing of the results then you may have to step into the multi-process realm to kick off several child processes.
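For illustration, a rough sketch of the parallel-download idea using PHP's curl_multi functions (the URLs below are placeholders):

<?php
// Rough sketch: fetch several URLs at the same time with curl_multi.
$urls = array('http://example.com/a', 'http://example.com/b', 'http://example.com/c');

$mh = curl_multi_init();
$handles = array();

foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Drive all transfers together until every one has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

$results = array();
foreach ($handles as $i => $ch) {
    $results[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);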

 

The first step before any of that though is to make sure your code that does the processing is as efficient as it can be. Make sure you're not doing something silly like queries within loops, or unnecessary nested loops. Use string functions where possible rather than regex, minimize IO time, etc.


file2.php will not execute until file1.php is done executing. You could probably spin off a background process in file1, but then getting the output back and echoing it would be hard.

 

This may help point you in the right direction:

http://braincrafted.com/php-background-processes/

I don't want to get the output back and echo it; I just want the PHP scripts to execute as background processes at the same time. Please help me out.

Edited by ShivaGupta

What part of the processes takes so long? If it is downloading the data, then you can set up a system using cURL to download from several sites in parallel. If it is the actual processing of the results then you may have to step into the multi-process realm to kick off several child processes.

 

The first step before any of that though is to make sure your code that does the processing is as efficient as it can be. Make sure you're not doing something silly like queries within loops, or unnecessary nested loops. Use string functions where possible rather than regex, minimize IO time, etc.

OK sir, I found this, but is it recommended? Put an invisible image somewhere on the page pointing to the URL that needs to run in the background, like this:

<img src="run-in-background.php" border="0" alt="" width="1" height="1" />
Edited by ShivaGupta

The solution:

<img src="run-in-background.php" border="0" alt="" width="1" height="1" />

will run your one script in the background. To run 100 scripts, you'd need 100 images. You would also leave yourself open to running the server out of resources, because every page load would start 100 scripts with no check on whether the user had already set off 100 and is simply reloading the page. I would recommend following the link I provided in post #2 and using PHP to kick off background processes after making sure a user doesn't already have 100 processes running.
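A rough sketch of one way such a check might look (the pid file path is made up for this example):

<?php
// Made-up example: refuse to start a new batch while pids recorded from the
// previous run still belong to live processes.
$pidFile = '/tmp/batch.pids';

$alreadyRunning = false;
if (is_file($pidFile)) {
    $pids = file($pidFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($pids as $pid) {
        // Signal 0 does not kill anything; it only tests whether the pid exists.
        if (function_exists('posix_kill') && posix_kill((int) $pid, 0)) {
            $alreadyRunning = true;
            break;
        }
    }
}

if ($alreadyRunning) {
    exit("A previous batch is still running.\n");
}
// ...otherwise launch the background scripts and write their pids to $pidFile.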


The solution:

<img src="run-in-background.php" border="0" alt="" width="1" height="1" />

will run your one script in the background. To run 100 scripts, you'd need 100 images. You would also leave yourself open to running the server out of resources, because every page load would start 100 scripts with no check on whether the user had already set off 100 and is simply reloading the page. I would recommend following the link I provided in post #2 and using PHP to kick off background processes after making sure a user doesn't already have 100 processes running.






exec(sprintf("%s > %s 2>&1 & echo $! >> %s", $cmd, $outputfile, $pidfile));

This launches the command $cmd, redirects the command output to $outputfile, and writes the process id to $pidfile.

That lets you easily monitor what the process is doing and if it's still running.

// Returns true if a process with the given pid is still listed by `ps`.
function isRunning($pid){
    try{
        // `ps <pid>` prints a header row plus one row per matching process.
        $result = shell_exec(sprintf("ps %d", $pid));
        if( count(preg_split("/\n/", $result)) > 2){
            return true;
        }
    }catch(Exception $e){}

    return false;
}
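For reference, a hypothetical way these two pieces could be tied together for a single worker (the paths and file name are only examples):

<?php
// Hypothetical usage: start one worker in the background, poll until it
// exits, then read its one-line result.
$cmd        = 'php -f file1.php';       // example file name
$outputfile = '/tmp/file1.out';
$pidfile    = '/tmp/file1.pid';

exec(sprintf("%s > %s 2>&1 & echo $! >> %s", $cmd, $outputfile, $pidfile));

$pid = (int) trim(file_get_contents($pidfile));

while (isRunning($pid)) {
    sleep(5);                           // still busy, check again later
}

echo file_get_contents($outputfile);    // the worker's printed result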


Edited by ShivaGupta

To modify the code that I pointed to so that it executes 100 files, per your original post, try something like:

namespace Bc\BackgroundProcess;

class BackgroundProcess
{
    private $command;
    private $pid;

    public function __construct($command)
    {
        $this->command = $command;
    }

    public function run($outputFile = '/dev/null')
    {
        $this->pid = shell_exec(sprintf(
            '%s > %s 2>&1 & echo $!',
            $this->command,
            $outputFile
        ));
    }

    public function isRunning()
    {
        try {
            $result = shell_exec(sprintf('ps %d', $this->pid));
            if(count(preg_split("/\n/", $result)) > 2) {
                return true;
            }
        } catch(\Exception $e) {}

        return false;
    }

    public function getPid()
    {
        return $this->pid;
    }
}

//It’s now relatively easy to execute a command in a background process:

use Bc\BackgroundProcess\BackgroundProcess;

for ($x=1 ; $x<101 ; $x++){
$c='php -f  file'.$x.'.php'; 

$process = new BackgroundProcess('sleep 5');
$process->run();
}

of course, this example does nothing with the output from those 100 php files.
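If the output were needed, one possible (untested) extension would be to give each process its own output file, wait for all of them to exit, and then read the files back:

<?php
// Untested sketch: per-process output files, then a polling loop.
use Bc\BackgroundProcess\BackgroundProcess;

$processes = array();
for ($x = 1; $x <= 100; $x++) {
    $process = new BackgroundProcess('php -f file' . $x . '.php');
    $process->run('/tmp/file' . $x . '.out');   // one output file per script
    $processes[$x] = $process;
}

// Wait until every background process has exited.
do {
    sleep(5);
    $stillRunning = 0;
    foreach ($processes as $process) {
        if ($process->isRunning()) {
            $stillRunning++;
        }
    }
} while ($stillRunning > 0);

// Each script printed a single line, so just echo the collected lines.
for ($x = 1; $x <= 100; $x++) {
    echo file_get_contents('/tmp/file' . $x . '.out');
}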


 

oops, sleep deprived and working fast

$c='php -f  file'.$x.'.php'; 

$process = new BackgroundProcess('sleep 5');

should be:

$c='php -f  file'.$x.'.php'; 

$process = new BackgroundProcess($c);

OK sir, thank you for the help.

Now I am going with this code:

<?php
namespace Bc\BackgroundProcess;

class BackgroundProcess
{
    private $command;
    private $pid;

    public function __construct($command)
    {
        $this->command = $command;
    }

    public function run($outputFile = '/dev/null')
    {
        $this->pid = shell_exec(sprintf(
            '%s > %s 2>&1 & echo $!',
            $this->command,
            $outputFile
        ));
    }

    public function isRunning()
    {
        try {
            $result = shell_exec(sprintf('ps %d', $this->pid));
            if(count(preg_split("/\n/", $result)) > 2) {
                return true;
            }
        } catch(\Exception $e) {}

        return false;
    }

    public function getPid()
    {
        return $this->pid;
    }
}

//It’s now relatively easy to execute a command in a background process:

use Bc\BackgroundProcess\BackgroundProcess;

for ($x = 1; $x <= 100; $x++) {
    $c = 'php -f file' . $x . '.php';

    $process = new BackgroundProcess($c);
    $process->run();
}

// display a countdown in seconds
include_once('countdown.php');
?>

But here I have some doubts and need help again to clear them up.

1. Can I run this code on Windows Server 2003 with a Bitnami WAMPP server?

2. Are there any special ini settings required on my localhost for running this type of long background process, e.g. set_time_limit(0); ob_implicit_flush(true); ignore_user_abort(true);?

Edited by ShivaGupta

1. Can I run this code on Windows Server 2003 with a Bitnami WAMPP server?

No, it relies on linux system commands and conventions.
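For completeness, the workaround usually suggested for Windows looks roughly like the sketch below (untested here); note that it gives no process id back, so the isRunning() check from the class above would not work:

<?php
// Untested Windows sketch: "start /B" detaches the command, so pclose()
// returns immediately instead of waiting for the script to finish.
function runInBackgroundOnWindows($command)
{
    pclose(popen('start /B ' . $command, 'r'));
}

runInBackgroundOnWindows('php -f file1.php');   // example file name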

 

2. Are there any special ini settings required on my localhost for running this type of long background process, e.g. set_time_limit(0)?

If you are trying to run it via CLI, just set_time_limit(0). If you want to run it through your server you'd probably want ignore_user_abort(true) as well and some kind of output. Trying to execute a long-running script through the web server is generally not a good idea to start with.
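Placed at the top of the master script, those settings would look something like this:

<?php
// Settings for a long-running master script (as discussed above).
set_time_limit(0);          // remove the execution time limit
ignore_user_abort(true);    // keep running even if the browser disconnects
ob_implicit_flush(true);    // flush output to the client as it is produced

// ...kick off the background processes here...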


No, it relies on linux system commands and conventions.

 

 

If you are trying to run it via CLI, just set_time_limit(0). If you want to run it through your server you'd probably want ignore_user_abort(true) as well and some kind of output. Trying to execute a long-running script through the web server is generally not a good idea to start with.

OK, thank you sir.

Now I have tried this code on Linux hosting, but none of the PHP files execute. I think I am missing something in the code. I have been working on this for the last 30 days, but still no luck.

Maybe something is wrong with this line or with the path:

$c='php -f  file'.$x.'.php';

And I am getting tired of this BackgroundProcess subject, so please, I need help again.

Edited by ShivaGupta
