RuleBritannia Posted November 13, 2013 (edited)

Hello, I am using a cURL script to call another URL (with a GET variable) on my server. The script at that URL takes a long time to execute, so to avoid it hanging the caller I send a `Connection: close` header along with some ob_flush() calls. This lets the browser close the connection while PHP keeps processing in the background. When I visit that URL with the GET variable manually in the browser, this works lovely: the page stops loading immediately while the code keeps executing in the background. But when I use cURL in my main script to call the same URL, cURL just likes to take its time and wait for everything to finish, causing my main script to hang.

main.php

```php
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/run.php?id=44');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
var_dump($result);
?>
```

run.php

```php
<?php
if (isset($_GET['id'])) {
    ob_end_clean();
    header('Content-Encoding: none');
    ob_start();
    echo 'Closing';
    $size = ob_get_length();
    header('Content-Length: ' . $size);
    header('Connection: close');
    ob_end_flush();
    ob_flush();
    flush();
    ob_end_clean();
    // Normally this sleep would hang the script for however long is specified.
    sleep($_GET['id']);
}
?>
```

This is a simple working environment. Call run.php?id=20 directly and you will see it loads very fast (correct); then use main.php to call run.php?id=20 and you will see it waits 20 seconds for the sleep to finish. Hopefully somebody has experience with this. Thanks in advance.

Edited November 13, 2013 by RuleBritannia
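For reference, when PHP runs under PHP-FPM there is a built-in way to detach the response without the Content-Length / Connection: close juggling above: fastcgi_finish_request(). A minimal sketch, assuming PHP-FPM (the function does not exist under CLI or mod_php, hence the guard):

```php
<?php
// run.php - send the response, detach from the client, then keep working.
if (isset($_GET['id'])) {
    ignore_user_abort(true);  // keep running even after the client disconnects
    echo 'Closing';

    if (function_exists('fastcgi_finish_request')) {
        // PHP-FPM only: flushes the response and closes the connection now.
        fastcgi_finish_request();
    } else {
        // Fallback: the manual flush approach from the post above.
        flush();
    }

    // Long-running work happens after the response has already ended.
    sleep((int) $_GET['id']);
}
```

Note that cURL on the calling side would still see a fast response here, because the connection is genuinely closed before the sleep begins.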
RuleBritannia Posted November 13, 2013 Author

I managed to find one possible method, but once again I do not like it, as it is a cheap workaround rather than the exact effect I want. Here it goes, for those interested. If you set

```php
header('Location: http://google.com');
```

in your run.php file, and in the main cURL script set

```php
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
```

this gets cURL to stop once the redirect response arrives, whilst allowing further execution in the requested page.
JonnoTheDev Posted November 13, 2013

If you have scripts that take time to process, why not simply fork them to the command line? If the script is on the same server, why even use cURL? i.e.

```php
exec("php /path/to/script.php --param=44 > /dev/null &");
```

The --param is the equivalent of a URL (GET) parameter. In your CLI script you can get the parameter value using this function:

```php
function arguments($argv) {
    $_ARG = array();
    foreach ($argv as $arg) {
        if (preg_match('/--[a-zA-Z0-9]*=.*/', $arg)) {
            $str = preg_split('/=/', $arg);
            $arg = '';
            $key = preg_replace('/--/', '', $str[0]);
            for ($i = 1; $i < count($str); $i++) {
                $arg .= $str[$i];
            }
            $_ARG[$key] = $arg;
        } elseif (preg_match('/-[a-zA-Z0-9]/', $arg)) {
            $arg = preg_replace('/-/', '', $arg);
            $_ARG[$arg] = 'true';
        }
    }
    return $_ARG;
}
```

Usage:

```php
$_ARGS = arguments($argv);
// prints 44
echo $_ARGS['param'];
```
RuleBritannia Posted November 13, 2013 Author

> If you have scripts that take time to process why not simply fork them to the command line. If the script is on the same server why even use CURL? [...]

I haven't done any CLI PHP since last week, but I am moving in that direction, as it seems more appropriate than to keep using cURL and a meta refresh as a cron service. In regards to your function, that looks very heavy. Is that the "normal" way of getting parameters in a PHP CLI script? Compared with a simple $_GET['id'], your function is a big difference in code.
JonnoTheDev Posted November 13, 2013

> In regards to your function, that looks very heavy. Is that the "normal" way of getting parameters in a PHP CLI script?

In the CLI there is no $_GET. The equivalent is $argv: http://php.net/manual/en/reserved.variables.argv.php However, $argv does not work with name/value pairs the way URL parameters do. For script.php?x=1&y=2 I would read the values with $_GET['x'] and $_GET['y']. On the command line I cannot define x or y as parameter names; I can only supply values, i.e. /path/to/script.php 1 2, and read them positionally as $argv[1] and $argv[2] ($argv[0] is the script name itself). So, in a nutshell, the function I have given you allows you to pass parameters to a command-line script as name/value pairs; you can see that the function takes $argv as its parameter. It is totally up to you how you want to deal with parameters on the command line. Here is a bit more explanation: http://stackoverflow.com/questions/9612166/passing-command-line-arguments-to-a-php-script
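For what it's worth, PHP also ships a built-in parser for exactly this style of option, getopt(), which handles --name=value long options without a custom function. A minimal sketch (the option name "param" is just the example from this thread; "param:" means a long option that requires a value):

```php
<?php
// Invoked as: php script.php --param=44
$opts = getopt('', ['param:']);

if (isset($opts['param'])) {
    echo $opts['param'];
} else {
    fwrite(STDERR, "Usage: php script.php --param=<value>\n");
}
```

Whether getopt() or a hand-rolled parser is preferable is mostly a matter of taste; getopt() also supports short options (-p) via its first argument.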
RuleBritannia Posted November 13, 2013 Author

OK, I'm researching CLI PHP now. Also, it seems the partial fix I posted earlier doesn't even work 100% of the time. For now I have to progress, so I am using a different quick-and-dirty fix, for those interested: set

```php
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
```

in the main script. This stops cURL from spending its time waiting on the page to finish executing. In case anybody is wondering why it's a bad fix: if there is a connection delay, cURL may not even have time to send the initial request before the timeout hits.
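Another common workaround, instead of racing a 1-second cURL timeout, is to skip cURL entirely and fire the request over a raw socket, closing it without reading the response. A hedged sketch (host, port, and /run.php path are the values from this thread; the helper names are made up for illustration, and run.php should still call ignore_user_abort(true) in case the server kills the script on disconnect):

```php
<?php
// Build a minimal HTTP/1.1 request string we can fire and forget.
function build_async_request($host, $path) {
    return "GET $path HTTP/1.1\r\n"
         . "Host: $host\r\n"
         . "Connection: Close\r\n\r\n";
}

// Open the socket, write the request, and close without waiting for the
// body, so the caller returns immediately while run.php keeps executing.
function fire_and_forget($host, $port, $path) {
    $fp = @fsockopen($host, $port, $errno, $errstr, 1); // 1s connect timeout
    if ($fp === false) {
        return false; // could not connect at all
    }
    fwrite($fp, build_async_request($host, $path));
    fclose($fp);
    return true;
}

// Example call, matching the thread's setup:
// fire_and_forget('localhost', 80, '/run.php?id=44');
```

Unlike the CURLOPT_TIMEOUT hack, the request line is always fully written before the connection is closed, so the target script reliably starts.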