xm1 Posted August 12, 2008

PHP 5.2.6, FreeBSD 5.5, Apache 2.

I've read a lot of posts where people have had no problem doing what I'm trying to do. I want PHP to launch a Perl script in the background and let it do its job. The problem is that PHP still sits around waiting for a response. The page submits to itself and calls the script using system, exec, or shell_exec, but instead of just returning to the page, it sits there blank waiting for the Perl to complete. These are some of the different ways I've tried to launch it:

exec('/usr/home/website/app/databaseImport &'); // page hangs
system('/usr/home/website/app/databaseImport &'); // page hangs
shell_exec('/usr/home/website/app/databaseImport &'); // page hangs
exec("nohup /usr/home/website/app/databaseImport 1>/dev/null/ 2>&1 &"); // error: cannot create /dev/null/: Not a directory
exec("nohup /usr/home/website/app/databaseImport 1 &"); // page hangs

I also tried making a second Perl script that launches the import batch via exec, and then called that script from PHP with both exec and system. No matter what I do, PHP just sits there and waits.

exec('/usr/home/website/app/databaseInit &'); // page hangs
shell_exec('/usr/home/website/app/databaseInit &'); // page hangs

If I launch databaseInit from the command line, it fires off the exec to the batch and keeps going the way I expect. But ask PHP to do it and it just sits there blankly waiting for a return value. The batch itself runs for a really long time because it is importing a very large database that the PHP uploads. If I reduce the script down to a simple one-liner, the PHP comes back pretty quickly. But the point is I want PHP to fire and forget, not sit around 20 minutes for the batch to give an answer. Does anyone know if there's a php.ini setting I need to fix, or a better way around this?
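For what it's worth, the "Not a directory" error in the fourth attempt comes from the trailing slash in 1>/dev/null/; the redirect target must be /dev/null with no slash. A quick sketch of that failure mode (plain sh, no PHP needed):

```shell
# The trailing slash makes sh treat /dev/null as a directory,
# so the redirection fails and the command exits nonzero.
if sh -c 'echo hi 1>/dev/null/' 2>/dev/null; then
  echo "trailing slash: ok"
else
  echo "trailing slash: failed as expected"
fi

# Without the slash the redirect works normally.
sh -c 'echo hi 1>/dev/null' && echo "no slash: ok"
```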
Jabop Posted August 12, 2008

How is /dev/null not a directory?
Jabop Posted August 12, 2008

Should be shell_exec('script >& /dev/null')
ignace Posted August 12, 2008

the batch itself runs for a really long time because it is importing a very large database that the php uploads. if i reduce this script down to a simple oneliner the php comes back pretty quick. but the point is i want php to fire and forget, not sit around 20 minutes for the batch to give an answer.

Nope, that is not possible: a programming language fires off every command one by one, waits for it to complete, and then continues to the next one. So when you call exec(), it waits for the script to return a response, reacts on that response, and on success continues to the next command. So when your exec() is fired and the Perl script executes, you will have to sit out the whole ride until the Perl script finishes or the execution time runs out; that, however, you can change, see: http://be.php.net/set_time_limit

EDIT: it does seem to be possible after all, but you need to work with threads the same way a Unix system does. Nothing found on Google yet about creating threads in PHP.
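The blocking behaviour described above can be sketched in plain sh (a minimal illustration, with sleep standing in for the long-running import):

```shell
# A foreground command blocks the caller for its full duration...
start=$(date +%s)
sleep 2                         # stands in for the long-running import
fg_elapsed=$(( $(date +%s) - start ))

# ...while '&' detaches the job and returns control immediately.
start=$(date +%s)
sleep 2 &
bg_elapsed=$(( $(date +%s) - start ))

echo "foreground=${fg_elapsed}s background=${bg_elapsed}s"
wait    # reap the background sleep before the script exits
```

So the language itself is not the obstacle; the question is whether the shell that exec() spawns hands control back or keeps waiting.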
xm1 Posted August 12, 2008 Author

Should be shell_exec('script >& /dev/null')

Syntax error: Bad fd number
DarkWater Posted August 12, 2008

shell_exec("/usr/home/website/app/databaseImport > /dev/null 2>&1");

Does that give any errors?
xm1 Posted August 12, 2008 Author

shell_exec("/usr/home/website/app/databaseImport > /dev/null 2>&1");

Does that give any errors?

No errors. That one fires off the Perl, and the PHP just sits there again waiting for a response. I then tried it with nohup in case that would matter, but it still just hangs there.
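Worth noting: the command in the previous post redirects output but has no trailing &, so the shell still runs the script in the foreground and shell_exec() has to wait for it. A minimal sketch of the difference, with sleep standing in for databaseImport:

```shell
# Redirection alone: sh waits for the command to finish.
start=$(date +%s)
sh -c 'sleep 2 > /dev/null 2>&1'
no_amp=$(( $(date +%s) - start ))

# Redirection plus '&': the inner shell backgrounds the job and
# exits at once, so the caller gets control back immediately.
start=$(date +%s)
sh -c 'sleep 2 > /dev/null 2>&1 &'
with_amp=$(( $(date +%s) - start ))

echo "no ampersand: ${no_amp}s, with ampersand: ${with_amp}s"
```

Both pieces matter: the redirect stops the child from holding the caller's pipes open, and the & stops the shell from waiting for the exit status.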
mikeburns Posted October 17, 2008

I know it's an old post, but I have done some testing on my own and found that this works quite well:

exec("find / | wc -l 1>/tmp/null 2>/tmp/null &");

This operation (counting all the files on the server) should take forever, but when I execute it, it appears to run in the background.
mikeburns Posted October 17, 2008

And if you need to execute two or more commands in sequence, use this:

exec("find / | wc -l 1>/tmp/null 2>/tmp/null && echo 'done' >>/tmp/null &");
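A self-contained sketch of the same pattern (echo stands in for the long commands, and a mktemp scratch file stands in for /tmp/null, which in these posts is an ordinary file, not the device):

```shell
# Two commands chained with && inside one backgrounded job;
# all output goes to a scratch file instead of back to the caller.
out=$(mktemp)
( echo 'step one' && echo 'done' ) >>"$out" 2>&1 &
wait                  # only for the demo; fire-and-forget would skip this
result=$(cat "$out")
echo "$result"
rm -f "$out"
```

Grouping the chain in parentheses matters: the & then backgrounds the whole sequence rather than only the last command.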