Michdd Posted August 9, 2009

I have a script that can potentially run for many hours. That isn't a problem in itself: it's collecting information, and depending on how much I want to collect I can stop it whenever I want, so I used set_time_limit(0) so it isn't stopped. This works fine on my local server, but when uploaded to my hosting it stops at a certain point, without displaying an error. It just stops. Are there any other settings on the web server that could stop this script from executing after X amount of minutes? The time it runs for seems pretty random: sometimes it'll only run for a minute or two, sometimes over 30 minutes. Any ideas on what could be causing this?
Michdd Posted August 9, 2009 (Author)

I've kind of figured out the problem. My script relies on getting resources from an outside source via file_get_contents(external link), and my host seems to terminate a script when a connection takes more than 30 seconds. Is there any way to make it so that if a file_get_contents request takes about that long, it just skips it, or retries, so my script isn't completely stopped?
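One way to get the skip-or-retry behaviour asked about above, while still using file_get_contents, is to pass a stream context with a timeout. A minimal sketch (the function name and parameters are hypothetical, not from the thread):

```php
<?php
// Fetch a URL with a per-attempt timeout and a limited number of retries.
// Returns the body on success, or false so the caller can skip the resource.
function fetch_with_retry($url, $timeout = 25, $retries = 2) {
    // 'timeout' caps how long file_get_contents() waits, in seconds.
    $context = stream_context_create([
        'http' => ['timeout' => $timeout],
    ]);
    for ($attempt = 0; $attempt <= $retries; $attempt++) {
        // @ suppresses the warning on failure; we check the return value instead.
        $data = @file_get_contents($url, false, $context);
        if ($data !== false) {
            return $data;
        }
    }
    return false; // give up on this resource and let the script move on
}
```

With the timeout just under the host's 30-second cutoff, one slow resource no longer kills the whole run.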
Grayda Posted August 10, 2009

Your best bet is to contact your host and let them know you'll be doing a lot of heavy processing, then try to break up your script so it works on its task in chunks. If, for example, you're collecting words from long text documents, you could find the length of the document, open only 1/4, 1/8, 1/100, or 1/1000000 of the file (depending on its length), and read and process that small piece. Then you store the offset so that when your script runs again, it starts from 2/4, 2/8, 2/100, or whatever you split your task into. Could there also be another place you can source your data from, already compiled for you? Either way, you're going to have to rewrite your script or face having your hosting terminated for using up a shared host's resources. (P.S. Another thought would be to get your own Virtual Private Server (VPS). I don't know if they have limits on time, but that might be what you're after. More pricey, though.)
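The chunk-and-resume idea above can be sketched like this, assuming a data file and a small state file for the offset (both filenames and the function name are made up for illustration):

```php
<?php
// Process one slice of a large file per call, persisting the offset between
// runs so the next invocation resumes where the last one stopped.
function process_chunk($dataFile, $stateFile, $chunkSize = 8192) {
    // Read the saved offset, or start from the beginning on the first run.
    $offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

    $fh = fopen($dataFile, 'rb');
    fseek($fh, $offset);
    $chunk = fread($fh, $chunkSize);
    fclose($fh);

    if ($chunk === '' || $chunk === false) {
        return false; // nothing left to read: the whole file has been processed
    }

    // ... process $chunk here (e.g. count words) ...

    // Save where we got to, so the next run picks up from here.
    file_put_contents($stateFile, $offset + strlen($chunk));
    return true;
}
```

Each run then does a bounded amount of work, which keeps any single request well under the host's time limit.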
Michdd Posted August 10, 2009 (Author)

I've been considering getting a VPS, and I probably will soon, but for now it's not necessary. My solution was really simple, actually: instead of using file_get_contents I switched to a cURL method, which lets me set the connection timeout to 25 seconds, just under the point where my host would terminate the script. So it all works out fine. Thanks.
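A sketch of the cURL approach described above (the wrapper function is hypothetical; the 25-second figure is the one from the post):

```php
<?php
// Fetch a URL with cURL, capping the transfer just under the host's
// 30-second cutoff so the script itself is never terminated.
function curl_fetch($url, $timeout = 25) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);    // cap the whole transfer
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);   // cap the connect phase as well
    $data = curl_exec($ch);
    curl_close($ch);
    return $data; // false on timeout or failure, so the caller can skip or retry
}
```

Checking the return value for false is what lets the main loop skip a slow resource and carry on.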
chmpdog Posted August 10, 2009

You could also edit the php.ini file in your root; there is a specific directive (max_execution_time) for setting the maximum run time.
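For reference, the php.ini directive mentioned above looks like this (assuming the host actually lets you override it, which many shared hosts do not):

```ini
; php.ini -- maximum execution time per script, in seconds.
; 0 means no limit, which is equivalent to set_time_limit(0).
max_execution_time = 0
```

Note that this controls PHP's own limit; it would not stop a host-side watchdog from killing slow external connections, which is why the cURL timeout was the fix here.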