cmgriffing Posted December 15, 2010

I have some cron jobs that pull XML and some webcam images from their respective source sites (NOAA, DOT) and save them to the local server. This is done to ease the strain on the external sites in the event of a user spike on our website. Now, the code works most of the time, but you can see that I don't have any error handling. Sometimes one of the webcam images will fail to load, and other times I have seen the weather XML feed fail, producing some unaesthetic PHP error codes on the site until a half hour later when the cron job runs again.

My questions:
- What's the best way to make it try again if it fails?
- Are the set_time_limit statements necessary? I did it as a keep-alive.
- I also put the sleep statements there to space things out, since there are six webcam scripts that run.

Anyway, I would appreciate any suggestions. Thanks

-Chris

[attachment deleted by admin]
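One way to hide the error codes on the site side, independent of the cron job, is to check whether the cached file actually loaded before rendering it. A minimal sketch, assuming the pages read the cached XML with simplexml_load_file (the file name and fallback message are placeholders):

[code=php:0]
<?php
// Hypothetical display-side guard: @ suppresses the load warning, and
// simplexml_load_file() returns false if the file is missing or malformed.
$xml = @simplexml_load_file("stvns_wthr.xml");
if ($xml === false) {
    echo "Weather data is temporarily unavailable.";
} else {
    // ... render the forecast from $xml as usual ...
}
?>
[/code]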
cmgriffing (Author) Posted December 18, 2010

So far 31 views, and the only comment is one word long. This here is a bump to this thread so that I can hopefully get some advice.
parino_esquilado Posted December 18, 2010

No one wants to download your files. Too much hassle. Just embed them using the [code=php:0] [/code] tags.
cmgriffing (Author) Posted December 19, 2010

Alright, here is my code in php tags.

This is the forecast cron job:

[code=php:0]
<?php
set_time_limit(60);

// Stevens Weather
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=47.75&lon=-121.09", "stvns_wthr.xml");
sleep(20);

set_time_limit(60);
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=47.43&lon=-121.41", "snoq_wthr.xml");
sleep(20);

set_time_limit(60);
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=47.28&lon=-120.42", "mssn_wthr.xml");
sleep(20);

set_time_limit(60);
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=45.34&lon=-121.72", "baker_wthr.xml");
sleep(20);

set_time_limit(60);
copy("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=46.9&lon=-121.51", "crstl_wthr.xml");
?>
[/code]

Here is one of the webcam scripts:

[code=php:0]
<?php
set_time_limit(60);

// Stevens Webcams
copy("http://www.stevenspass.com/Stevens/SiteAssets/_ftp/webcam/stevenspass.jpg", "stevenspass.jpg");
sleep(10);

set_time_limit(60);
copy("http://www.stevenspass.com/Stevens/SiteAssets/_ftp/webcam/stevenspass2.jpg", "stevenspass2.jpg");
sleep(10);

set_time_limit(60);
copy("http://images.wsdot.wa.gov/us2/Stevens/sumteast.jpg", "sumteast.jpg");
sleep(10);

set_time_limit(60);
copy("http://images.wsdot.wa.gov/us2/stvldg/sumtwest.jpg", "sumtwest.jpg");
?>
[/code]

So my questions again are:

1) Are the set_time_limit statements necessary for keeping the script from timing out if I have no access to the php.ini to change the timeout length?
2) How should I implement error handling that retries the fetching of a file if it fails? My first guess is a try/catch, but I haven't used those before.

Thanks.

-Chris
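On the retry question: copy() does not throw exceptions on failure (it emits a warning and returns false), so a try/catch will not catch a failed download; checking the return value in a loop does. A minimal sketch of a hypothetical retry wrapper, keeping the same copy()-based approach as the scripts above (the function name, attempt count, and delay are made up for illustration):

[code=php:0]
<?php
// Hypothetical helper: try copy() a few times before giving up.
// Returns true on success, false after all attempts fail (in which
// case the previously cached file is simply left in place).
function fetch_with_retry($url, $dest, $attempts = 3, $delay = 5)
{
    for ($i = 0; $i < $attempts; $i++) {
        set_time_limit(60);        // reset the execution clock per attempt
        if (@copy($url, $dest)) {  // @ suppresses the warning on failure
            return true;
        }
        sleep($delay);             // brief pause before retrying
    }
    return false;
}

fetch_with_retry("http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=47.75&lon=-121.09", "stvns_wthr.xml");
?>
[/code]

On the set_time_limit question: calling set_time_limit() restarts the execution timer, so the repeated calls do act as a keep-alive even without php.ini access. Note that on Linux, time spent in sleep() does not count against the limit anyway, and some shared hosts disable set_time_limit() entirely (e.g. under safe mode).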
trq Posted December 19, 2010

My first question would be: why are you using PHP for this?
cmgriffing (Author) Posted December 20, 2010

I'm using PHP because I am familiar with its syntax and my web server supports it. What else should I be using in a shared-hosting LAMP environment?
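For what it's worth, the usual non-PHP answer on a shared LAMP host is to have cron call a download tool directly, since wget has timeouts and retries built in. A hypothetical crontab sketch (the schedule, temp-file name, and destination path are placeholders; this only works if the host allows shell commands in cron):

[code]
# Fetch every 30 minutes; -T sets a 60-second timeout, -t retries 3 times.
# Download to a temp file first, then move it into place, so a failed
# fetch never clobbers the previously cached copy.
0,30 * * * * wget -q -T 60 -t 3 -O /tmp/stvns_wthr.xml "http://www.wrh.noaa.gov/forecast/xml/xml.php?duration=96&interval=6&lat=47.75&lon=-121.09" && mv /tmp/stvns_wthr.xml /path/to/site/stvns_wthr.xml
[/code]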