termin8r Posted February 23, 2007

Hi, I have a basic website on a shared server, and I want to write some server-side PHP that will monitor another website at regular intervals and email me if certain updates have been made on that monitored site. I have written the PHP script, which uses sleep() etc. and fetches the monitored site once every 30 minutes, but how can I keep it running indefinitely? If I kick it off via a web page, the page will time out and I presume the script will stop. I only have FTP and HTTP access to the shared server. Please excuse my ignorance, but I am very new to PHP and server-side scripts. Any help appreciated.
TRI0N Posted February 23, 2007

Not sure what you mean. If you kick off the server that is running the PHP monitor, it will crash? Sure, that is going to happen. Or do you mean that if you kick off the server but the PHP monitor is running from a different server, it crashes? Have you tried a simple meta refresh statement:

<meta http-equiv="refresh" content="1800;URL=YourPageHere.php">

That will reload your page every 30 minutes. Remember to replace YourPageHere with the page that is reloading.
termin8r Posted February 23, 2007 (Author)

OK, I don't think I explained it very well. The server is "rented" from a UK hosting company, so I have no control over it, and I only have FTP and HTTP access. I can run PHP scripts on it and have done so by typing the script name into a browser, e.g. monitor.php. The browser waits for the script to finish. The thing is, I don't want the script to finish. I want it to run continuously: sleeping mostly, then using cURL to fetch a page from another site, doing some parsing, and if certain content matches my criteria, emailing me and carrying on running. So ideally I want some way of kicking it off and leaving it running. I do not want to keep a browser up and have it re-run the PHP every 30 minutes, as I don't have a home system that is up 24 hours a day! Of course, if it were my own server in my own house I could just run PHP through the CLI, but it isn't, so I can't. Hope this is clearer now. I am truly stumped!
monk.e.boy Posted February 23, 2007

Without access to the command line I don't think you can do this. The other solution is to add some logic to your home page:

1. Render the home page.
2. Access the DB and get the last check time.
3. Is last check time + 60 minutes < current time?
4. Make the report.
5. Email the report.
6. Set the last check time (in the DB) to the current time.

So every home page request *may* send you a report. See how it works?

monk.e.boy
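The steps above could be sketched in PHP roughly like this; the table name `monitor`, the column `last_check`, the email address, and the `build_report()` helper are all illustrative assumptions, not anything from this thread:

```php
<?php
// A minimal sketch of the steps above, assuming a one-row table
// `monitor` holding a UNIX-timestamp column `last_check`.

// Step 3 as a pure helper: has an hour passed since the last check?
function is_due($last_check, $now)
{
    return $last_check + 60 * 60 < $now;
}

// Steps 2-6, called from the top of the home page.
function maybe_send_report(PDO $db)
{
    // Step 2: fetch the last check time from the DB.
    $last = (int) $db->query('SELECT last_check FROM monitor')->fetchColumn();

    if (is_due($last, time())) {
        // Steps 4-5: build and mail the report (your existing
        // curl/parse code would live in build_report()).
        mail('you@example.com', 'Site report', build_report());

        // Step 6: record the new check time.
        $db->exec('UPDATE monitor SET last_check = ' . time());
    }
}
```

Because the check piggybacks on page views, the report fires on the first request after the hour is up, not exactly on the hour.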
mbtaylor Posted February 23, 2007

If you can get the host to install a cron job for you, you can set a cron task to call your PHP script every X minutes.
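If the host does set one up, the crontab entry would look something like this (the PHP binary and script paths are illustrative and vary by host); the five fields mean minute, hour, day of month, month, day of week, and `*/30` in the minute field means "every 30 minutes":

```
*/30 * * * * /usr/bin/php /home/you/monitor.php
```

The script then does one check per run and exits, so there is no need for the sleep loop at all.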
termin8r Posted February 23, 2007 (Author)

> Without access to the command line I don't think you can do this. The other solution is to add some logic to your home page... So every home page request *may* send you a report.

Yes, I see how it works and it is a nice idea; however, the site is hit less often than Hillary Clinton's website is in Texas!!
Jessica Posted February 23, 2007

Use a cron job.
termin8r Posted February 23, 2007 (Author)

> Use a cron job.

OK. Does anyone have experience of how open the big hosting companies are to this type of request? I use 1&1.
mbtaylor Posted February 23, 2007

Try phoning their tech support and asking if they can do it. I have a dedicated Linux server with 1&1 which lets me do anything I want, and the price isn't bad either. I'd recommend that for any serious development.
Jessica Posted February 23, 2007

You can also use third-party services to run your cron job if you set it up using wget. Any decent host will let you do cron jobs; GoDaddy's cheapest $2.99 plan allows them. My husband has used 1&1, but I don't know what their setup was.
mbtaylor Posted February 23, 2007

> You can also use third-party services to run your cron job if you set it up using wget.

Interesting, how would you use cron via wget? I have only used wget to download files.
monk.e.boy Posted February 23, 2007

A third party would let you add cron jobs. You set the cron job to wget the report page from the original site. Job done.

monk.e.boy
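In other words, the cron job runs on the third-party machine and just fetches your page over plain HTTP, so you never need shell access to your own host. The entry it runs would look roughly like this (the URL is a placeholder for your own monitor page):

```
*/30 * * * * wget -q -O /dev/null http://www.example.com/monitor.php
```

`-q` suppresses wget's progress output and `-O /dev/null` discards the downloaded page; the only thing that matters is that the request triggers your PHP script.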
chrisredding Posted February 24, 2007

In my opinion, if your web host doesn't allow CGI, PHP mail() or cron, you should leave them. Certainly in the UK you can get very good hosts that support loads of options. I'm not slagging off the rest of the world; the UK is just my own area of expertise in terms of web hosts. Innit