ferret147 Posted April 20, 2010

I am currently using a cURL command to get the contents of a .txt file from a server. I am doing this every 30 seconds, as that is how often the .txt file is updated. The problem is that using the cURL command is slowing my own server down. Is there a less server-intensive way of grabbing a .txt file from an external server? It can be either the whole .txt file or its contents; either will serve my requirements.
salathe Posted April 20, 2010

Generally, loading a text file from a remote source, using cURL or any other method, should not be slowing your server down at all. Are you sure that cURL is the problem?
ferret147 Posted April 20, 2010

It is 100% the cURL command that is causing the server load. When I turn it off, the load across the 4 processors drops to 2%; when it is running, it is at 86%. The text file is relatively small, it contains a 32-digit number which is updated every 30 seconds, so yes, you would not expect it to cause any problems.
salathe Posted April 20, 2010

Have you tried a plain file_get_contents() call instead?
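For example, a minimal sketch of that swap (the URL is a placeholder and the 5-second timeout is just an assumption; use whatever suits your setup):

// Fetch the remote file without cURL; the stream context stops a slow
// remote server from hanging the request indefinitely.
$context = stream_context_create(array('http' => array('timeout' => 5)));
$scores  = file_get_contents('http://www.example.com/scores.txt', false, $context);
if ($scores === false) {
    // fetch failed: keep the previous copy, log it, etc.
}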
ferret147 Posted April 20, 2010

I did try that function to get the .txt file, but for some reason it did not work, so here is what I did.

Cron page

// Fetch the remote scores file and save a local copy as table1.txt
$table1 = file_get_contents("http://www.worldsnooker.com/livescoring/scores/0910WC/1.txt");
$myFile = "table1.txt";
$fh = fopen($myFile, 'w') or die("can't open file");
$stringData = $table1;
fwrite($fh, $stringData);
fclose($fh);

Display page, using file_get_contents() against my own server

// Read the local copy and slice the fixed-position values out of it
$scores     = file_get_contents("http://www.thesnookerforum.co.uk/table1.txt");
$p1score    = substr($scores, -33, -30); // player 1 score
$p2score    = substr($scores, -30, -27); // player 2 score
$break      = substr($scores, -27, -24); // current break
$current    = substr($scores, -24, -23); // current player indicator (used for the pN images)
$p1frames   = substr($scores, -23, -20); // player 1 frames
$p2frames   = substr($scores, -20, -17); // player 2 frames
$frameNO    = substr($scores, -17, -15); // frame number
$frametimea = substr($scores, -15, -13); // frame time, first part
$frametimeb = substr($scores, -13, -11); // frame time, second part

How I display the data

<table width="500" border="1" align="center" cellpadding="0" cellspacing="0" bgcolor="#FFFFFF">
  <tr>
    <td bgcolor="#FFFFFF"><div align="center">Table 1 - <?php echo $frametimea;?>:<?php echo $frametimeb;?></div></td>
    <td><div align="center">Table 2 - <?php echo $frametimea2;?>:<?php echo $frametimeb2;?></div></td>
  </tr>
  <tr>
    <td bgcolor="#FFFFFF"><table width="200" border="0" align="center" cellpadding="0" cellspacing="0">
      <tr>
        <td width="70" height="33"><div align="center" class="style5"><img src="images/p1<?php echo $current;?>.jpg" /></div></td>
        <td width="64"><div align="center" class="style5">Current Break <?php echo $break; ?></div></td>
        <td width="66"><div align="center" class="style5"><img src="images/p2<?php echo $current;?>.jpg" /></div></td>
      </tr>
      <tr>
        <td height="35" bgcolor="#009933"><div align="center" class="style5">Robertson</div></td>
        <td bgcolor="#009933"><div align="center" class="style5">VS</div></td>
        <td bgcolor="#009933"><div align="center" class="style5">O'Brien</div></td>
      </tr>
      <tr>
        <td colspan="3"><table width="194" border="0" cellspacing="0" cellpadding="0">
          <tr>
            <td width="42" class="style5"><div align="center"><?php echo $p1score; ?></div></td>
            <td width="33" class="style5"><div align="center"><strong><?php echo $p1frames; ?></strong></div></td>
            <td width="65" class="style5"><div align="center">Frame <?php echo $frameNO; ?></div></td>
            <td width="37" class="style5"><div align="center"><strong><?php echo $p2frames; ?></strong></div></td>
            <td width="17" class="style5"><div align="center"><?php echo $p2score; ?></div></td>
          </tr>
        </table></td>
      </tr>
    </table></td>
  </tr>
</table>

So the cron page gets hit every 2 minutes, refreshing the data in my two .txt files, which are exact replicas of the files on the other server; then I split the data up into a displayable format. The resulting page is http://thesnookerforum.com/scores2.php. It's fine when only a few people hit it, but when I put it on the public site the server will crash within 10 minutes. I do get quite a lot of hits, though, as we are the largest independent snooker website. This leads me to believe that in fact it is the PHP I am using to break up the .txt file that is causing the problem now, and not the cURL! I hope my explanation helps a little in understanding what is going on.
JonnoTheDev Posted April 20, 2010

It is definitely your processing logic! You should seriously redesign this process. Once you poll the data you should store it in a database, NOT in text files which are then parsed every time a user hits your website. Your script is highly inefficient in the way it parses the text files.
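A rough sketch of the polling side of that, assuming MySQL via PDO; the DSN, credentials, table and column names are all made up for illustration:

// Hypothetical cron script: fetch the remote file once, parse it, store it.
$raw = file_get_contents("http://www.worldsnooker.com/livescoring/scores/0910WC/1.txt");
if ($raw !== false) {
    $pdo  = new PDO('mysql:host=localhost;dbname=snooker', 'user', 'pass');
    $stmt = $pdo->prepare("REPLACE INTO live_scores (table_no, p1score, p2score, break_score) VALUES (1, ?, ?, ?)");
    $stmt->execute(array(
        substr($raw, -33, -30), // player 1 score
        substr($raw, -30, -27), // player 2 score
        substr($raw, -27, -24), // current break
    ));
}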
monkeytooth Posted April 20, 2010

Does it have to be a .txt file, for that matter? Could you phase it out as an XML file and build a reader to grab the feed, like someone else mentioned, every time a user hits the site rather than every 30 seconds?
JonnoTheDev Posted April 20, 2010

Does it have to be a .txt file, for that matter? Could you phase it out as an XML file and build a reader to grab the feed, like someone else mentioned, every time a user hits the site rather than every 30 seconds?

You don't want to be grabbing any remote data every time a user hits the site; that is a poor design. If the remote data is unavailable, or there is a slow HTTP response, the user could be sat there for a long time or get errors. And if you have many users hitting the site, why grab the same data over and over? Like I said, have a cron job script that polls the data, parses it, and records it to a database. The users on the website should then view the data from the database via a simple query.
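On the display side that reduces to one cheap local query, something along these lines (the table and column names are only placeholders matching the earlier sketch):

// Page hit: read the latest stored scores instead of making a remote HTTP request.
$pdo = new PDO('mysql:host=localhost;dbname=snooker', 'user', 'pass');
$row = $pdo->query("SELECT p1score, p2score, break_score FROM live_scores WHERE table_no = 1")->fetch(PDO::FETCH_ASSOC);
echo $row['p1score'] . ' - ' . $row['p2score'];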
ferret147 Posted April 20, 2010

I think you have hit the nail on the head with what I have to do, Neil; by also storing the data in a database I can do much more with it too. Thanks everybody for your help.
salathe Posted April 20, 2010

Something so simple, as it is at the moment, shouldn't even be making a blip on the server resources. Reading from and writing to a database would likely use more CPU/RAM! If you're really very concerned about that script, then have it generate some static HTML every time it fetches the remote text file and show that static file to your visitors.
ferret147 Posted April 20, 2010

If you're really very concerned about that script, then have it generate some static HTML every time it fetches the remote text file and show that static file to your visitors.

OK, so basically you are saying that the .php file containing the code I posted above should also generate a .html file, instead of running all the PHP every time the page is hit. Great idea, any hints as to how I would achieve this?
salathe Posted April 20, 2010

Great idea, any hints as to how I would achieve this?

Sure, just do as you normally would to display a page with your PHP and save that output to a file. That probably makes most sense to do at the point where you retrieve the text files, so the fetch-txt/parse-txt/convert-to-html/save-html process can all be done in one little script. Point your visitors to the saved file rather than the PHP and their requests won't even need the PHP engine at all. The script itself should be pretty straightforward using pretty much what you already have. The output buffering functions might be useful.
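For instance, the cron script could end up looking roughly like this; the output file name is a placeholder and the parsing is trimmed to two fields for brevity:

// Hypothetical cron script: fetch, parse, render, and save static HTML.
$scores  = file_get_contents("http://www.worldsnooker.com/livescoring/scores/0910WC/1.txt");
$p1score = substr($scores, -33, -30);
$p2score = substr($scores, -30, -27);

ob_start(); // start capturing everything output below
?>
<table>
  <tr><td><?php echo $p1score; ?></td><td><?php echo $p2score; ?></td></tr>
</table>
<?php
$html = ob_get_clean();                    // stop capturing and grab the markup
file_put_contents("scores2.html", $html); // visitors request this static file, no PHP involved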