bpbp Posted September 15, 2007

Hi to all. I use the following code on a classifieds site. It checks for a backlink every 5 days and gives up after 3 tries if it doesn't find a link back to my site. I run this code in the background using require 'adlinkcheck.php'; to load this script when someone is viewing an ad on the site.

<?php
// filename: adlinkcheck.php
$check_link = false;
if (!($ad['website'] == "" || $ad['link_try'] >= 3)) {
    $days = Ad::GetDaysLastCheckWebsite2($ad['link_checked_on']);
    if ($ad['link_checked_on'] == "0000-00-00 00:00:00" || $days >= 5) {
        $website = str_replace("http://", "", str_replace("www.", "", $ad['website']));
        $web_file = $website;
        if (strpos($website, "/"))
            $website = substr($website, 0, strpos($website, "/"));
        $host = gethostbyname($website);
        if ($host != $website) {
            $html = '';
            $f = @fopen("http://".(strpos($ad['website'], "www") !== false ? "www." : "").$web_file, "r");
            while (!@feof($f)) {
                $html .= @fgets($f);
            }
            @fclose($f);
            $html = strtolower($html);
            $html_arr = explode("<a ", $html);
            for ($i = 0; $i < count($html_arr); $i++) {
                $v = $html_arr[$i];
                $v = substr($v, 0, strpos($v, ">"));
                $arr_v = explode(" ", $v);
                foreach ($arr_v as $vv) {
                    $vv = str_replace("\"", "", str_replace("'", "", $vv));
                    if (strpos($vv, "href=") === false)
                        continue;
                    foreach ($config['self_url'] as $url) {
                        if (strpos($vv, $url) !== false)
                            $check_link = true;
                    }
                }
            }
            $ad['link_checked_on'] = date('Y-m-d H:i:s');
            if ($check_link) {
                $ad['link_try'] = 0;
                $ad['link_active'] = 1;
            } else {
                $ad['link_try'] = (int)$ad['link_try'] + 1;
                $ad['link_active'] = 0;
            }
            Ad::SaveLinkData($ad['link_checked_on'], $ad['link_try'], $ad['link_active'], $ad['id']);
        }
    }
}
?>

Here is the issue: this script uses 100% of the CPU, as if it gets stuck somewhere. Without this script running, my server's CPU load is 10%. Any ideas?
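[Editor's note: as an aside for readers, the hand-rolled explode("<a ", ...) parsing above is fragile; PHP's DOM extension does the same job more robustly. This is only a sketch of an alternative, assuming $html already holds the fetched page and $config['self_url'] is the list of the site's own URLs, as in the original script.]

```php
<?php
// Sketch: check whether any <a href> on the fetched page links back to our site.
// Assumes $html holds the page source and $config['self_url'] lists our URLs.
$check_link = false;
$dom = new DOMDocument();
@$dom->loadHTML($html); // suppress warnings from real-world malformed HTML
foreach ($dom->getElementsByTagName('a') as $a) {
    $href = strtolower($a->getAttribute('href'));
    foreach ($config['self_url'] as $url) {
        if (strpos($href, strtolower($url)) !== false) {
            $check_link = true;
            break 2; // one backlink is enough, stop scanning
        }
    }
}
```

This also handles links whose attributes span multiple spaces or quoting styles, which the string-splitting approach can miss.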
Jessica Posted September 15, 2007

Try removing the error suppressors (the @) and see if there are any errors.
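[Editor's note: a quick way to act on this advice while debugging is to turn on full error reporting and check fopen's return value instead of suppressing it. The URL variable here is illustrative.]

```php
<?php
// Surface the errors the @ operators were hiding -- remove before production.
error_reporting(E_ALL);
ini_set('display_errors', '1');

$url = "http://www.example.com/"; // illustrative; the script builds this from $ad['website']
$f = fopen($url, 'r');
if ($f === false) {
    // Without this check, feof(false) never returns true and the
    // while loop spins forever -- the likely cause of the 100% CPU.
    die("Could not open $url");
}
```

Note that feof() on an invalid handle is exactly the kind of failure the suppressed version turns into a busy loop.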
bpbp Posted September 18, 2007 (Author)

I've switched the code to use cURL and now there are no more high loads on the CPU. It's much better, if only for the fact that I can set a timeout.

Removed:

$f = @fopen("http://".(strpos($ad['website'], "www") !== false ? "www." : "").$web_file, "r");
while (!@feof($f)) {
    $html .= @fgets($f);
}
@fclose($f);

Changed to:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $ad['website']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 4);
curl_setopt($ch, CURLOPT_TIMEOUT, 8);
$html = curl_exec($ch);

Hopefully this helps someone here who runs into a similar problem.
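[Editor's note: for anyone adapting this fix, a slightly fuller sketch follows. The timeout values are illustrative, and the error handling and curl_close() call are additions not shown in the original post.]

```php
<?php
// Fetch a page with hard connect/total timeouts so the check can never hang.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $ad['website']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects (e.g. to www.)
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 4);     // seconds allowed to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 8);            // seconds allowed for the whole transfer
$html = curl_exec($ch);
if ($html === false) {
    // Treat a failed fetch like a missing backlink instead of looping.
    error_log('adlinkcheck curl error: ' . curl_error($ch));
    $html = '';
}
curl_close($ch);
```

Checking curl_exec()'s return value matters: on timeout it returns false, and the rest of the script should treat that as "no link found" rather than parse a boolean.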