Stooney Posted May 13, 2009

I currently have the method listed below for checking if a URL is valid. What I need is the fastest possible way to check whether a sizable array of URLs is valid. Here is what I have:

<?php
// Fetch just the first byte of the resource (offset 0, length 1).
// Note: the second argument is use_include_path (a bool), so it
// should be false here, not the string 'FILE_TEXT'.
if (file_get_contents('http://us.php.net/images/php.gif', false, null, 0, 1)) {
    echo 'ok';
} else {
    echo 'fail';
}
?>

Is there anything faster than this? Note: I don't need help with making it work, just need to know if there is a faster method than my file_get_contents way of doing it. Thanks in advance.
ILMV Posted May 13, 2009

Right, when you say valid do you mean it is in a valid format or that the web page exists?
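[If "valid" only means syntactically well-formed, rather than reachable, PHP's built-in filter_var() with FILTER_VALIDATE_URL can check that without any network round-trip. A minimal sketch; the isWellFormed() wrapper name is my own:]

```php
<?php
// Returns true if $url is a syntactically valid URL.
// This says nothing about whether the page actually exists.
function isWellFormed($url) {
    return filter_var($url, FILTER_VALIDATE_URL) !== false;
}

var_dump(isWellFormed('http://us.php.net/images/php.gif')); // bool(true)
var_dump(isWellFormed('not a url'));                        // bool(false)
?>
```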
JonnoTheDev Posted May 13, 2009

Bad method - file_get_contents() is slow. If you want to check that the webpage exists then I would check the HTTP header, i.e. if you get no header the site may be dead. You may also get a 404 header for an invalid URL.

<?php
// Return the first line of the HTTP response (the status line),
// or false if the connection could not be opened.
function httpHeader($url) {
    $urlParts = parse_url($url);
    $fp = @fsockopen($urlParts['host'], 80, $errno, $errstr, 20);
    if (!$fp) {
        return false; // connection failed: host unreachable or dead
    }
    $path = isset($urlParts['path']) ? $urlParts['path'] : '/';
    $out  = "GET ".$path." HTTP/1.1\r\n";
    $out .= "Host: ".$urlParts['host']."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    $content = fgets($fp); // e.g. "HTTP/1.1 200 OK"
    fclose($fp);
    return $content;
}

print httpHeader("http://google.com")."\n";
?>
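[To apply this idea to a whole array of URLs, here is a minimal sketch. The statusCode() helper, the isLive() wrapper, and the choice to treat 2xx/3xx codes as "valid" are my own assumptions, built on the same raw-socket approach as httpHeader() above:]

```php
<?php
// Extract the numeric status code from an HTTP status line
// such as "HTTP/1.1 200 OK"; returns 0 if none is found.
function statusCode($statusLine) {
    if (preg_match('#^HTTP/\d\.\d\s+(\d{3})#', (string)$statusLine, $m)) {
        return (int)$m[1];
    }
    return 0;
}

// Hypothetical liveness check: fetch only the status line over a raw
// socket (same idea as httpHeader() above) and accept 2xx/3xx codes.
function isLive($url) {
    $parts = parse_url($url);
    if (!isset($parts['host'])) {
        return false; // not even a parsable absolute URL
    }
    $fp = @fsockopen($parts['host'], 80, $errno, $errstr, 20);
    if (!$fp) {
        return false; // connection failed
    }
    $path = isset($parts['path']) ? $parts['path'] : '/';
    fwrite($fp, "GET ".$path." HTTP/1.1\r\nHost: ".$parts['host']."\r\nConnection: Close\r\n\r\n");
    $code = statusCode(fgets($fp));
    fclose($fp);
    return $code >= 200 && $code < 400;
}
?>
```

[Usage over an array might then be `foreach ($urls as $url) { echo $url, isLive($url) ? " ok\n" : " fail\n"; }`. For very large lists, checking hosts in parallel (e.g. with curl_multi_exec()) avoids waiting on each slow or dead host sequentially.]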
Archived
This topic is now archived and is closed to further replies.