bibliovermis Posted June 16, 2009
The problem URL is http://www.neobux.com/rel/bl/?o=D4B44A9F3179942FCED0E440FF71F64F828DC2990AD18F9E. It can be accessed in a browser, but file_get_contents() times out and returns the error "failed to open stream: HTTP request failed!" I have tried using cURL to mimic a browser, with no success. The URL loads perfectly fine on a machine that has never visited the neobux site, so it isn't some kind of tracking issue. Thank you for any insight into this perplexing problem.
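For reference, a minimal sketch of the kind of call that fails here. The short timeout and the error_get_last() check are my own additions for diagnostics, not part of the original attempt:

$url = 'http://www.neobux.com/rel/bl/?o=D4B44A9F3179942FCED0E440FF71F64F828DC2990AD18F9E';

// A short timeout so the script fails fast instead of hanging;
// error_get_last() exposes the suppressed warning afterwards.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 10,
    ),
));

$body = @file_get_contents($url, false, $context);

if ($body === false) {
    $err = error_get_last();
    echo 'Request failed: ' . $err['message'] . "\n";
} else {
    // $http_response_header is populated by the http wrapper on success.
    echo $http_response_header[0] . "\n";
    echo strlen($body) . " bytes received\n";
}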
trq Posted June 16, 2009
Likely URL wrappers are disabled in your PHP configuration. Take a look at your php.ini file.
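A quick way to confirm at runtime whether the http:// wrapper is actually available (a small sketch, not from the original post):

// Quick runtime check: is the http:// wrapper usable in this environment?
var_dump(ini_get('allow_url_fopen'));                 // "1" when enabled
var_dump(in_array('http', stream_get_wrappers()));    // true when the wrapper is registered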
bibliovermis Posted June 16, 2009 Author
Likely URL wrappers are disabled in your PHP configuration.
My apologies for not clarifying that file_get_contents() works for every other URL. I did try explicitly turning on the allow_url_fopen flag via .htaccess and using the full fopen() routine, with no change in the result.
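A sketch of what the fopen()-based routine might look like with a stream context supplying a browser-like User-Agent. The header value is an assumption on my part; the original post does not show the exact code used:

$url = 'http://www.neobux.com/rel/bl/?o=D4B44A9F3179942FCED0E440FF71F64F828DC2990AD18F9E';

// The User-Agent string here is illustrative, not from the original attempt.
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'GET',
        'header'  => "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) Firefox/3.0.11\r\n",
        'timeout' => 15,
    ),
));

$fp = @fopen($url, 'r', false, $context);
if ($fp === false) {
    echo "fopen() failed as well\n";
} else {
    $html = stream_get_contents($fp);
    fclose($fp);
    echo strlen($html) . " bytes received\n";
}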
bibliovermis Posted June 17, 2009 Author
After further testing, no part of the neobux.com site is accessible by a script or browser emulator. http://www.hashemian.com/tools/browser-simulator.htm Has anybody ever seen a site that is perfectly functional in a browser yet invisible to a browser emulator?
xphoid Posted June 17, 2009
They appear to be using a fairly restrictive proxy server (the response header shows Server: proxyshield-proxy4). I did manage to get the page contents using cURL mimicking Firefox.

// Request headers that mimic a real Firefox request.
$header = array();
$header[] = 'Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5';
$header[] = 'Cache-Control: max-age=0';
$header[] = 'Connection: keep-alive';
$header[] = 'Keep-Alive: 300';
$header[] = 'Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7';
$header[] = 'Accept-Language: en-us,en;q=0.5';
$header[] = 'Pragma: ';

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.neobux.com/rel/bl/?o=D4B44A9F3179942FCED0E440FF71F64F828DC2990AD18F9E');
// Present a Firefox user-agent string.
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.11) Gecko/2009060215 Firefox/3.0.11 (.NET CLR 3.5.30729)');
curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
// Set the Referer header automatically when following redirects.
curl_setopt($ch, CURLOPT_AUTOREFERER, true);
// Return the body as a string instead of printing it directly.
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// Follow any Location: redirects.
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
// Empty string = accept any encoding cURL supports (gzip, deflate).
curl_setopt($ch, CURLOPT_ENCODING, '');
// Give up after 20 seconds.
curl_setopt($ch, CURLOPT_TIMEOUT, 20);

$result = curl_exec($ch);
curl_close($ch);
echo $result;
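If the page still comes back empty, replacing the final curl_exec()/curl_close()/echo block with something like the following surfaces cURL's own error message and the HTTP status. This is my own addition, not part of xphoid's snippet:

$result = curl_exec($ch);

if ($result === false) {
    // curl_error() explains why the transfer failed (timeout, DNS, blocked, ...).
    echo 'cURL error: ' . curl_error($ch) . "\n";
} else {
    // CURLINFO_HTTP_CODE gives the final HTTP status after any redirects.
    echo 'HTTP status: ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    echo $result;
}

curl_close($ch);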
MockY Posted August 23, 2011
I know this is a very old thread, but I would just like to thank xphoid for providing me with a solution to my current problem. You saved my rear end.
Archived
This topic is now archived and is closed to further replies.