ezekielle Posted July 26, 2007

I'm having trouble accessing URLs on a remote server. For example:

<?php
$myFile = "http://www.google.com";
$c = fopen($myFile, "r");
echo $c;
?>

This times out. I'm not sure how to configure the wrappers to allow or disallow reading from URLs. I'm brand new to PHP, and I've spent a couple of hours searching, trying to figure this out. Any suggestions?
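The usual reason fopen() on an http:// URL fails outright is that allow_url_fopen is switched off in php.ini, and note that echoing the handle only prints a resource id, not the page itself. A quick check along these lines, assuming a stock PHP setup (the URL is only an example):

<?php
// Sketch: confirm the http:// wrapper is usable, then read the whole stream.
if (!ini_get('allow_url_fopen')) {
    die("allow_url_fopen is disabled in php.ini, so fopen() cannot open URLs.");
}

$handle = fopen("http://www.google.com", "r");
if ($handle === false) {
    die("Could not open the URL.");
}
echo stream_get_contents($handle); // prints the response body, not "Resource id #1"
fclose($handle);
?>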
tomasd Posted July 26, 2007

Not sure if that's what you want, but anyway...

<?php
$ch = curl_init("http://www.example.com/");
$fp = fopen("example_homepage.txt", "w");

curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);

curl_exec($ch);
curl_close($ch);
fclose($fp);
?>

Also have a look at the curl manual: http://php.net/curl
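Since the original request times out, it can also help to put explicit limits on the transfer. A sketch building on the example above, returning the page as a string instead of writing a file (the URL and timeout values are only placeholders):

<?php
$ch = curl_init("http://www.example.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);     // seconds to wait for the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 10);           // seconds to wait for the whole transfer

$page = curl_exec($ch);
if ($page === false) {
    echo "curl error: " . curl_error($ch);
}
curl_close($ch);
?>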
DeadEvil Posted July 26, 2007

.. or you can use the file_get_contents function

<?php
$myFile = file_get_contents("http://www.google.com/some_files.ext");
echo $myFile;
?>
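file_get_contents() also relies on allow_url_fopen being enabled, and by default it waits as long as default_socket_timeout allows. A sketch that passes a stream context to cut the wait short (the URL and timeout are only examples):

<?php
// Sketch: fetch a URL with file_get_contents() but give up after 10 seconds.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 10,   // seconds before the read is abandoned
    ),
));

$page = file_get_contents("http://www.google.com/", false, $context);
if ($page === false) {
    echo "Could not fetch the page.";
} else {
    echo $page;
}
?>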
Danltn Posted July 27, 2007

Preferred way:

<?php
// Wrapped in a function so the return statements make sense as written.
function fetch_url($url)
{
    if (function_exists('curl_init')) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        @curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // may fail under safe_mode / open_basedir, hence the @
        curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
        $data = curl_exec($ch);
        curl_close($ch);
        return $data;
    } else {
        return file_get_contents($url);
    }
}
?>
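Called like this (the fetch_url name above is just illustrative):

<?php
$html = fetch_url("http://www.google.com/");
if ($html !== false) {
    echo strlen($html) . " bytes fetched";
}
?>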