primetch Posted February 12, 2008

How can a PHP-based website download pages off of other websites (news-related websites, for example) and store them in a temporary location, retrieving part of the file and displaying it within a page? I did a quick search for HTTP get but didn't get any results. (I programmed AutoIt before PHP, if that's any help.)
trq Posted February 12, 2008

You can simply fetch files into a variable using file_get_contents().
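A minimal sketch of that approach, covering the original question's "download, cache, display part of it" steps. The function name fetchAndCache(), the example URL, and the cache path are illustrative, not part of the thread's code; reading http:// URLs with file_get_contents() requires allow_url_fopen to be enabled in php.ini.

```php
<?php
// Fetch a page (remote URL or local path) into a string and keep a
// temporary local copy. Returns the contents, or false on failure.
// Note: fetchAndCache() is a hypothetical helper for illustration.
function fetchAndCache($url, $cacheFile)
{
    // @ suppresses the warning file_get_contents() emits on failure;
    // we handle failure via the false return value instead.
    $contents = @file_get_contents($url);
    if ($contents === false) {
        return false; // download failed
    }
    file_put_contents($cacheFile, $contents); // store the temporary copy
    return $contents;
}

// Usage: grab a page, then display only part of it within your page.
$html = fetchAndCache('http://www.example.com/', '/tmp/page_cache.html');
if ($html !== false) {
    echo substr(strip_tags($html), 0, 500); // first 500 chars of the text
}
?>
```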
rhodesa Posted February 12, 2008

Pretty much all of the functions for handling local files will work with remote files too. Here is an example:

```php
<?php
function getTitle($url) {
    $expire_time = 15 * 60; // 15 minutes
    $temp_file = '.tmp_' . md5($url); // make a unique cache filename

    // Check the cache
    if (is_file($temp_file) && filemtime($temp_file) + $expire_time > time())
        return file_get_contents($temp_file);

    // Get the contents
    $contents = file_get_contents($url);

    // Find the title (case-insensitive, allow newlines inside the tag)
    if (!preg_match("/<title>(.+?)<\/title>/is", $contents, $matches))
        return false;

    // Update the cache
    file_put_contents($temp_file, $matches[1]);
    return $matches[1];
}

print getTitle("http://www.phpfreaks.com/forums/index.php");
?>
```
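If allow_url_fopen is disabled on the server, the cURL extension is the usual alternative for the download step. This is a sketch, assuming the cURL extension is installed; fetchWithCurl() is an illustrative helper name, not something from the thread.

```php
<?php
// Fetch a URL with cURL instead of file_get_contents().
// Returns the response body as a string, or false on failure.
// Note: fetchWithCurl() is a hypothetical helper for illustration.
function fetchWithCurl($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever on a slow site
    $contents = curl_exec($ch);
    curl_close($ch);
    return $contents;
}
?>
```

The result can then be cached and parsed exactly as in the getTitle() example above.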
Archived
This topic is now archived and is closed to further replies.