
How can a PHP-based website download pages off of other websites (news-related websites, for example) and store them in a temporary location, retrieving part of the file and displaying it within a page?

 

I did a quick search for HTTP get but didn't get any results. (I programmed AutoIt before PHP, if that's any help.)

https://forums.phpfreaks.com/topic/90700-http-download-from-website/

Pretty much all of the functions for handling local files will work with remote files too (as long as `allow_url_fopen` is enabled in php.ini). Here is an example:

 

<?php
  function getTitle($url) {
    $expire_time = 15 * 60;           // cache results for 15 minutes
    $temp_file = '.tmp_' . md5($url); // unique cache filename per URL

    // Serve from the cache if the file exists and has not expired
    if (is_file($temp_file) && filemtime($temp_file) + $expire_time > time())
      return file_get_contents($temp_file);

    // Fetch the remote page (requires allow_url_fopen to be enabled)
    $contents = file_get_contents($url);
    if ($contents === false)
      return false;

    // Find the title
    if (!preg_match("/<title>(.+)<\/title>/", $contents, $matches))
      return false;

    // Update the cache
    file_put_contents($temp_file, $matches[1]);

    return $matches[1];
  }
  print getTitle("http://www.phpfreaks.com/forums/index.php");
?>
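One caveat: the `preg_match()` pattern above is case-sensitive, greedy, and assumes a bare `<title>` tag with no attributes, so it will miss `<TITLE>` or `<title lang="en">`. A more tolerant extraction helper might look like the sketch below (`extractTitle()` is just an illustrative name, not part of the code above):

```php
<?php
// Sketch of a more tolerant title extractor. The 'i' modifier makes the
// match case-insensitive, 's' lets the title span line breaks, the lazy
// (.*?) stops at the first closing tag, and [^>]* tolerates attributes
// on the opening <title> tag.
function extractTitle($html) {
    if (preg_match('/<title[^>]*>(.*?)<\/title>/is', $html, $m))
        return trim($m[1]);
    return false;
}

print extractTitle("<html><head><TITLE>\nHello World\n</TITLE></head></html>");
// prints "Hello World"
?>
```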
