json_encode question...


cowboysdude


<?php
include('simple_html_dom.php');

// Create DOM from URL or file
// (file_get_html() fetches the page itself, so no separate curl request is needed)
$html = file_get_html('http://sportsfeedia.com/nfl/');

// Find all links and their text
foreach ($html->find('a[class=itemtitle]') as $elm) {
	$link = $elm->href;
	$text = $elm->plaintext;
	$news = str_replace("new", "*", $text);
?>
<?= $news ?><br>
<a href="<?= $link ?>">Read More</a><br>
<?php
}

That's what I have so far and it's working great. However, I've been trying all day to take the contents of $link = $elm->href; and

$text = $elm->plaintext; and json_encode them to data.json, so that I don't have to hit the server every time I or a user gets onto the site...

 

I don't want to be a resource hog here... I would like to be polite about it, so I can check it 3 to 4 times a day rather than 100 times a day. That's just rude.

 

So any help would be very much appreciated!

 

Thanks

John 

https://forums.phpfreaks.com/topic/295872-json_encode-question/
You can store the data in a text file. You would only perform the request if the file does not exist or when a certain time period has passed (e.g. every 6 hours).

 

Example code

<?php

define('UPDATE_INTERVAL', 60*60*6); // set update interval, e.g. 6 hours
define('DATAFILE', 'data.json');    // file to store the data in

$timeSinceUpdated = 0;

// if the datafile exists
if (file_exists(DATAFILE))
{
	// get the time when the datafile was last updated/created
	$lastModified = filemtime(DATAFILE);
	// get the time passed since the file was updated/created
	$timeSinceUpdated = time() - $lastModified;

	// decode the data from the datafile
	$data = file_get_contents(DATAFILE);
	$data = json_decode($data);
}

// if the file does not exist, or the update interval has passed,
// fetch the page again and update the data
if (!file_exists(DATAFILE) || $timeSinceUpdated >= UPDATE_INTERVAL)
{
	include('simplehtmldom_1_5/simple_html_dom.php');
	// Create DOM from URL or file
	// (file_get_html() fetches the page itself; no separate curl request needed)
	$html = file_get_html('http://sportsfeedia.com/nfl/');

	// overwrite the data
	$data = array();
	// Find all links and their text
	foreach ($html->find('a[class=itemtitle]') as $elm)
	{
		$link = $elm->href;
		$text = $elm->plaintext;
		$news = str_replace("new", "*", $text);

		// add the new data to the array
		$data[] = array($link, $text, $news);
	}

	// json encode the data and save it to the datafile
	file_put_contents(DATAFILE, json_encode($data));
}

// output links
foreach ($data as $contents)
{
	list($link, $text, $news) = $contents;
?>
<?= $news ?><br>
<a href="<?= $link ?>">Read More</a><br>
<?php
}
?>
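One possible refinement, not part of the code above: if two visitors hit the site just as the update interval expires, one request could read data.json while the other is still writing it. A common fix is to write the JSON to a temporary file first and then rename() it into place, since rename() is atomic on the same filesystem. A minimal sketch (the `save_cache` helper name is made up for the example):

```php
<?php
// Sketch of a safer cache write: write the JSON to a temporary file first,
// then rename() it over data.json. rename() is atomic on the same
// filesystem, so a reader never sees a half-written cache file.
function save_cache($file, $data)
{
    $tmp = $file . '.tmp';
    file_put_contents($tmp, json_encode($data));
    rename($tmp, $file); // atomically replaces the old cache file
}

$data = array(
    array('http://example.com/story-1', 'Story 1', 'Story 1'),
);
save_cache('data.json', $data);

// reading it back works exactly as before; passing true to json_decode()
// returns plain PHP arrays instead of stdClass objects
$restored = json_decode(file_get_contents('data.json'), true);
var_dump($restored === $data); // bool(true)
```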

 

WOW... that was way beyond what I was thinking, and probably why I was having such a difficult time with it! Thank you so much!

 

It now loads instantly! Speeds things right up, besides not being a jerk and hitting the server each time!

 

It really solved more than one problem! When I was trying to store it, it would only store the first record; NOW it's storing what I want. I will study this, because I've run across this issue in other scripts I've tried to write...

 

The other problem was that I was trying to write another foreach statement to get it to store all the data... NOW it does. I see what it does here: it gets the info, stores it, then opens the file for display instead of hitting the server!

 

I was just thinking this morning that there's more than one way to skin a cat... you just showed me I was right! I guess with programming, if you limit your mind, you limit your possibilities. Learning is fun and I really like it...

 

NOW I have to learn MySQL storage so I can do pagination... So far everything I've looked at looks difficult because I'm new, and security issues are always a concern.
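For what it's worth, pagination is mostly two pieces: a prepared statement (bound parameters are what addresses the SQL-injection worry) and a LIMIT/OFFSET clause. A minimal sketch using PDO; the `news` table and its columns are made up for the example, and SQLite is used so it runs without a server (with MySQL only the DSN and credentials change):

```php
<?php
// Pagination sketch with PDO (hypothetical `news` table; SQLite in-memory
// database so the example is self-contained).
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE news (id INTEGER PRIMARY KEY, link TEXT, text TEXT)');

// insert sample rows with a prepared statement
// (bound parameters protect against SQL injection)
$ins = $pdo->prepare('INSERT INTO news (link, text) VALUES (?, ?)');
for ($i = 1; $i <= 25; $i++) {
    $ins->execute(array("http://example.com/story-$i", "Story $i"));
}

$perPage = 10;
// the page number would normally come from $_GET['page']; cast it to int
// and clamp it to at least 1 before using it
$page = max(1, (int) '2');
$offset = ($page - 1) * $perPage;

// LIMIT/OFFSET does the actual paging
$stmt = $pdo->prepare('SELECT link, text FROM news ORDER BY id LIMIT ? OFFSET ?');
$stmt->bindValue(1, $perPage, PDO::PARAM_INT);
$stmt->bindValue(2, $offset, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

echo count($rows) . " rows, first: " . $rows[0]['text'] . "\n";
```

With 25 rows and 10 per page, page 2 returns rows 11 through 20.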

 

Thank you so very much for your time and patience!

Archived

This topic is now archived and is closed to further replies.
