
PHP Dynamic Cache


justin7410

Recommended Posts

Hey Guys,

 

So currently I have a website that was my first web development project (learning how to program and do web design). It has now generated a good amount of traffic and users, and although I am not greatly educated in computer science, I am learning my way toward getting a better product to my end users.

 

The problem:

I can see my bounce rate is not great, and that my users are landing on a page that is slow to load. The website serves images and text that are fed from database queries and then output.

 

Solution:

My users were coming to a page, and every one of them forced the server to rebuild the same page and data over and over. I learned that a technique called dynamic caching would solve this: store the query results and the rendered output once, then serve that stored content to all the users viewing the site. This should accelerate the page speed dramatically.

 

So I understand what my problem is and what the solution is. Now for my

 

Issue:

 

I don't fully understand how to implement this from what I keep reading and researching.

 

I do know you're supposed to work with the ob_*() output-buffering functions, which handle buffering the page output.

 

My problem is that from the beginning I was using ob_start() at the top of my site because of redirect issues whenever I called header(). To be honest, I never fully understood why ob_start() solved that issue, but now I find myself having problems again with the "headers already sent" error.
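To illustrate why ob_start() makes the header() problem go away, here is a minimal sketch (the header name is just an example, not anything your site uses): with buffering on, output is held in memory instead of being flushed to the client, so headers can still be modified afterwards.

```php
<?php
// Minimal sketch: with output buffering on, the echo below is held in
// memory rather than sent to the client, so header() can still run.
ob_start();

echo 'some page output';          // buffered, not yet sent

header('X-Example: still-works'); // fine: nothing has actually gone out

// Without the ob_start() above, the header() call would fail with
// "headers already sent", because echo would have flushed output first.
ob_end_flush();                   // send the buffer and headers together
```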

 

I tried looking into platforms like Varnish and phpfastcache, but that led to some issues due to my lack of OOP programming skill, and I was not able to get them working.

 

I then tried to write some simple code that would create a cache for the page the user is on: store that output in a tmp file, then look for that file when a user first arrives. If the file exists and is within, say, a 5-minute span, load it; if not, generate a new one.

 

All in all, I think I am just failing at this due to my lack of understanding of the core concept of what I am actually trying to do.

 

what i have so far:

 

function cache_file() {
    // something to (hopefully) uniquely identify the resource
    $cache_key = md5($_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'] . $_SERVER['QUERY_STRING']);
    $cache_dir = '/tmp/phpcache';

    return $cache_dir . '/' . $cache_key;
}

$cache_file = cache_file();
echo $cache_file;
die();

 

RESULTING IN:

 

/tmp/phpcache/38bf4baa8867cb2feec947ac3c893fe9

 

 

ISSUE: I am not returning a query string. I understand sometimes one won't be present, and I am not sure if this is even a problem.
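One thing worth noting (an observation, not something from the thread): $_SERVER['REQUEST_URI'] already contains the query string, so appending QUERY_STRING on top of it duplicates the information rather than adding any. A key built from host plus request URI alone should already distinguish pages, including their query strings. The function name below is made up for illustration:

```php
<?php
// Sketch: REQUEST_URI already includes the query string ("/watch.php?id=5"),
// so host + request URI is enough to identify the resource uniquely.
function cache_key_for($host, $request_uri) {
    return md5($host . $request_uri);
}

// Hypothetical example values:
$key_a = cache_key_for('example.com', '/watch.php?id=5');
$key_b = cache_key_for('example.com', '/watch.php?id=6');

var_dump($key_a !== $key_b); // different URLs give different keys
```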

 

So now each page is returning its own $cache_file path.

 

// if we have a cache file, deliver it
if (is_file($cache_file = cache_file())) {
    readfile($cache_file);
    exit;
} else {
    echo 'NO FILE FOUND, START A NEW ONE';
}

 

ISSUE: No file is being found, so the is_file() conditional never evaluates to true.

 

// cache via output buffering, with a callback that writes the buffer to the cache file
ob_start('cache_output'); // note: a cache_output() callback still has to be defined

 

ISSUE: Unless ob_start() is the first thing in my code, I get the "headers already sent" error:

 

so if I have something like:

echo cache_file();

ob_start();

resulting in:

 

/tmp/phpcache/38bf4baa8867cb2feec947ac3c893fe9
Warning: session_start() [function.session-start]: Cannot send session cache limiter - headers already sent (output started at /home/justin/public_html/include/init.php:20) in /home/justin/public_html/include/init.php on line 27

 

 

 

//
// expensive processing happens here, along with page output.
//

 

Now at this point I am guessing that I need to return the temp file and the content specific to that cached page. I have been struggling to get the cache set up correctly, so I have not even dealt with this part yet. I am also somewhat lost on what is meant by "expensive processing"; I am assuming this is the part that takes all the server time, i.e. the queries and processing needed to build the page.
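For what it's worth, the whole pattern can be sketched end to end in a few lines. This is only a rough outline, assuming a writable temp directory and a 5-minute lifetime; the "expensive processing" slot is where the existing queries and page-building code would go (the fallback host/URI values are only so the snippet runs outside a web server):

```php
<?php
// Minimal full-page cache sketch. Host/URI fall back to sample values so
// the snippet also runs outside a web server.
$host = $_SERVER['HTTP_HOST']   ?? 'example.com';
$uri  = $_SERVER['REQUEST_URI'] ?? '/index.php';

$ttl        = 300; // seconds a cached copy stays valid (5 minutes)
$cache_dir  = sys_get_temp_dir() . '/phpcache';
$cache_file = $cache_dir . '/' . md5($host . $uri) . '.html';

// 1. Fresh copy on disk? Serve it and stop -- no queries run at all.
if (is_file($cache_file) && (time() - filemtime($cache_file)) < $ttl) {
    readfile($cache_file);
    exit;
}

// 2. Otherwise buffer everything the page prints from here on.
ob_start();

// --- "expensive processing": the existing queries and page output ---
echo '<html><body>built at ' . date('H:i:s') . "</body></html>\n";
// --------------------------------------------------------------------

// 3. Save the finished page, then send it to this visitor too.
if (!is_dir($cache_dir)) {
    mkdir($cache_dir, 0755, true);
}
file_put_contents($cache_file, ob_get_contents());
ob_end_flush();
```

The first visitor within each 5-minute window pays the full cost; everyone after that gets the stored copy.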

 

All feedback and suggestions are very much appreciated. I am not so much looking for an answer per se; I would just like to get a better idea of whether I am on the right track, and what I am lacking or missing, fundamentally or just code-wise.

 

thanks guys

 

-Justin7410

 

Link to comment
Share on other sites

before you can attempt to fix a problem, you must find out what the cause of the problem is. everything you have mentioned won't help much if the page(s) themselves are outputting huge image/media files. caching could help with dynamically produced images, where any one image is reused.

 

what is the amount of html markup on a typical page and what is the total size of the image/media data? where are the images stored (file system or in a database table)? are images dynamically being produced/manipulated?

Link to comment
Share on other sites

 Hey mac, 

 

thanks for the feedback, 

 

so let's just say I am taking my index page, with about 60 images, all dynamic in terms of content being updated by votes, ratings, and new insertions... many of these images stay on the site for days, weeks, sometimes even never changing due to popularity.

 

45 of these images come from a database table I have, and those images are being pulled from an Amazon web server.

 

15 of them are stored as files on the server.

 

none of the images is larger than 30 KB (the logo), and the average is around 17-18 KB.

 

in terms of total markup size, the page is about 3.24 MB.

 

thanks again mac

Link to comment
Share on other sites

the expensive processing you asked/mentioned in your first post assumes you are doing a huge amount of processing (taking a second or more) to produce the markup on the page. at this point i actually doubt this is the case, but does your page have any sort of microtime() code determining how long it takes the page to be generated, and you are either displaying this on the page or logging the values?
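if no such timing code exists yet, a minimal sketch looks like this (the usleep() call just stands in for the real page-building work):

```php
<?php
// sketch: measure how long the page takes to generate
$page_start = microtime(true); // as a float, at the very top of the script

// ... all queries and page building happen here ...
usleep(50000); // stand-in for real work (~50 ms)

$elapsed = microtime(true) - $page_start;

// display or log the result; under roughly a second, full-page
// caching of the html gains very little
printf('page generated in %.3f seconds', $elapsed);
```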

 

if your page is taking a fraction of a second to be generated on the server, there's little to be gained by caching the resulting html of the page.

 

if your page is taking longer than a second to generate, the first step is to optimize the code, rather than to slap a band-aid on top of it to compensate for a long page generation time.

 

 

45 of these images are from database sources from a table i have and those images are being pulled from an amazon web server.

 

 

you need to be more specific about this. what is the typical processing, how many database queries are there? are you reading the images and storing them on your server, then serving them to the client by putting a url at your server into the markup or are you putting amazon's url for the image directly into the markup?

 

if you are putting amazon's url into the markup, and perhaps you are required to do so to satisfy their terms of service, there's actually nothing you can do to speed up the page loading since your site isn't involved at all with these images. it is the browser that is requesting the images from the url you are putting into the markup.

 

it doesn't sound like you are dynamically producing/manipulating images using GD functions? only using static images, which may be added/removed over time, but the actual image file exists as a real static file.

Link to comment
Share on other sites

the expensive processing you asked/mentioned in your first post assumes you are doing a huge amount of processing (taking a second or more) to produce the markup on the page. at this point i actually doubt this is the case, but does your page have any sort of microtime() code determining how long it takes the page to be generated, and you are either displaying this on the page or logging the values?

 

if your page is taking a fraction of a second to be generated on the server, there's little to be gained by caching the resulting html of the page.

 

if your page is taking longer than a second to generate, the first step is to optimize the code, rather than to slap a band-aid on top of it to compensate for a long page generation time.

 

 

 

I am getting these numbers logged by various website measurement sources. The owner of the server ran his own timings and noticed that the website was running extremely slow compared to the rest of the server; he told me the best thing to do would be to dynamically cache the data that is being viewed by multiple users at once. The website is loading in about 13 seconds, compared to an average of 3-4 seconds.

 

you need to be more specific about this. what is the typical processing, how many database queries are there? are you reading the images and storing them on your server, then serving them to the client by putting a url at your server into the markup or are you putting amazon's url for the image directly into the markup?

 

I am running 8 queries, not including the sign-in query and the user_data query for gathering sessions and the user ID.

 

No, I have 15 logo and social-media images on file, and the content images generated from Amazon go directly into the markup via a function that converts an md5 hash into a URL. That part works at a very efficient speed on other websites my co-workers are running simultaneously, so I am certain the problem is my data being rebuilt and downloaded over and over, instead of the server sending a stored cache (or the page not even touching the database and just outputting the correct content).

Link to comment
Share on other sites

Hey Guys,

 

so I made some progress on what I am trying to accomplish here, but I still have a couple of issues at hand that I can't seem to figure out.

$current_file = basename($_SERVER['SCRIPT_NAME']);
$cache_dir = 'cache';

function cache_key($current_file) {
    $pages_with_no_id = array('index.php');
    $pages_with_id    = array('watch.php', 'browse.php');

    if (in_array($current_file, $pages_with_no_id)) {
        $cache_key = md5($_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']);
    } elseif (in_array($current_file, $pages_with_id)) {
        // $_GET is an array and can't be concatenated into a string;
        // use the raw query string instead
        $cache_key = md5($_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'] . $_SERVER['QUERY_STRING']);
    } else {
        // fall back so $cache_key is always defined
        $cache_key = md5($_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']);
    }

    return $cache_key . '.php';
}

$file     = cache_key($current_file);
$dir_file = $cache_dir . '/' . $file;

if (file_exists($dir_file)) {
    // include the path directly; wrapping it in extra quote characters
    // makes PHP look for a file whose name literally contains quotes
    include $dir_file;
} else {
    $file_handler = fopen($dir_file, 'w+') or die('error writing new file');
}

So now I have set up each page with a key, with its cache file stored in my cache folder. If a file already exists in the folder, it just gets included; if not, a new one is created.

 

My main issues are 2 things:

 

Now, under the else condition, when the new file has been created, I want to insert the content of the generated page into the cache file.

 

so essentially after:

$file_handler = fopen($dir_file, 'w+') or die('error writing new file');

i want to then add something to this extent:

fwrite($file_handler, $content);

but I am not sure how to get the variable $content to hold the output of the entire page that I want cached.
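One way to get $content (a sketch of the standard output-buffering approach, not necessarily the only way): start buffering before the page is built, then let ob_get_clean() hand back everything that was echoed.

```php
<?php
// Sketch: capture the page's output into a variable instead of printing it.
ob_start();

// ... everything the page normally echoes goes here ...
echo '<p>hello from the page</p>';

// ob_get_clean() returns the buffered output and discards the buffer.
$content = ob_get_clean();

// Now $content can be written to the cache file and also sent to the visitor.
file_put_contents(sys_get_temp_dir() . '/demo_cache.html', $content);
echo $content;
```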

 

My second issue is: how do I approach timestamping the cached page so that I can tell whether the cache is still fresh or needs to be regenerated?
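On the timestamp question, one common approach (sketched below, with a made-up helper name) is to not store a timestamp at all: the filesystem already records when the file was written, filemtime() reads it back, and freshness becomes a simple age comparison.

```php
<?php
// Sketch: decide whether a cache file is still fresh from its mtime.
function cache_is_fresh($path, $max_age_seconds) {
    clearstatcache(true, $path); // make sure we read the current mtime
    return is_file($path) && (time() - filemtime($path)) < $max_age_seconds;
}

$demo = sys_get_temp_dir() . '/fresh_demo.html';
file_put_contents($demo, 'cached page');

var_dump(cache_is_fresh($demo, 300));              // just written: fresh
var_dump(cache_is_fresh($demo . '.missing', 300)); // no such file: stale
```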
 

Any suggestions would be huge, guys.

 

I am doing my best to get this solved, but any push in the right direction would help.

Edited by justin7410
Link to comment
Share on other sites
