
Efficiency of using PHP and MySQL to generate a page vs. having "cached" copies


physaux


Hey guys, I was hoping someone here could advise me on a matter...

 

I am going to have a website with roughly 300 pages. They will have mostly static sections (which only change when a power user edits them, Wikipedia-style) and some dynamic parts (like the login area, the time, notices, etc.).

 

I was wondering: do you think I should develop some sort of "cache" system, where I process the page every time it is edited (or every 14 hours or so), and then when somebody loads it, I just "include" the cached data and attach the dynamic PHP content?
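To make it clearer what I mean, here is a rough sketch of the kind of thing I'm picturing. The page_cache/ folder, the "pages" table and its columns, and the database credentials are all made-up examples, not code I actually have:

<?php
// Rough sketch only -- the page_cache/ folder, the "pages" table and
// the fake DB credentials below are made-up examples.

$pageId    = isset($_GET['id']) ? (int) $_GET['id'] : 1;
$cacheFile = __DIR__ . '/page_cache/page_' . $pageId . '.html';

if (is_file($cacheFile)) {
    // The static body was already generated (on the last edit or the
    // last cache miss), so just read it back from disk.
    $staticBody = file_get_contents($cacheFile);
} else {
    // Build the static body from MySQL once, then save it for next time.
    $db   = new mysqli('localhost', 'user', 'pass', 'mydb');
    $stmt = $db->prepare('SELECT body FROM pages WHERE id = ?');
    $stmt->bind_param('i', $pageId);
    $stmt->execute();
    $stmt->bind_result($staticBody);
    $stmt->fetch();
    $stmt->close();

    file_put_contents($cacheFile, $staticBody);
}

// The dynamic bits are still generated by PHP on every request.
echo '<p>Time: ' . date('H:i:s') . '</p>';
echo $staticBody;

// When a power user edits a page, the edit script would just
// unlink($cacheFile) so the next visitor rebuilds it from MySQL.
?>

The idea being that the MySQL query only runs when the cached file is missing, and an edit simply deletes the file so it gets rebuilt on the next view.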

 

Would doing this be more efficient than generating the whole page from MySQL for each user?

Thanks!

Well, I want it to be "scalable", but how much traffic? Hmm, I would say about 1.4 million views per day, spread out over 30,000 different pages. So about 50 views per page.

 

How is that?

And how would I go about caching? Care to point me to a resource if it is appropriate?

thanks

I found this code; would anyone care to comment on or advise about it?


<?php
$cacheFile = $_SERVER['SCRIPT_FILENAME'] . '.cached';

// If a cached copy exists and is less than an hour old, send it and stop.
if (is_file($cacheFile) && (time() - filemtime($cacheFile) < 3600)) {
    readfile($cacheFile);
    exit;
}

// (the PHP script itself goes here -- it should build the finished
//  page as a string in $out instead of echoing it directly)

// Send the freshly generated page to the visitor...
echo $out;

// ...and write it to the cache file for the next hour's visitors.
$fp = fopen($cacheFile, 'w');
fwrite($fp, $out);
fclose($fp);
?>

 

Thanks!
