Archived

This topic is now archived and is closed to further replies.

hegtv

Caching of highly dynamic content

I'm working on a PHP forum as a hobby project more than anything, just to see what I can come up with, but speed and efficiency are still somewhat of a concern to me.  I've read numerous articles advocating the caching of database content and providing or detailing various classes to do it.  All of the classes and other caching methods I have seen seem to involve either a fixed content lifetime or specific times of day when content may change.

Now in the context of a forum, data can change very rapidly, possibly from one second to the next.  But the vast majority of content is likely to stay the same for at least ten or so queries, if not significantly more, so it doesn't make sense to re-fetch the bulk of a page's data when it hasn't changed.  My question is: is there a good way to implement a caching system that caches any data that hasn't changed since the last query but fetches only the new or updated data?  All the examples I have seen cache an entire recordset, so without making separate queries for each row (which I imagine would go beyond defeating my goal of increased efficiency), I can't think of a good way to pull this off.
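To sketch the sort of per-row caching I have in mind: cache each post individually under a key built from its id plus its last-modified time, run one cheap query for just `id, updated_at`, and only touch the database for rows whose key misses.  The names here (`fetchPostBody`, the array cache, the table layout) are purely hypothetical, not from any real forum codebase:

```php
<?php
// Per-row caching keyed by id + last-modified time. An edited row gets a new
// key, so stale entries are never served and don't need explicit invalidation.

$postCache = [];   // in production this might be APCu, memcached, or files
$dbHits = 0;       // counter just to show how many rows actually hit the DB

// Stand-in for "SELECT body FROM posts WHERE id = ?" plus rendering.
function fetchPostBody(int $id, int &$dbHits): string {
    $dbHits++;
    return "rendered body of post $id";
}

// $rows would come from a cheap query:
//   SELECT id, updated_at FROM posts WHERE topic_id = ?
function getPosts(array $rows, array &$cache, int &$dbHits): array {
    $out = [];
    foreach ($rows as $row) {
        $key = $row['id'] . ':' . $row['updated_at'];
        if (!isset($cache[$key])) {
            $cache[$key] = fetchPostBody($row['id'], $dbHits);
        }
        $out[$row['id']] = $cache[$key];
    }
    return $out;
}

$rows = [
    ['id' => 1, 'updated_at' => 100],
    ['id' => 2, 'updated_at' => 200],
];

getPosts($rows, $postCache, $dbHits);   // first view: both rows hit the DB
$rows[1]['updated_at'] = 201;           // post 2 was edited
getPosts($rows, $postCache, $dbHits);   // second view: only post 2 is re-fetched
echo $dbHits; // 3
```

The cheap id/timestamp query still runs on every page view, but the expensive body fetch and rendering only happen for rows that actually changed.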

I have browsed through the code of a few popular open-source PHP forums and haven't seen anything pertaining to caching, but their codebases are vast and I may have looked in the wrong place.  However, I would assume that if they don't implement any caching, then there is no practical way to do what I am asking.

So my question is: is there a practical way to cache highly dynamic forum data in the way I described (or in any other manner beneficial to the forum's performance)?

Thanks

There are literally hundreds of different ways to accomplish this.  Obviously, you can't update "half" of the info without checking whether the "rest" is old, hence the problem.  My first suggestion would be to consider whether you really need one-second granularity... if not, you can generate static pages on a regular basis, serve those, and regenerate them often, but not on demand.  Of course, the real question is whether you even need to bother: MySQL has a query cache that often reduces overhead quite substantially.
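A minimal sketch of that regenerate-on-a-timer idea, assuming a file-based cache; the file name and the build callback are hypothetical placeholders for however you render a topic page:

```php
<?php
// Serve a cached page if it is fresher than $ttl seconds, otherwise rebuild
// it from the database and rewrite the cache file.

function servePage(string $cacheFile, int $ttl, callable $build): string {
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);  // cache still fresh: no DB work
    }
    $html = $build();                     // rebuild from the database
    file_put_contents($cacheFile, $html); // cache it for the next $ttl seconds
    return $html;
}

$buildCount = 0;
$build = function () use (&$buildCount): string {
    $buildCount++;                        // count rebuilds for demonstration
    return "<html>topic page</html>";
};

$file = sys_get_temp_dir() . '/topic_42.html';
@unlink($file);                  // start clean for the demo
servePage($file, 5, $build);     // cache miss: page is built and written
servePage($file, 5, $build);     // cache hit: served from disk, no rebuild
echo $buildCount; // 1
```

With a ttl of even a few seconds, a busy topic page might rebuild once instead of hundreds of times, and nobody ever sees data more than a few seconds stale.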
