I am looking for suggestions/advice etc.

I currently have a custom blog script and some people have asked for the ability to subscribe to comments via RSS.

Now the logic behind this is what I'm stumbling with.

Would I offer a separate feed for each topic's comments? So I would have to set up a directory to store these comment RSS feeds, use the topic's ID as the feed's name, and just write to the file when I need to, like I do with my main RSS feed?
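For what it's worth, the file-per-topic idea might be sketched like this in PHP. Everything here is illustrative, not part of any actual script: the `feeds/comments` directory, the comment array shape, and the `writeCommentFeed()` name are all assumptions.

```php
<?php
// Illustrative sketch only: write one RSS file per topic, named by topic ID,
// as described above. Directory, field names and function name are made up.
function writeCommentFeed(int $topicId, array $comments, string $dir = 'feeds/comments'): string
{
    $items = '';
    foreach ($comments as $c) {
        $items .= "<item>\n"
                . '<title>' . htmlspecialchars($c['author']) . "</title>\n"
                . '<description>' . htmlspecialchars($c['body']) . "</description>\n"
                . '<pubDate>' . date(DATE_RSS, $c['posted']) . "</pubDate>\n"
                . "</item>\n";
    }
    $rss = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
         . "<rss version=\"2.0\"><channel>\n"
         . "<title>Comments for topic $topicId</title>\n"
         . $items
         . "</channel></rss>\n";
    $file = "$dir/$topicId.xml";     // e.g. feeds/comments/42.xml
    file_put_contents($file, $rss);  // rewrite the whole file whenever a comment is added
    return $file;
}
```

You would call this whenever a comment is posted, the same way the main feed gets rewritten.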

I don't use WordPress, so I am curious how WordPress would do it.

Am I following the right idea for going about this little task?

Yes, I am sure people will comment on why I am 'reinventing the wheel', but scripting my own blog has been a fun challenge and I love doing it. :)
https://forums.phpfreaks.com/topic/34914-blog-site-comments-via-rss-feed/

[quote author=SharkBait link=topic=123172.msg508771#msg508771 date=1169236784]
Would I offer a separate feed for each topic's comments? So I would have to set up a directory to store these comment RSS feeds, use the topic's ID as the feed's name, and just write to the file when I need to, like I do with my main RSS feed?
[/quote]

That seems fine to me.

You have accounted for 'caching' in the most simple yet effective way, although you may want to set up client/proxy caching as well. And yes, have separate files for each topic; since that is what the user subscribed to, you don't want unnecessary traffic.

You could actually send the feed dynamically, which would avoid the need for so many files. An RSS feed URL could then look something like this:

[quote]
http://www.mysite.com/feeds/myfeed.php?post_id=2
[/quote]

and myfeed.php actually sends the necessary XML headers, constructs the feed (or loads it from cache, if you build that in) and sends it out. I believe some feed readers are a little funny about non-RSS/XML extensions (irrespective of the headers sent), but nothing that a .htaccess file and a mod_rewrite rule couldn't sort out.
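A dynamic entry point along those lines might look something like the sketch below. It's an illustration under assumptions, not working code from anyone's blog: `renderCommentFeed()` and `loadComments()` are made-up names, and the database side is omitted.

```php
<?php
// Hypothetical myfeed.php: builds one topic's comment feed per request.
// loadComments() stands in for however the blog script queries its
// comments table (not shown here).
function renderCommentFeed(int $postId, array $comments): string
{
    $xml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
         . "<rss version=\"2.0\"><channel>\n"
         . "<title>Comments for post $postId</title>\n";
    foreach ($comments as $c) {
        $xml .= '<item><title>' . htmlspecialchars($c['author']) . '</title>'
              . '<description>' . htmlspecialchars($c['body']) . "</description></item>\n";
    }
    return $xml . "</channel></rss>\n";
}

if (PHP_SAPI !== 'cli') {
    $postId = (int) ($_GET['post_id'] ?? 0);
    header('Content-Type: application/rss+xml; charset=UTF-8'); // the XML header mentioned above
    echo renderCommentFeed($postId, loadComments($postId));     // loadComments(): your DB query
}
```

And for readers that are fussy about the extension, an illustrative mod_rewrite rule (paths are assumptions) could map a .xml URL onto the script: `RewriteRule ^feeds/comments/([0-9]+)\.xml$ feeds/myfeed.php?post_id=$1 [L]`.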
The number of files shouldn't be a consideration. When you do things right, you will never look at that directory; whether it has 10 or 1000 files in it, who cares? If you were to cache a different way, you'd have the same number of files. If you are unable to set up Apache to send the appropriate client/proxy caching headers, then yes, use a PHP file as an entry point.

For something like a feed, with the potential to generate many subsequential requests, I suggest you keep things as simple as possible.

In any other case I would recommend proper routing, but in this case I think you want to avoid as much logic as possible. Sure, you can use a PHP file as an entry point; for example, you could use it to delete 'dead' files and regenerate them upon request (if you're concerned about disk space).

What I am saying in both this and the previous reply: KIS (keep it simple), whether you use a PHP file as an entry point or not.

Avoid as much logic as you possibly can (that includes repeating logic across requests: caching).
I won't disagree with you on this one, 448191, as you're right too: using some sort of cache will inevitably create just as many files anyway. However, a few points (based on personal opinion and choice rather than solid facts):

- In the event of changing the format/structure of the feeds (which does happen), only the generator would need changing.
- Assuming your normal stance on frameworks, reusability, etc., a script that generates feeds on the fly (albeit with a caching mechanism) is going to be better and more reusable than an app-specific RSS feed that generates different files based specifically on comments.
- My personal preference: ANYTHING that needs changing regularly throughout a site is best kept in a database, or at least generated from one. The reason is that all I need to do is back up the database, not the database AND files (OK, so there's an element of laziness in there too ;) ). Images and similar content are kind of unavoidable unless you store them as BLOBs, but I really have issues with changing the actual files of my sites if I can help it.

my £0.02 plus a Mars Bar.
[quote author=redbullmarky link=topic=123172.msg509250#msg509250 date=1169322842]
- assuming your normal stance on frameworks, reusability, etc, a script that generates feeds on the fly (albeit with a caching mechanism) is going to be better and more re-usable than an app specific RSS feed that generate different files based specifically on comments.
[/quote]

You'll always need code to create the RSS files. Yes, that should be reusable code. I don't see the need to store in a database though.

Look at the RSS files as a cache: do you back up your cache? Do you store your cache in a DB? Of course not. Regenerating the files when you move the site could be part of the normal automated setup process. Any change in the classes that render the RSS would take effect as soon as the cache expires, just like with a traditional cache.
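That regenerate-on-expiry idea could be sketched like so. The function name, the directory layout and the five-minute TTL are arbitrary examples, and `$buildFeed` stands in for whatever code actually renders the XML.

```php
<?php
// Sketch of "RSS files as cache": serve the stored file if it is still
// fresh, otherwise rebuild it first. A missing ('dead') file is simply
// regenerated on the next request.
function serveCachedFeed(int $topicId, callable $buildFeed, string $dir, int $ttl = 300): string
{
    $file = "$dir/$topicId.xml";
    if (!file_exists($file) || time() - filemtime($file) > $ttl) {
        file_put_contents($file, $buildFeed($topicId)); // regenerate expired or deleted file
    }
    return file_get_contents($file);
}
```

A second request inside the TTL is then a plain file read, with no feed-building logic repeated.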

When you have to apply restrictions on access, or maintain some state across requests, that's a whole different story. This case is also much different from decentralized [b]logic[/b]. Unless you use PHP as an access point, in which case I agree: integrate into your current design, use a traditional caching system and your normal system of routing requests. Otherwise you'll get a decentralized design and face issues with code manageability.

Using the actual RSS files as the entry point won't compromise code reusability. In fact, if you move away from this approach, the generating and file-writing classes could be reused in a PHP access-point scenario (if properly designed, with reusability in mind).

One thing I am a bit concerned about in SharkBait's approach is file access.

Another is 'features' vs 'simplicity'.

Anyway, maybe I'm a bit over the top on the whole 'avoid as much logic as possible'. I guess it's only really important on feeds that are in high demand - on a cheap server.. :P

An alternative solution (I'm working on something like that right now) could be to use a standard XSL file to transform your regular topic documents into RSS on the client. A linked JavaScript file should be able to transform its parent document, which could be triggered by the hash section, i.e.:

blog.com/topic/5421/#RSS

But creating RSS on the server is easier to manage of course.

Damn, I'm straying... :P