BSlepkov Posted September 28, 2008
Assuming I've already blocked undesirable bots, I now want to distinguish 'friendly' crawler or spider activity from a normal surfer, so as not to offer links for re-sorting lists - i.e. http://mywebsite.com/dir1/content_list.php?sort=date&order=asc
Such a script would set a flag variable when client 123.45.678.90 accesses X number of pages within X number of seconds. Could this be done without having to use sessions or cookies?
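A minimal sketch of the kind of flag the poster describes, assuming a simple server-side log file keyed on the client IP so that neither sessions nor cookies are required. The log path, thresholds, and the `is_probably_crawler()` helper name are all made-up examples, not anything from the thread:

```php
<?php
// Sketch: flag likely crawlers by request rate, keyed on the client IP and
// stored in a small server-side JSON file (no sessions, no cookies).
define('HIT_LOG', '/tmp/hit_log.json');  // hypothetical log location
define('MAX_HITS', 10);                  // X pages ...
define('WINDOW_SECONDS', 5);             // ... within X seconds

function is_probably_crawler()
{
    $ip  = $_SERVER['REMOTE_ADDR'];
    $now = time();

    // Load the existing hit log, if any.
    $log = array();
    if (is_readable(HIT_LOG)) {
        $decoded = json_decode(file_get_contents(HIT_LOG), true);
        if (is_array($decoded)) {
            $log = $decoded;
        }
    }

    // Keep only this IP's hits that are still inside the time window,
    // then record the current request.
    $recent = array();
    if (isset($log[$ip])) {
        foreach ($log[$ip] as $t) {
            if (($now - $t) <= WINDOW_SECONDS) {
                $recent[] = $t;
            }
        }
    }
    $recent[] = $now;
    $log[$ip] = $recent;

    file_put_contents(HIT_LOG, json_encode($log), LOCK_EX);

    // More than MAX_HITS pages inside the window looks like a spider.
    return count($recent) > MAX_HITS;
}
```

On a busy site a flat file like this would grow and contend for locks; a database table keyed on IP with a timestamp column would be a more robust home for the same idea.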
wildteen88 Posted September 28, 2008
Wouldn't you be better off looking into mod_rewrite so you can use SEO-friendly URLs, e.g. http://mywebsite.com/dir1/content_list/date/asc
That way your URLs are friendly for both your visitors and bots.
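A rough idea of the rewrite rule this reply has in mind, assuming Apache with mod_rewrite enabled and the directory layout from the example URLs above (the pattern and flags are illustrative only):

```apache
# .htaccess sketch: map the friendly URL back onto the real script.
RewriteEngine On

# /dir1/content_list/date/asc -> /dir1/content_list.php?sort=date&order=asc
RewriteRule ^dir1/content_list/([a-z_]+)/(asc|desc)/?$ dir1/content_list.php?sort=$1&order=$2 [L,QSA]
```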
BSlepkov Posted September 29, 2008
Sorry wildteen, I assumed I had made it clear from my opening line that I already block undesirables; I just didn't clarify how ... using mod_rewrite. The example URL I gave is a mock legitimate link that, for anyone other than spiders, I could just present as regular text.
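The "regular text for spiders" idea might look something like the following, assuming a boolean check such as the hypothetical `is_probably_crawler()` sketched earlier in the thread:

```php
<?php
// Sketch: humans get the re-sort link, suspected spiders get plain text.
$label = 'Sort by date (ascending)';

if (is_probably_crawler()) {
    echo $label;  // no link for crawlers, so they never see the sort URL
} else {
    echo '<a href="/dir1/content_list.php?sort=date&order=asc">' . $label . '</a>';
}
```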