
Preventing Search Engine Bots from triggering a specific php script


XGCommander

Recommended Posts

I just finished a PHP caching mechanism for my website, and the last hurdle I need to overcome is bots like Google triggering a cache write on every page. How do I go about detecting whether a specific visitor is a bot so that I can prevent the caching script from running for them?

If you only want to target portions of PHP that are included within larger pages you still want crawled, the only thing I can think of is getting the array from get_browser() and checking the browser value against a list of known bot user-agent strings, like this one: http://www.useragentstring.com/pages/useragentstring.php
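As a rough sketch of that approach: rather than get_browser() (which needs a browscap.ini configured), you can check `$_SERVER['HTTP_USER_AGENT']` directly against a few common bot signatures. The `is_crawler()` helper and the signature list below are illustrative, not exhaustive, and the caching call is a hypothetical placeholder:

```php
<?php
// Sketch: skip the cache-writing step for known crawlers by matching the
// User-Agent header against common bot substrings (illustrative list only).
function is_crawler(?string $userAgent): bool
{
    if ($userAgent === null || $userAgent === '') {
        return false;
    }
    $botSignatures = [
        'googlebot', 'bingbot', 'slurp', 'duckduckbot',
        'baiduspider', 'yandexbot',
    ];
    $userAgent = strtolower($userAgent);
    foreach ($botSignatures as $signature) {
        if (strpos($userAgent, $signature) !== false) {
            return true;
        }
    }
    return false;
}

// Usage: only trigger the cache for ordinary visitors.
if (!is_crawler($_SERVER['HTTP_USER_AGENT'] ?? null)) {
    // write_page_to_cache(); // hypothetical caching call
}
```

Crawled pages are still served normally either way; the check only gates the cache write.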


But that's likely to be unreliable, since user-agent strings are easily spoofed and new bots appear constantly. If you have PHP files that execute something you don't want a crawler to trigger, you're best off disallowing the entire page in robots.txt.
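For example, a robots.txt rule like the following asks compliant crawlers to stay away from a given path (the `/cache-trigger/` path here is hypothetical):

```
User-agent: *
Disallow: /cache-trigger/
```

Keep in mind robots.txt is advisory only: well-behaved crawlers like Googlebot honor it, but malicious bots can ignore it, so it's a complement to, not a replacement for, server-side checks.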

Archived

This topic is now archived and is closed to further replies.

