XGCommander Posted April 26, 2010

I just finished a PHP caching mechanism for my website, and the last hurdle I need to overcome is bots like Google triggering a cache write on every page. How do I go about detecting whether a specific visitor is a bot, so that I can prevent the caching script from running for them?
de.monkeyz Posted April 26, 2010

You'll want to have a look at robots.txt - http://en.wikipedia.org/wiki/Robots_exclusion_standard
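For example, a robots.txt along these lines would keep well-behaved crawlers away from a given script entirely (the path here is just a placeholder, not anything from the original site):

```
# Ask compliant crawlers not to fetch the cache-triggering script.
# /cache.php is a hypothetical example path.
User-agent: *
Disallow: /cache.php
```

Note that robots.txt is advisory: it only works for crawlers that choose to honour it.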
XGCommander Posted April 26, 2010 Author

I don't see how robots.txt can prevent a certain PHP file from being included, or a specific part of a file from being executed.
de.monkeyz Posted April 26, 2010

It can't, if you only want to skip portions of PHP inside larger pages that you still want crawled. In that case, the only thing I can think of is getting the array from get_browser() and checking the browser value against a list of known crawler user-agent strings, something like this: http://www.useragentstring.com/pages/useragentstring.php But that's likely to be unreliable, since user-agent strings can be faked. If you have PHP files that execute something you don't want a crawler to trigger at all, you're best off disallowing the entire page via robots.txt.
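A minimal sketch of the user-agent approach described above. The list of signatures is illustrative rather than exhaustive, `write_page_to_cache()` is a hypothetical stand-in for the poster's own caching function, and, as noted, user agents can be spoofed, so treat this as a best-effort heuristic:

```php
<?php
// Best-effort bot detection by User-Agent substring match.
// The signature list is illustrative, not exhaustive.
function is_probable_bot()
{
    $ua = isset($_SERVER['HTTP_USER_AGENT'])
        ? strtolower($_SERVER['HTTP_USER_AGENT'])
        : '';

    // Real browsers virtually always send a User-Agent header.
    if ($ua === '') {
        return true;
    }

    $signatures = array(
        'googlebot', 'bingbot', 'slurp', 'baiduspider',
        'yandex', 'crawler', 'spider', 'bot',
    );

    foreach ($signatures as $sig) {
        if (strpos($ua, $sig) !== false) {
            return true;
        }
    }
    return false;
}

// Serve the page to everyone, but skip the cache write for probable bots.
if (!is_probable_bot()) {
    write_page_to_cache(); // hypothetical caching function
}
```

The page itself still renders for crawlers; only the caching step is bypassed, which is what the original question asked for.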
This topic is now archived and is closed to further replies.