ffdave77 Posted May 15, 2006

I have several PHP-scripted pages with high security requirements (all of them require a username/password). The script checks for a cookie; if the cookie is not present or has expired, it redirects the browser to the login page. My question is: how do I protect those pages from being indexed by web crawlers and search engines? All the sensitive pages live in the same directory.
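For reference, the check described above follows roughly this pattern (a minimal sketch; the cookie name and login URL here are placeholders, not the actual values used):

[code]<?php
// Minimal sketch of the check described above.
// 'auth_token' and '/login.php' are placeholder names, not the real ones.
if (!isset($_COOKIE['auth_token'])) {
    // No cookie sent (missing or expired), so send the browser to the login page.
    header('Location: /login.php');
    exit;
}

// ...protected page content continues here...
?>[/code]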
.josh Posted May 15, 2006

Create a text file called robots.txt and put it in your /public_html/ directory. Put this in it:

[code]User-agent: *
Disallow: /[/code]

Edit: this is a quick, all-encompassing rule; robots will not be allowed to crawl any of the directories on your site. You can be more specific and only disallow certain directories or files, as well (see the example below).
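For example, to block crawlers only from the directory that holds the protected pages (here "/members/" is just a placeholder for whatever the directory is actually called), the file could instead look like this:

[code]# Ask all crawlers to skip the protected directory only
User-agent: *
Disallow: /members/[/code]

Keep in mind that robots.txt is only a request that well-behaved crawlers honor; the login check in the script itself is what actually keeps the pages private.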