
#1 ffdave77

  • New Members
  • Pip
  • Newbie
  • 4 posts

Posted 15 May 2006 - 12:53 AM

I have several PHP-scripted pages that are highly security-sensitive (all of them require a username/password). The script basically checks for a cookie; if it's not present or has expired, it redirects your browser to the login page.
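The cookie check described above can be sketched roughly like this. This is only a minimal illustration, not the poster's actual code: the cookie name auth_token and the validate_token() helper are hypothetical placeholders.

```php
<?php
// Minimal sketch of a cookie-based login gate (assumed names, not from the post).

// Placeholder: a real check would verify the token server-side,
// e.g. by looking it up in a session table.
function validate_token(string $token): bool
{
    return $token !== '';
}

// Returns true if the request carries a usable auth cookie.
function is_authenticated(array $cookies): bool
{
    if (!isset($cookies['auth_token'])) {
        return false; // no cookie at all
    }
    return validate_token($cookies['auth_token']);
}

// Call this at the top of every protected page.
function require_login(): void
{
    if (!is_authenticated($_COOKIE)) {
        header('Location: /login.php'); // redirect to the login page
        exit;
    }
}
?>
```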

My question is: how do you protect those pages from being indexed by web crawlers and search engines? All the sensitive pages live in the same directory.

#2 .josh

  • Staff Alumni
  • .josh
  • 14,871 posts

Posted 15 May 2006 - 01:05 AM

Create a text file called robots.txt and put it in your /public_html/

Put this in there:
User-agent: *
Disallow: /
edit: this is a quick, all-encompassing rule ^. Robots will not be allowed to crawl any of the directories on your site. You can be more specific and only disallow certain directories or files, as well.
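For example, to block only the directory holding the sensitive pages instead of the whole site, a more targeted robots.txt could look like the following (the /members/ directory name is just a placeholder for wherever your protected pages live):

```
User-agent: *
Disallow: /members/
```

One caveat worth knowing: robots.txt is only a request that well-behaved crawlers honor; it is not access control. Your cookie/login check is what actually keeps the pages private.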
