
#1 ffdave77

  • New Members
  • Pip
  • Newbie
  • 4 posts

Posted 02 June 2006 - 04:29 AM

I am interested in knowing how to securely block search bots and other web bots from reaching sensitive areas of my web site. I know that if you put an .htaccess file in a directory, no web bot can touch that area. How do I implement a similar restriction if I don't want to use .htaccess and have my own login script?

All my pages have a script in the first lines of the document that checks login status from a cookie; if you are not logged in, you are redirected (through the header) to a login page.

Any suggestions on this security issue? Do I need to clarify anything?
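The "check login at the top of every page" pattern described above can be sketched roughly as follows. This is a hypothetical illustration (written in Python rather than the PHP implied by the thread); the cookie name, session store, and login URL are all made up, and a real site should validate a signed session token rather than trusting the raw cookie value.

```python
# Sketch of a per-page login gate: read the session cookie, and redirect
# to the login page when there is no valid logged-in session.

from http.cookies import SimpleCookie

# Hypothetical server-side session store: cookie value -> username.
SESSIONS = {"abc123": "dave"}

def require_login(cookie_header):
    """Return ("302 Found", "/login.php") when the request carries no
    valid session cookie, or None to allow the page to be served."""
    cookies = SimpleCookie(cookie_header or "")
    token = cookies["session"].value if "session" in cookies else None
    if token not in SESSIONS:
        # Same effect as sending a Location header and exiting,
        # as the original poster describes doing on every page.
        return ("302 Found", "/login.php")
    return None  # logged in -- serve the protected content

print(require_login(""))                # anonymous visitor or bot
print(require_login("session=abc123"))  # logged-in user
```

Because search bots do not carry your login cookie, a check like this already keeps them out of the protected pages — they just receive the redirect to the login page.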

#2 poirot

  • Members
  • PipPipPip
  • Advanced Member
  • 646 posts
  • Location: Austin, TX

Posted 02 June 2006 - 04:33 AM

You may want to use robots.txt:
http://www.robotstxt.org/wc/robots.html

Or the meta robots tag:
http://www.robotstxt.org/wc/meta-user.html
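For reference, both mechanisms look something like this (the paths are hypothetical examples). Note that these are advisory only: well-behaved crawlers obey them, but they are not a security boundary, so they complement rather than replace the login check described in the question.

```
# robots.txt -- served from the site root, e.g. http://example.com/robots.txt
User-agent: *
Disallow: /members/
Disallow: /admin/
```

The per-page alternative goes in the HTML head of each page you want kept out of search indexes:

```
<meta name="robots" content="noindex, nofollow">
```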
~ D Kuang
