
Security


ffdave77


I have several PHP-scripted pages that are highly security-sensitive (all of them require a username/password). The script checks for a cookie; if the cookie is not present or has expired, it redirects your browser to the login page.
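
The check on each page looks something along these lines (the cookie name and login page here are just placeholders, not the actual script):
[code]
<?php
// If the auth cookie is missing, send the visitor to the login page.
// An expired cookie is simply not sent by the browser, so isset()
// covers both the "not present" and "expired" cases.
if (!isset($_COOKIE['auth']) || $_COOKIE['auth'] === '') {
    header('Location: login.php');
    exit;
}
// ...protected page content continues below...
?>
[/code]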

My question is: how do you keep those pages from being crawled and indexed by web crawlers and search engines? All the sensitive pages live in the same directory.

create a text file called robots.txt and put it in your /public_html/

put this in there:
[code]
User-agent: *
Disallow: /
[/code]
edit: this is a quick, all-encompassing rule ^ . it tells compliant robots not to crawl any of the directories on your site. you can be more specific and only disallow certain directories or files, as well.
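
for example, to keep crawlers out of just the one directory holding your sensitive pages (the directory and file names here are just placeholders):
[code]
User-agent: *
Disallow: /members/
Disallow: /report.php
[/code]
keep in mind that robots.txt is only a request that well-behaved crawlers honor, and the file itself is publicly readable, so your login check is still what actually protects those pages.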

