sawade — Posted October 5, 2009

I am receiving three error messages repeatedly and am not sure how to fix them. Thank you for the help.

Errors:

[Mon Oct 05 14:17:07 2009] [notice] cannot use a full URL in a 401 ErrorDocument directive --- ignoring!
[Mon Oct 05 14:17:07 2009] [warn] RewriteCond: NoCase option for non-regex pattern '-f' is not supported and will be ignored.
[Mon Oct 05 14:22:16 2009] [error] [client 220.181.94.235] File does not exist: /usr/local/apache/htdocs/bbs
Can't use string ("2.3") as a HASH ref while "strict refs" in use at /usr/local/apache/htdocs/404.cgi line 162.

.htaccess file:

Options +FollowSymLinks

# DISALLOW includes to execute code
Options +includesNOEXEC

# DISALLOW peek into directories without an index file
Options -Indexes

RewriteEngine On
RewriteBase /

AddHandler application/x-httpd-php5s .php

# ERROR messages
ErrorDocument 400 /400.shtml
ErrorDocument 401 /401.shtml
ErrorDocument 403 /403.shtml
ErrorDocument 404 /404.shtml
ErrorDocument 500 /500.shtml

# REDIRECT if not www to www.domain.com
RewriteCond %{HTTP_HOST} ^domain\.com$
RewriteRule ^/?$ http://www.domain.com/ [R=301,L]

# REDIRECT w/ folder secureforms/forms/ to SSL
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} (/secureforms/forms//?)|(/secureforms/forms//.*)$
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,QSA,R=301]

# IF HTTPS is ON - REDIRECT away from HTTPS
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !(/secureforms/forms//?)|(/secureforms/forms//.*)$
RewriteRule .* http://%{HTTP_HOST}%{REQUEST_URI} [L,QSA,R=301]

# REDIRECT access from secure folder to 403 Forbidden
Redirect /home6/medsolut/php/secure http://www.domain.com/403.shtml

<Files .htaccess>
deny from all
</Files>

# BLOCK robots
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]
sawade (Author) — Posted October 14, 2009

I guess I asked a good question.
corbin — Posted October 14, 2009

Pretty basic errors, really. All quite self-explanatory.

"[Mon Oct 05 14:17:07 2009] [notice] cannot use a full URL in a 401 ErrorDocument directive --- ignoring!"

Errr.... you can't use a full URL in a 401 ErrorDocument directive. Oddly, the ErrorDocument statement for 401 in the file you posted is not using a full URL. Do you have any other .htaccess files anywhere that use a full URL?

"[Mon Oct 05 14:17:07 2009] [warn] RewriteCond: NoCase option for non-regex pattern '-f' is not supported and will be ignored."

That also looks like it comes from an .htaccess file. Basically it means that

RewriteCond <something> -f [NC]

is not valid. Since file paths are case sensitive, there could be numerous ways Apache might handle that.

"[Mon Oct 05 14:22:16 2009] [error] [client 220.181.94.235] File does not exist: /usr/local/apache/htdocs/bbs"

Users keep requesting /bbs/, which doesn't exist. Why they're requesting it, I have no idea. Perhaps you have a broken link somewhere.

"Can't use string ("2.3") as a HASH ref while "strict refs" in use at /usr/local/apache/htdocs/404.cgi line 162."

No idea what that means. We would need to see 404.cgi.
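[Editor's note] For reference, a minimal sketch of the 401 behaviour being described. The host name below is a placeholder, not taken from the poster's site:

# A full URL here makes Apache answer with an external redirect instead of a 401,
# so the client never receives the WWW-Authenticate challenge; Apache logs the
# notice above and ignores the directive:
#   ErrorDocument 401 http://www.example.com/401.shtml
# A local path, as already used in the posted .htaccess, is the accepted form:
ErrorDocument 401 /401.shtml

Since the posted file already uses the local form, the notice is probably coming from a 401 ErrorDocument in some other .htaccess file or in the main server configuration.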
sawade (Author) — Posted October 19, 2009

"Do you have any other .htaccess files anywhere that use a full URL?"

No. This is my only .htaccess file.

"Basically it means that RewriteCond <something> -f [NC] is not valid."

Changed to this...

RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^/?$ http://www.domain.com/ [R=301,L]

# REDIRECT w/ folder secureforms/forms/ to SSL
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} (/secureforms/forms//?)|(/secureforms/forms//.*)$ [NC]
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,QSA,R=301]

# IF HTTPS is ON - REDIRECT away from HTTPS
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !(/secureforms/forms//?)|(/secureforms/forms//.*)$ [NC]
RewriteRule .* http://%{HTTP_HOST}%{REQUEST_URI} [L,QSA,R=301]

But I am still getting the error message.

"No idea what that means. We would need to see 404.cgi."

We do not use CGI.

Could it be the hosting server's configuration conflicting with how we have our site set up? In which case, there would be no way to fix the error messages, correct?

Thanks.
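[Editor's note] The '-f' warning is triggered by the NoCase flag on a file-existence test, not on a regex, and no such test appears in the .htaccess posted above. If one exists, it is probably in another configuration file, for example one maintained by the host. A minimal sketch, using an illustrative rule rather than anything from the poster's site:

# [NC] only applies to regex patterns, so on a -f (file exists) test it is
# ignored and a warning is logged:
#   RewriteCond %{REQUEST_FILENAME} -f [NC]
# The same test without the flag is silent:
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^ - [L]

Adding [NC] to the host and URI conditions, as in the changed rules above, is harmless because those patterns are regexes, but it cannot remove a warning caused by a -f test somewhere else.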