Everything posted by sawade

  1. I changed the way my form tells the user of an error, but I can't seem to get it to work properly. Any help would be great. Thanks.

     // IF NO errors, process the form
     if ($error == "") {
         // ... code to email form ...
     } else {
         // List errors
         echo '<p class="error">'$error'</p>';
         $output_form = true;
     }

     Error messages look like this...

     if (empty($first_name) && empty($last_name)) { // IF first and last are empty - REQUIRED FIELD
         $error .= 'Please input your FIRST NAME and LAST NAME.<br />';
         $output_form = true;
     }
     if (empty($first_name)) { // IF first only is empty - REQUIRED FIELD
         $error .= 'Please input your FIRST NAME.<br />';
         $output_form = true;
     }
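     For reference, this is the general shape I am aiming for (just a rough sketch; the field names match my form, everything else is illustrative):

     <?php
     $error = '';            // collect all messages in one string
     $output_form = false;

     if (empty($first_name) && empty($last_name)) {
         $error .= 'Please input your FIRST NAME and LAST NAME.<br />';
         $output_form = true;
     } elseif (empty($first_name)) {
         $error .= 'Please input your FIRST NAME.<br />';
         $output_form = true;
     }

     if ($error == '') {
         // ... code to email form ...
     } else {
         // concatenate $error into the error paragraph
         echo '<p class="error">' . $error . '</p>';
         $output_form = true;
     }
     ?>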
  2. So your code looks like this?

     $to  = $email . ', ';
     $to .= FORM_MAILER;

     $headers["From"]    = MY EMAIL;
     $headers["Subject"] = "LALALA $date";

     $smtp["host"]     = SMTP_HOST;
     $smtp["port"]     = SMTP_PORT;
     $smtp["auth"]     = true;
     $smtp["username"] = SMTP_USERNAME;
     $smtp["password"] = SMTP_PASSWORD;

     $msg = "$date\n";
     $msg = wordwrap($msg, 70);

     $mail = Mail::factory('smtp', $smtp);
     $mail->send($to, $headers, $msg) or die('Error accessing SMTP server.');
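     If it does, one thing worth noting: send() returns a PEAR_Error object on failure rather than false, and an object is truthy, so the "or die()" never fires. A rough sketch of how I check it instead (only the error handling here is new):

     require_once 'Mail.php';

     $mail   = Mail::factory('smtp', $smtp);
     $result = $mail->send($to, $headers, $msg);

     if (PEAR::isError($result)) {
         // test the return value explicitly instead of relying on "or die()"
         echo 'Error accessing SMTP server: ' . $result->getMessage();
     }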
  3. I am receiving three error messages repeatedly, and I am not understanding how to fix them. Thank you for the help.

     Errors:

     [Mon Oct 05 14:17:07 2009] [notice] cannot use a full URL in a 401 ErrorDocument directive --- ignoring!
     [Mon Oct 05 14:17:07 2009] [warn] RewriteCond: NoCase option for non-regex pattern '-f' is not supported and will be ignored.
     [Mon Oct 05 14:22:16 2009] [error] [client 220.181.94.235] File does not exist: /usr/local/apache/htdocs/bbs
     Can't use string ("2.3") as a HASH ref while "strict refs" in use at /usr/local/apache/htdocs/404.cgi line 162.

     htaccess file:

     Options +FollowSymLinks
     # DISALLOW includes to execute code
     Options +includesNOEXEC
     # DISALLOW peek into directories without an index file
     Options -Indexes

     RewriteEngine On
     RewriteBase /

     AddHandler application/x-httpd-php5s .php

     # ERROR messages
     ErrorDocument 400 /400.shtml
     ErrorDocument 401 /401.shtml
     ErrorDocument 403 /403.shtml
     ErrorDocument 404 /404.shtml
     ErrorDocument 500 /500.shtml

     # REDIRECT if not www to www.domain.com
     RewriteCond %{HTTP_HOST} ^domain\.com$
     RewriteRule ^/?$ http://www.domain.com/ [R=301,L]

     # REDIRECT w/ folder secureforms/forms/ to SSL
     RewriteCond %{HTTPS} off
     RewriteCond %{REQUEST_URI} (/secureforms/forms//?)|(/secureforms/forms//.*)$
     RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,QSA,R=301]

     # IF HTTPS is ON - REDIRECT away from HTTPS
     RewriteCond %{HTTPS} on
     RewriteCond %{REQUEST_URI} !(/secureforms/forms//?)|(/secureforms/forms//.*)$
     RewriteRule .* http://%{HTTP_HOST}%{REQUEST_URI} [L,QSA,R=301]

     # REDIRECT access from secure folder to 403 Forbidden
     Redirect /home6/medsolut/php/secure http://www.domain.com/403.shtml

     <Files .htaccess>
     deny from all
     </Files>

     # BLOCK robots
     RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
     RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
     RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
     RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
     RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
     RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
     RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
     RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
     RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
     RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
     RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
     RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
     RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
     RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
     RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
     RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
     RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
     RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
     RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
     RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
     RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
     RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
     RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
     RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
     RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
     RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
     RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
     RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
     RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
     RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
     RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
     RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
     RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
     RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
     RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
     RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
     RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
     RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
     RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
     RewriteCond %{HTTP_USER_AGENT} ^Zeus
     RewriteRule ^.* - [F,L]
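     For what it's worth, the first two messages usually come from directives shaped like the ones below, possibly in the server's main config or another .htaccess rather than the file above; this is only a guess, not confirmed for my setup.

     # A 401 ErrorDocument must point at a local path; a full URL triggers the notice.
     # Trigger:  ErrorDocument 401 http://www.domain.com/401.shtml
     # Fine:
     ErrorDocument 401 /401.shtml

     # [NC] on a non-regex file test is ignored and triggers the warning.
     # Trigger:  RewriteCond %{REQUEST_FILENAME} !-f [NC]
     # Fine:
     RewriteCond %{REQUEST_FILENAME} !-f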
  4. For the undefined index notices... continue with what Alex said previously: include all of the variables under the isset() check. That should take care of those for you.
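     Something along these lines (the field names are just examples):

     if (isset($_POST['submit'])) {
         // give each field a default so no "undefined index" notice is thrown
         $first_name = isset($_POST['first_name']) ? trim($_POST['first_name']) : '';
         $last_name  = isset($_POST['last_name'])  ? trim($_POST['last_name'])  : '';
     }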
  5. LOL You know, it never ceases to amaze me how something so small can totally mess up the whole thing. Chalk it up as a Homer Simpson... D'ohhh!!! After hours of brooding... here was the issue:

     if (empty($first_name) && empty($last_name)) { // IF first and last are empty - REQUIRED FIELD
         $error .= 'Please input your FIRST NAME and LAST NAME.<br />';
         $output_form = true;
     }
     if (empty($first_name)) { // IF first only is empty - REQUIRED FIELD
         $error .= 'Please input your FIRST NAME.<br />';
         $output_form = true;
     } else {
         if (!empty($first_name) && strlen($first_name) < 2 || strlen($first_name) > 30) { // IF first is not empty and is not between 2 and 30 characters
             $error .= 'FIRST NAME must be between 2 and 30 characters.<br />';
             $output_form = true;
         }
     } else {
         if (!empty($first_name) && !ctype_alpha($first_name)) { // IF first is not empty and contains illegal characters
             $error .= 'FIRST NAME contains illegal characters.<br />';
             $output_form = true;
         }
     }

     I was thinking it was the ctype, but really it was the if/else statements: there is a second else hanging off the same if, which is a parse error. They should be as follows...

     if (empty($first_name) && empty($last_name)) { // IF first and last are empty - REQUIRED FIELD
         $error .= 'Please input your FIRST NAME and LAST NAME.<br />';
         $output_form = true;
     }
     if (empty($first_name)) { // IF first only is empty - REQUIRED FIELD
         $error .= 'Please input your FIRST NAME.<br />';
         $output_form = true;
     }
     if (!empty($first_name) && strlen($first_name) < 2 || strlen($first_name) > 30) { // IF first is not empty and is not between 2 and 30 characters
         $error .= 'FIRST NAME must be between 2 and 30 characters.<br />';
         $output_form = true;
     }
     if (!empty($first_name) && !ctype_alpha($first_name)) { // IF first is not empty and contains illegal characters
         $error .= 'FIRST NAME contains illegal characters.<br />';
         $output_form = true;
     }

     Well, that's what I get for trying to get it scripted out too quickly. Better to take your time and do it right the first time. LOL Thanks all for the assistance.
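     One more small thing I will probably tidy up while I am in there: the length check mixes && and ||, and && binds tighter than ||, so I would add parentheses to make it read the way the comment says (a sketch, not tested):

     if (!empty($first_name) && (strlen($first_name) < 2 || strlen($first_name) > 30)) {
         // IF first is not empty and is not between 2 and 30 characters
         $error .= 'FIRST NAME must be between 2 and 30 characters.<br />';
         $output_form = true;
     }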
  6. Yes.

     if (!empty($first_name) && !ctype_alpha($first_name)) {

     Any use of the ctype_*() functions creates the 500 error, but it does not send an error to the PHP log or any of my other error logs. All I get is the internal server error. If I remove the ctype, the script runs like clockwork. I have sent an email to my host, thinking that maybe, with their master config, they do not allow this function. Should this prove to be true... any ideas of an alternate function to use?
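     In the meantime, a possible stand-in I am considering, using preg_match() instead of ctype_alpha() (just a sketch; the pattern allows only A-Z letters, so it would also flag hyphens and accented names):

     if (!empty($first_name) && !preg_match('/^[A-Za-z]+$/', $first_name)) { // letters only
         $error .= 'FIRST NAME contains illegal characters.<br />';
         $output_form = true;
     }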
  7. Those lines are only in the file during testing and then are removed. I have my php.ini file set up to log errors, not display them, and my error logs are completely empty. Nothing is being written to them.
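     For reference, the relevant php.ini settings (the log path here is just an example, not my real one):

     log_errors = On
     display_errors = Off
     error_log = /path/to/php_errors.log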
  8. I am doing just that. I get the error as soon as the code for the ctype_*() functions is input. My server is running the most up-to-date PHP version... why could this be happening?? Thanks.
  9. Gerry, for my advice: mail() can get a little confusing... especially when you get into trying to send attachments. If your server allows PHP, they shouldn't have a problem with you installing some PEAR packages. I would recommend using the Mail factory that PEAR has. EXCELLENT! I love it, especially since I can define my SMTP settings and get attachments done much more easily. For me, PEAR has been way less stressful than mail(). Good luck.
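     Roughly how I put an attachment together with it, using the Mail_Mime package alongside Mail (the addresses, subject, and file path are just placeholders):

     require_once 'Mail.php';
     require_once 'Mail/mime.php';

     $mime = new Mail_mime();
     $mime->setTXTBody("Form results are attached.");             // plain-text body
     $mime->addAttachment('/tmp/results.pdf', 'application/pdf'); // example file

     $body    = $mime->get();                                     // build the MIME body first
     $headers = $mime->headers(array(
         'From'    => 'me@example.com',
         'Subject' => 'Form results',
     ));

     $mail   = Mail::factory('smtp', $smtp);                      // $smtp as set up elsewhere
     $result = $mail->send('you@example.com', $headers, $body);
     if (PEAR::isError($result)) {
         echo 'Error sending mail: ' . $result->getMessage();
     }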
  10. You cannot nest a block element inside an inline element. Think about it like reading a book: you don't put a paragraph inside a sentence, you put sentences inside a paragraph, with sentences representing inline elements. With that... table is block, a is inline, so the link goes inside the cell:

      <table>
        <tr>
          <td><a href="">text</a></td>
        </tr>
      </table>

      You may want to move your styles into CSS, either external or internal, whatever your preference.
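      For example, a stylesheet linked from the head, with a simple rule for those links (the selector and colors are just examples):

      <link rel="stylesheet" type="text/css" href="styles.css" />

      /* in styles.css */
      td a {
        color: #333333;
        text-decoration: none;
      }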
  11. Hello all. Well, I am stumped. Profoundly stumped. I took a PHP script I wrote previously and revamped it, using a few new functions that hadn't been used in the first script. I can view my first script fine on the internet; however, when I try to debug/test this new script my server gives me a 500 error. I have checked my php.ini and there is nothing in it that I can see that should be causing a conflict. So... I am asking for help. Below is the new code...

      <?php
      ini_set("display_errors", "1");
      error_reporting(E_ALL);
      //error_reporting(E_ALL ^ E_NOTICE);
      session_start();
      setlocale(LC_ALL, '');
      ?>

      HTML head tag... etc...

      if (isset($_POST['submit'])) {
          $error = ''; // initialize $error to blank
          ... lists additional variables

      Here is an example of the strings I added...

      if (empty($first_name) && empty($last_name)) { // IF first and last are empty - REQUIRED FIELD
          $error .= 'Please input your FIRST NAME and LAST NAME.<br />';
          $output_form = true;
      }
      if (empty($first_name)) { // IF first only is empty - REQUIRED FIELD
          $error .= 'Please input your FIRST NAME.<br />';
          $output_form = true;
      } else {
          if (!empty($first_name) && strlen($first_name) < 2 || strlen($first_name) > 30) { // IF first is not empty and is not between 2 and 30 characters
              $error .= 'FIRST NAME must be between 2 and 30 characters.<br />';
              $output_form = true;
          }
      } else {
          if (!empty($first_name) && !ctype_alpha($first_name)) { // IF first is not empty and contains illegal characters
              $error .= 'FIRST NAME contains illegal characters.<br />';
              $output_form = true;
          }
      }

      etc... several hundred lines of code in between...

      } else {
          $output_form = true;
      }

      if ($error == '') { // IF NO errors process form
          ... email address validator
          ... captcha validator
          ... if all clear runs the email script
          ... echo confirmation
      } else {
          echo '<p class="error">'$error'</p>'; // List errors
      }
      } else { // Email not valid
          $error .= 'Please input a VALID EMAIL ADDRESS.<br />';
          $output_form = true;
      }
      } else { // CAPTCHA not valid
          $error .= 'Please enter the VERIFICATION PASS-PHRASE exactly as shown.<br />';
          $output_form = true;
      }
      }

      if ($output_form == true) {
      ?>
      <div id="form">
      <form method="post" action="<?php echo htmlspecialchars($_SERVER['PHP_SELF']); ?>">
      <p id="required">*Indicates a required field</p>

      Additional HTML code for form.
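      Worth noting for anyone who hits the same thing: a parse error is thrown before any line of the script runs, so the ini_set("display_errors", "1") at the top can never surface it, which matches the blank 500. If shell access is available, linting the file from the command line will print the parse error (the filename here is just an example):

      php -l contactform.php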
  12. Yes. I always use User-agent: * and Disallow: / and I have no problems with being found in search engines.
  13. Here is a good example.

      User-agent: Googlebot
      Disallow: /
      User-agent: googlebot-image
      Disallow: /
      User-agent: googlebot-mobile
      Disallow: /
      User-agent: MSNBot
      Disallow: /
      User-agent: Slurp
      Disallow: /
      User-agent: Teoma
      Disallow: /
      User-agent: twiceler
      Disallow: /
      User-agent: Gigabot
      Disallow: /
      User-agent: Scrubby
      Disallow: /
      User-agent: Robozilla
      Disallow: /
      User-agent: Nutch
      Disallow: /
      User-agent: ia_archiver
      Disallow: /
      User-agent: baiduspider
      Disallow: /
      User-agent: naverbot
      Disallow: /
      User-agent: yeti
      Disallow: /
      User-agent: yahoo-mmcrawler
      Disallow: /
      User-agent: psbot
      Disallow: /
      User-agent: asterias
      Disallow: /
      User-agent: yahoo-blogs/v3.9
      Disallow: /
      User-agent: *
      Disallow: /
      Disallow: /cgi-bin/

      Using Disallow: / will keep Google and similar search engines away from the root entirely, this is true, so do use it with caution. If you are looking to limit where they look, it may be easier to list the folders and pages you want to disallow instead of what you want to allow, for example...

      User-agent: *
      Disallow: /folder
      Allow: /folder/page.html

      More info can be found here... http://www.robotstxt.org/
  14. To allow Google:

      User-agent: Google
      Disallow:
  15. Wrong place. LOL. Disregard previous post.
  16. To allow Google...

      User-agent: Google
      Disallow:
  17. I appreciate the help. You've been great.
  18. We will be creating something like this in the future. That is why we IP-log everything on the site.
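      Roughly the kind of logging I mean, per submission (the log path and format here are just examples):

      $ip    = $_SERVER['REMOTE_ADDR'];   // visitor's IP as the web server reports it
      $stamp = date('Y-m-d H:i:s');
      file_put_contents('/path/to/ip.log', "$stamp\t$ip\n", FILE_APPEND);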
  19. Okay. First, you do understand that robots.txt is a way to keep "good robots" from going through your files and finding sensitive things; it will not dissuade "bad robots". All it does is tell the bot where it can and cannot crawl. It is usually better to not allow crawling anywhere:

      User-agent: *
      Disallow: /

      This is a good thing. Now if you are worried about indexing issues... use the meta tags. Example, for the index page:

      <meta name="ROBOTS" content="INDEX, NOFOLLOW" />

      For the rest of the pages, as you see fit:

      <meta name="ROBOTS" content="NOINDEX, NOFOLLOW" />

      These obviously go in the <head> tags of your web page. Does this help?
  20. My thing is: why is it only attacking this form? All of the forms use this validation.
  21. So are you trying to allow index.php or not? I am confused. You have disallowed everything in the root, but then you are listing specific files to allow. Have you included robots meta tags in your pages?
  22. Interesting. I don't think I need to worry about it; in the future these forms will be moved into a login area, and that, along with the captcha, should remove any threat of spam. Wouldn't you think?