Muddy_Funster
Posts: 3,372
Days Won: 18

Everything posted by Muddy_Funster
-
Almost. I have re-written your code. It uses a few functions you may not have come across - if you don't know them, please learn them before using them in your own code. Have a look at this, it should work:

<?php
// Make a MySQL Connection
mysql_connect("127.0.0.1", "root", "") or die(mysql_error());
mysql_select_db("users") or die(mysql_error());

$result = mysql_query("SELECT comments FROM users");
while($row = mysql_fetch_assoc($result)){
    if(!isset($dataString)){
        $dataString = $row['comments'];
    }
    else{
        $dataString .= "<br><br>{$row['comments']}";
    }
}

$page = <<<PAGE
<!DOCTYPE html>
<html>
<head>
<title></title>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
<body>
<form name="input" action="" method="get">
<div id="commentsr">
$dataString
</div>
<textarea id="myMessage" name="comments" cols="30" rows="10" style="width: 400px; height: 83px;"></textarea>
<input type="submit" value="Submit">
</form>
</body>
</html>
PAGE;

echo $page;
?>
-
yeah...you posted this in the wrong board, but perhaps this link will help http://www.mustbebuilt.co.uk/2012/07/27/adding-form-fields-dynamically-with-jquery/
-
mysql_query() returns a resource, not a dataset. You need to use mysql_fetch_assoc() to get the dataset. As for putting the comment into the database - there is nothing there that would perform an INSERT query to add the information; you're missing at least two thirds of the code you need to do what you want.
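Something along these lines (untested, and the table/column names are just placeholders - swap in your own) shows both halves of what's missing:

<?php
// assumes a mysql_connect()/mysql_select_db() call has already been made

// reading: loop the result resource with mysql_fetch_assoc() to get the rows
$result = mysql_query("SELECT comments FROM users") or die(mysql_error());
while ($row = mysql_fetch_assoc($result)) {
    echo $row['comments'] . "<br>";
}

// writing: nothing goes into the table without an INSERT query
if (isset($_POST['comments'])) {
    $comment = mysql_real_escape_string($_POST['comments']);
    mysql_query("INSERT INTO users (comments) VALUES ('$comment')") or die(mysql_error());
}
?>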
-
How to pass values from one page to another
Muddy_Funster replied to subhomoy's topic in PHP Coding Help
You could echo an AJAX request, using the PHP script to populate the variables from the $_GET array on the current page, to subload a section of the page using the data. Although, as vinny said, it's not really the best use of resources. If you want the code in the include to use the data on the current page, all you should need to do is include the file itself - no URL data needed. Here's the AJAX anyway.

$ajax = <<<AJAX_STR
<script type="text/javascript">
$(document).ready(function(){
    $.ajax({
        url: "your_url.php?value={$_GET['value']}",
        type: "get",
        success: function(data){
            $('#displayDIVid').html(data);
        },
        error: function(){
            alert("Some items could not be displayed.");
        }
    });
});
</script>
AJAX_STR;
echo $ajax;
-
How to pass values from one page to another
Muddy_Funster replied to subhomoy's topic in PHP Coding Help
The warning kinda says it all - the file name does not exist. You can't use require or include to pass variables into files like that, because at the point the require/include is called they are not being sent as variables but as a literal part of the file name. You need to pass the variables through a URL so that the HTTP request carries the data and the receiving script has access to the variable information you are trying to send. What exactly is your objective here? And have you ever used cURL or written any jQuery/JavaScript before?
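To illustrate (untested, file and variable names are just examples) - the value goes in the query string of the link or request, and the receiving script reads it back out of $_GET:

<?php
// calling page - pass the value through the URL, not through the include path
$value = urlencode($someValue); // $someValue is whatever you want to send
echo '<a href="page.php?value=' . $value . '">view</a>';
?>

<?php
// page.php - the value arrives in $_GET, so this script can actually use it
$value = isset($_GET['value']) ? $_GET['value'] : null;
echo "You sent: " . htmlspecialchars($value);
?>
-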
How to pass values from one page to another
Muddy_Funster replied to subhomoy's topic in PHP Coding Help
And the error is......!? hmm...needs a drum-roll I think..... -
Every string value that you are sending to the database server should be run through mysql_real_escape_string(). You should also have basic sense checking in place to make sure that values exist, have a practical length, and are indeed of an expected format. Also, you will need to sanitize numerical values on your own. And guys, it would be nice if you could both read the forum rules and start using code tags around all your code postings.
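For example (untested, and the field/table names are just placeholders):

<?php
// strings: check they exist, have a sane length, then escape them
$name = isset($_POST['name']) ? trim($_POST['name']) : '';
if ($name === '' || strlen($name) > 100) {
    die('Invalid name');
}
$name = mysql_real_escape_string($name);

// numbers: escaping alone is not enough - force them to the expected type and range
$age = isset($_POST['age']) ? (int)$_POST['age'] : 0;
if ($age < 1 || $age > 130) {
    die('Invalid age');
}

$sql = "INSERT INTO users (name, age) VALUES ('$name', $age)";
?>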
-
Your problem is that you are not sanitizing your input strings, thus your apostrophe is breaking the string that is being sent to the SQL server. It's a far bigger problem than you think, as it means you are wide open to SQL injection. I'll assume you are using the mysql_ library - as such you should be running every input string through mysql_real_escape_string() before sending it through your query. If you happen to be using PDO or mysqli_ then you should be using prepared statements.
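If you do move to PDO, the prepared statement version looks something like this (untested - connection details and the table name are placeholders):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=your_db', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// the apostrophe in the value can no longer break the query
$stmt = $pdo->prepare("INSERT INTO comments (comment) VALUES (:comment)");
$stmt->execute(array(':comment' => $_POST['comment']));
?>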
-
Which PHP Editor do you think is the best? [v2]
Muddy_Funster replied to Daniel0's topic in Miscellaneous
I agree, I don't think "what's the best" is an accurate question when it comes to code editors/IDEs. What is a feature to one group can be a pain in the backside for another. I don't think that there is a universal right answer, although there may be; I have used a pitiful number of applications to code in. I do like it simple myself, and have been prone to using PSPad for a number of years now. I drifted into using Eclipse for a little while, but setting up workspaces and projects was too much of a pain in the backside after a while and I just gave up. I also couldn't be bothered with the load-up time of starting Eclipse every time I wanted to go in and do a bit of coding.

I have just recently started using an IDE/editor called CodeLobster, and I have to say, I'm quite liking it. It doesn't suffer from PSPad's single most infuriating habit of auto-closing absolutely everything as soon as you open it, i.e. you type a double quote and instantly you have two on the page, one in front of the cursor and one behind. It drives me nuts. CodeLobster also has an integrated debugger - which I haven't tried yet - and if you set up your webroot it lets you preview pages in a single click, without having to change window and hit F5. Like other environments it lets you change the color theme in as few as 4 clicks (choices include all the popular editors), offering a preview of how each theme looks on each type of code (CSS/PHP/HTML/etc.). All in, it's shaping up to be a pretty polished bit of software. The basic version is free, with the option to spend a nominal amount to upgrade to Lite or a bit more to get Pro. I was going to go into the differences, but I read it through and it looked like a rather shameless advert, so I deleted it and I'll leave it there. -
Try changing your file string - it's a Windows machine, so use the normal Windows file system directory delimiters:

$file = "C:\\wamp\\www\\NetOptUI2\\input.txt";

Also, you never close any of your input tags in the HTML; that's probably unrelated, but bad coding nonetheless. And I don't see anywhere that $_REQUEST['savedata'] is being set to "1".
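For what it's worth, something like this (untested - just mirroring the names your check uses) is the usual way that flag gets set:

<form action="" method="post">
    <input type="hidden" name="savedata" value="1">
    <textarea name="data"></textarea>
    <input type="submit" value="Save">
</form>

<?php
if (isset($_REQUEST['savedata']) && $_REQUEST['savedata'] == "1") {
    $file = "C:\\wamp\\www\\NetOptUI2\\input.txt";
    file_put_contents($file, $_REQUEST['data']);
}
?>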
-
Finding bad words from large list of email addressess
Muddy_Funster replied to abhilashss's topic in PHP Coding Help
what possible legitimate reason could you have to be holding almost 1 million unverified email addresses? -
may we ask what's broken about it?
-
Not able to insert a record mySQL database
Muddy_Funster replied to RameshPeriya's topic in MySQL Help
what does the mysql_error() return for said problem? -
Robot Detection Class - For Your Use
Muddy_Funster replied to Muddy_Funster's topic in Beta Test Your Stuff!
Thanks again kicken, I have made the changes you suggested. Also, I have added a sort on the $this->excludes array, as it was failing the hash check if the same values were entered as the previous run but in a different order.

/**
 * Generates a list of robot useragent definitions for use with
 * $_SERVER['HTTP_USER_AGENT'] to identify robots
 *
 * A Huge Thank You to Psycho, Kicken and Thorpe @ forums.phpfreaks.com
 * for their help and advice.
 *
 * This links into the robotstxt.org site to access their current
 * robot list. It then produces an array of these useragents that
 * can be used to check if a visitor is a robot or not.
 * Call   : $yourVar = new getRobots();
 * Setter : $yourVar->setExclude(mixed $mixed) - send values to be excluded.
 *          Accepts either an array of values or a single string value
 * Getter : $robotArray = $yourVar->robots;
 * JSON output (if you want to pass to javascript): echo $yourVar;
 *
 * --------------------------------------------------------------
 * @example 1 : PHP BOT Check
 *
 * $bots = new getRobots;
 * $bots->setExclude(array("", "none", "no", "yes"));
 * $bots->makeBots();
 * $botArray = $bots->robots;
 *
 * if(!in_array($_SERVER['HTTP_USER_AGENT'], $botArray)){
 *     import_request_variables("g", "user_"); //example of something to do
 *     ...
 *     ...
 * }
 * else{
 *     echo "Bot Safe Site Visited"; //example of something to do
 *     ...
 *     ...
 * }
 * -------------------------------------------------------------
 * @example 2 : output to JSON
 *
 * $bots = new getRobots;
 * $bots->setExclude("");
 * $bots->setExclude("none");
 * $bots->setExclude("???");
 * $bots->setExclude("no");
 * $bots->setExclude("yes");
 * $bots->makeBots();
 *
 * header("Content-type: application/json");
 * echo $bots;
 * exit;
 * -----------------------------------------------------------
 *
 * @param array  $robots       the array list of useragents
 * @param array  $excludes     array of exclusions from the bot list
 * @param string $url          static url value for linking to the robotstxt.org robot list
 * @param string $lfPath       path to generated subfolder to store cache files in
 * @param string $masterFile   path to master cache file of robotstxt.org data
 * @param string $botFile      path to cached bot file for quicker repeat array building
 * @param string $mdCheckFile  path to md5 checksum cache to establish if cached bot file can be used
 * @param array  $hashVals     generated md5 values from current call
 * @param array  $hashFileVals values from md5 checksum cache file used for comparison
 * @param string $output       contents retrieved from robotstxt.org site
 * @return array  makeBots()   returns array of robot useragents
 * @return string __toString() Returns JSON string of Object{"robots":array[{"numericalKey":"useragentText"}]}
 */
class getRobots{
    public $robots;
    public $excludes;
    private $url;
    private $lfPath;
    private $masterFile;
    private $botFile;
    private $mdCheckFile;
    private $hashVals;
    private $hashFileVals;
    private $output;

    public function __construct(){
        $this->url = "http://www.robotstxt.org/db/all.txt";
        $this->lfPath = substr(__FILE__, 0, strripos(__FILE__, '\\') + 1).'robots';
        $this->masterFile = $this->lfPath.'\\rbtList.txt';
        $this->botFile = $this->lfPath."\\allBots.txt";
        $this->mdCheckFile = $this->lfPath."\\mdHashFile.txt";
        $this->excludes[] = "Due to a deficiency in Java it's not currently possible to set the User-Agent.";
        $this->excludes[] = "Due to a deficiency in Java it's not currently possible";
        if(!is_dir($this->lfPath)){
            if(!mkdir($this->lfPath)){
                throw new RuntimeException("error creating directory! PHP must have write permissions for this folder -- {$this->lfPath}");
            }
        }
    }

    public function setExclude($mixed){
        $mixed = (array)$mixed;
        $this->excludes = array_merge($this->excludes, $mixed);
        $this->excludes = array_unique($this->excludes);
        sort($this->excludes);
    }

    public function makeBots(){
        $this->checkFile();
        $this->checkBotList();
    }

    private function checkFile(){
        if (file_exists($this->masterFile)){
            $mtime = filemtime($this->masterFile);
            $ctx = stream_context_create(array(
                'http' => array(
                    'header' => "If-modified-since: ".gmdate(DATE_RFC1123, $mtime)
                )
            ));
        }
        else {
            $ctx = stream_context_create();
        }
        $fp = fopen($this->url, 'rb', false, $ctx);
        $this->output = stream_get_contents($fp);
        $meta = stream_get_meta_data($fp);
        if (strpos($meta['wrapper_data'][0], ' 200 ') !== false){
            file_put_contents($this->masterFile, $this->output);
        }
        fclose($fp);
    }

    private function checkBotList(){
        $robots = array();
        $this->hashVals[0] = md5(implode("|", $this->excludes));
        if(!file_exists($this->mdCheckFile)){
            $fileVals = explode("\n", $this->output);
        }
        else{
            $this->hashFileVals = file($this->mdCheckFile);
            if(trim($this->hashVals[0]) == trim($this->hashFileVals[0])){
                $this->robots = file($this->botFile);
            }
            else{
                $fileVals = file($this->masterFile);
            }
        }
        if(isset($fileVals)){
            foreach ($fileVals as $line=>$text){
                if (strpos($text, "robot-useragent:") !== FALSE){
                    $robots[] = trim(substr($text, 16));
                }
            }
            $filterRobs = array_diff($robots, $this->excludes);
            $filterRobs = array_unique($filterRobs);
            $this->robots = $filterRobs;
            $botOut = implode("\n", $filterRobs);
            $botHandle = fopen($this->botFile, 'w');
            fwrite($botHandle, $botOut);
            fclose($botHandle);
            $this->hashVals[1] = md5(implode("|", $filterRobs));
            $difCheck = array_diff($this->hashVals, (array)$this->hashFileVals);
            if(count($difCheck) >= 1){
                $writeback = implode("\n", $this->hashVals);
                $mdHandle = fopen($this->mdCheckFile, 'w');
                fwrite($mdHandle, $writeback);
            }
        }
    }

    public function __toString(){
        return json_encode(array('robots' => $this->robots));
    }
}
-
Create Real Estate Report With Demographics and Maps
Muddy_Funster replied to bankman3000's topic in PHP Coding Help
If you have never programmed before then I have to say - you're aiming too high. The only direction I can honestly give you is to a college or other learning establishment where you can take some classes in PHP web application development. You need to get a sound grounding in the core principles of programming and in the languages that you are using before you can even think about taking on something the likes of what you have described. The project above looks on the surface like it's a four figure development cost, and that's being conservative. That's not something you can just walk off the street and code. -
Robot Detection Class - For Your Use
Muddy_Funster replied to Muddy_Funster's topic in Beta Test Your Stuff!
OK guys, I think I have this nailed now. I have changed the setExclude as you suggested Psycho, and have tested it in all ways I expect to use it. I totally re-wrote the cached file checking because the first time round I made a total mess of it. Anyway I have updated the DocBlock as best as I can work it out (shouldn't be too far off) and what started as the idea of having a check to feed only SEO content to bots and to only render forms to actual users has, with a not insignificant amount of help, turned into this:

/**
 * Generates a list of robot useragent deffinitions for use with
 * $_SERVER['HTTP_USER_AGENT'] to identify robots
 *
 * A Huge Thank You to Psycho, Kicken and Thorpe @ forums.phpfreaks.com
 * for their help and advice.
 *
 * This links into the robotstext.org site to access thier current
 * robot list. It then produces an array of these useragents that
 * can be used to check if a visitor is a robot or not.
 * Call: $yourVar = new getRobots();
 * Setter : $yourVar->setExclude(mixed $mixed)
 * Getter : $robotArray = $yourVar->getBots;
 * $yourVar->exclude(mixed $mixed); - send values to be excluded.
 * Accepts either an array of values or a single string vlaue
 * JSON output (if you want to pass to javascript): echo $yourVar;
 *
 *
 * @param array $robots the array list of useragents
 * @param array $excludes array of exlusions from the bot list
 * @param string $url static url value for linking to the
 * @param string $lfPath path to generate subfolder to store cache files in
 * @param string $masterFile path to master cache file of robotstxt.org data
 * @param string $botFile path to cached bot file for qicker repeat array building
 * @param string $mdCheckFile path to md5Checksum cache to establish if cached bot file can be used
 * @param array $hashVals generated md5 values from current call
 * @param array $hashFileVals values from md5 checksum cache file use for comparison
 * @param string $output contents retrieved from robotstxt.org site
 * @return array getBots() returns array of robot user aganents
 * @return string __toString() Returns JSON string of Object{"robots":array[{"numericalKey":"useragentText"}]
 */
class getRobots{
    public $robots;
    public $excludes;
    private $url;
    private $lfPath;
    private $masterFile;
    private $botFile;
    private $mdCheckFile;
    private $hashVals;
    private $hashFileVals;
    private $output;

    public function __construct(){
        $this->url = "http://www.robotstxt.org/db/all.txt";
        $this->lfPath = substr(__FILE__, 0, strripos(__FILE__, '\\') + 1).'robots';
        $this->masterFile = $this->lfPath.'\\rbtList.txt';
        $this->botFile = $this->lfPath."\\allBots.txt";
        $this->mdCheckFile = $this->lfPath."\\mdHashFile.txt";
        $this->excludes[] = "Due to a deficiency in Java it's not currently possible to set the User-Agent.";
        $this->excludes[] = "Due to a deficiency in Java it's not currently possible";
        if(!is_dir($this->lfPath)){
            if(!mkdir($this->lfPath)){
                echo "error creating directory! PHP must have write permissions for this folder -- $lfPath";
                return false;
                exit;
            }
        }
    }

    public function setExclude($mixed){
        if(!is_array($mixed)){
            $mixed = (array)$mixed;
        }
        $this->excludes = array_merge($this->excludes, $mixed);
        $this->excludes = array_unique($this->excludes);
    }

    public function getBots(){
        $this->checkFile();
        $this->checkBotList();
    }

    private function checkFile(){
        if (file_exists($this->masterFile)){
            $mtime = filemtime($this->masterFile);
            $ctx = stream_context_create(array(
                'http' => array(
                    'header' => "If-modified-since: ".gmdate(DATE_RFC1123, $mtime)
                )
            ));
        }
        else {
            $ctx = stream_context_create();
        }
        $fp = fopen("http://www.robotstxt.org/db/all.txt", 'rb', false, $ctx);
        $this->output = stream_get_contents($fp);
        $this->checkBotList();
        $meta = stream_get_meta_data($fp);
        if (strpos($meta['wrapper_data'][0], ' 200 ') !== false){
            file_put_contents($this->masterFile, $this->output);
        }
        fclose($fp);
    }

    private function checkBotList(){
        $robots = array();
        $this->hashVals[0] = md5(implode("|", $this->excludes));
        if(!file_exists($this->mdCheckFile)){
            $fileVals = explode("\n", $this->output);
        }
        else{
            $this->hashFileVals = file($this->mdCheckFile);
            if(trim($this->hashVals[0]) == trim($this->hashFileVals[0])){
                $this->robots = file($this->botFile);
            }
            else{
                $fileVals = file($this->masterFile);
            }
        }
        if(isset($fileVals)){
            foreach ($fileVals as $line=>$text){
                if (strpos($text, "robot-useragent:") !== FALSE){
                    $robots[] = trim(substr($text, 16));
                }
            }
            $filterRobs = array_diff($robots, $this->excludes);
            $filterRobs = array_unique($filterRobs);
            $this->robots = $filterRobs;
            $botOut = implode("\n", $filterRobs);
            $botHandle = fopen($this->botFile, 'w');
            fwrite($botHandle, $botOut);
            fclose($botHandle);
            $this->hashVals[1] = md5(implode("|", $filterRobs));
            $difCheck = array_diff($this->hashVals, (array)$this->hashFileVals);
            if(count($difCheck) >= 1){
                $writeback = implode("\n", $this->hashVals);
                $mdHandle = fopen($this->mdCheckFile, 'w');
                fwrite($mdHandle, $writeback);
            }
        }
    }

    public function __toString(){
        $json = "{\"robots\":[".json_encode($this->robots)."]}";
        return $json;
    }
}

If there is anything else let me know, and thanks again guys.
-
Robot Detection Class - For Your Use
Muddy_Funster replied to Muddy_Funster's topic in Beta Test Your Stuff!
OK, been having a go at making the changes suggested. Now I have to confess to getting a little bit lost in it all, and I'm not sure if this is actually where I should have gone with it, but here is what I am sitting with at the moment. It creates the files on first run, and doesn't seem to update them if the conditions all match, but I'm not exactly sure if I have it set up right to handle partial changes:

<?php
/**
 * Generates a list of robot useragent deffinitions for use with
 * $_SERVER['HTTP_USER_AGENT'] to identify robots
 *
 * This links into the robotstext.org site to access thier current
 * robot list. It then produces an arrau of these useragents that
 * can be used to check if a visitor is a robot or not.
 * Call: $yourVar = new getRobots();
 * $robotArray = $yourVar->robots;
 * $yourVar->exclude(mixed $mixed); - send values to be excluded.
 * Accepts either an array of values or a single string vlaue
 * JSON output (if you want to pass to javascript): echo $yourVar;
 *
 *
 * @param array $robots the array list of useragents
 * @return __toString Returns JSON string of Object{"robots":array[{"numericalKey":"useragentText"}]
 */
class getRobots{
    public $robots;
    public $excludes;
    private $url;
    private $lfPath;
    private $lfFile;
    private $hashVals;
    private $output;

    public function __construct(){
        $this->url = "http://www.robotstxt.org/db/all.txt";
        $this->lfPath = substr(__FILE__, 0, strripos(__FILE__, '\\') + 1).'robots';
        $this->lfFile = '\\rbtList.txt';
        $this->excludes[] = "Due to a deficiency in Java it's not currently possible to set the User-Agent.";
        $this->excludes[] = "Due to a deficiency in Java it's not currently possible";
        if(!is_dir($this->lfPath)){
            if(!mkdir($this->lfPath)){
                echo "error creating directory! PHP must have write permissions for this folder -- $lfPath";
                return false;
                exit;
            }
        }
    }

    public function setExclude($mixed){
        if(is_array($mixed)){
            foreach($mixed as $key=>$toExclude){
                $this->excludes[] = $toExclude;
            }
        }
        else{
            $this->excludes[] = $mixed;
        }
    }

    public function getBots(){
        $this->checkHashes();
        $this->robots = file($this->lfPath."\\justBots.txt");
    }

    private function checkFile(){
        if (file_exists($this->lfPath.$this->lfFile)){
            $mtime = filemtime($this->lfPath.$this->lfFile);
            $ctx = stream_context_create(array(
                'http' => array(
                    'header' => "If-modified-since: ".gmdate(DATE_RFC1123, $mtime)
                )
            ));
        }
        else {
            $ctx = stream_context_create();
        }
        $fp = fopen("http://www.robotstxt.org/db/all.txt", 'rb', false, $ctx);
        $this->output = stream_get_contents($fp);
        $this->checkBotList();
        $meta = stream_get_meta_data($fp);
        if (strpos($meta['wrapper_data'][0], ' 200 ') !== false){
            file_put_contents($this->lfPath.$this->lfFile, $this->output);
        }
        fclose($fp);
    }

    private function checkBotList(){
        if(!empty($this->output)){
            $oEx = explode("\n", $this->output);
        }
        else{
            $oEx = file($this->lfPath."\\justBots.txt");
        }
        foreach ($oEx as $key=>$line){
            if(strpos($line, 'robot-useragent:') !== FALSE){
                $robots[] = trim(substr($line, 16));
            }
        }
        if(isset($robots)){
            foreach($this->excludes as $exclude){
                foreach(array_keys($robots, $exclude) as $key){
                    $drop[] = $key;
                }
            }
            foreach($drop AS $idx){
                unset($robots[$idx]);
            }
            array_unique($robots);
            $bf = fopen($this->lfPath."\\justBots.txt", 'w');
            $bots = implode("\n", $robots);
            $this->hashVals[1] = md5($bots);
            fwrite($bf, $bots);
            fclose($bf);
        }
    }

    private function checkHashes(){
        $this->hashVals[0] = md5(implode("\n", $this->excludes));
        if(!file_exists($this->lfPath.'\\mdHashFile.txt')){
            $this->checkFile();
            $hf = fopen($this->lfPath."\\mdHashFile.txt", 'w');
            $hashOut = implode("\n", $this->hashVals);
            fwrite($hf, $hashOut);
        }
        else{
            $hfVals = file($this->lfPath."\\mdHashFile.txt");
            if($hfVals[0] != $this->hashVals[0]){
                $this->checkBotList();
            }
            else{
                $this->robots = file($this->lfPath."\\justBots.txt");
            }
        }
    }

    public function __toString(){
        $json = "{\"robots\":[".json_encode($this->robots)."]}";
        return $json;
    }
}

$robo = new getRobots();
$robo->setExclude(array("", "no", "yes", "null"));
$robo->getBots();
var_dump($robo->robots);
?>

Oh - and I haven't updated the DocBlock yet...going to wait until it's accurate. What do you guys think? How wide of the mark am I?
-
Robot Detection Class - For Your Use
Muddy_Funster replied to Muddy_Funster's topic in Beta Test Your Stuff!
@kicken - Thanks for that, that's given me some reading to do. I haven't even looked at the stream_ functions at all, so that's going to take some looking into. @psycho - I must have really screwed up the code on the if/else: the IF was for when $mixed was an array - it tries to match the array value from $mixed with the current iteration of $this->robots and drops it if it matches. The ELSE was to match the current iteration against the value of $mixed as a direct string comparison, allowing $robots->exclude($mixed) to be sent either a single string or an array. I'll definitely get in about it tomorrow and post up something taking your suggestions on board. Thanks for the time and help guys, I appreciate it and will try to make the most of your comments.
-
Robot Detection Class - For Your Use
Muddy_Funster replied to Muddy_Funster's topic in Beta Test Your Stuff!
Hey Psycho, thanks for taking the time to pass that feedback. I appreciate it. I'll get on making some of the changes tomorrow morning once I'm back in the office. In answer to some of what you said - I wrote the class with the intention of passing a flag into session on pass, so the check would only be run once per agent / per visit. It was such a tiny difference calling against the remote site vs a local copy that I decided just to access the remote each time; plus, being totally honest, I wouldn't know how to make a check on the files (local vs remote) any more efficient than to load up the remote file each time anyway.

I will add some commenting to explain the exclude array better - I will confess I do tend to under-comment code I write for myself. The else in the runExclusion is there so that it can take either an array or a single string as input. I did test this and it worked fine (at least I certainly thought that it did), though I will revisit it tomorrow and examine exactly what it's doing.

I am very interested in your suggestion of using the checksum, although I have never done anything using that kind of check. I'll look into it as soon as I get some time and see if I can't refactor the script to use it some time in the not too distant future. I went down the explode / rebuild route as, when I first started, it looked like I was simply dealing with a tab delimited list; that turned out not to be the case as it was simply using spaces and not tabs. I just instantly fell into the thought that "I want to split a string based on a delimiter" and wound up changing the code to fit the thought process rather than the other way round. I will have a shot at the strpos() idea as it does sound cleaner. I'll also look into applying the array_filter() that you suggested.

Using both the runExclusion and exclude methods instead of just using the exclude method.....to be honest, looking at it now I don't even know what I was thinking; I'll get that sorted in the morning. The pre tag is just my carelessness, no excuses on that one, so it will go as soon as I get back in front of the code. And I'll swap out the array_push for the use of $array[] at the same time. Once again, thanks for taking the time to read over the code and comment on it. Without the crit it (and I) wouldn't get any better.
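For anyone reading along, the checksum idea boils down to something like this (untested, and the file name is just an example):

<?php
// compare a hash of the current exclude list against the one saved last run;
// only rebuild the filtered bot list when something has actually changed
$excludes = array("", "no", "yes");
$currentHash = md5(implode("|", $excludes));

$hashFile = "mdHashFile.txt"; // hypothetical cache file
$savedHash = file_exists($hashFile) ? trim(file_get_contents($hashFile)) : '';

if ($currentHash !== $savedHash) {
    // ... rebuild the bot list here ...
    file_put_contents($hashFile, $currentHash);
}
?>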
-
How to create simple CMS without database
Muddy_Funster replied to Rita_Ruah's topic in Application Design
Why would you not want to use a database? Without a database you are going to be dependent on flat file storage and filesystem I/O commands.
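Just to illustrate what the flat file route ends up looking like (untested - the file name and structure are only an example):

<?php
$file = 'pages.json'; // hypothetical storage file

// load existing pages (or start fresh)
$pages = file_exists($file) ? json_decode(file_get_contents($file), true) : array();

// add/update a page and write the whole lot back out
$pages['about'] = array('title' => 'About us', 'body' => 'Some content...');
file_put_contents($file, json_encode($pages), LOCK_EX);
?>
-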
Robot Detection Class - For Your Use
Muddy_Funster replied to Muddy_Funster's topic in Beta Test Your Stuff!
OK, here's the revision:

<?php
/**
 * Generates a list of robot useragent deffinitions for use with
 * $_SERVER['HTTP_USER_AGENT'] to identify robots
 *
 * This links into the robotstext.org site to access thier current
 * robot list. It then produces an arrau of these useragents that
 * can be used to check if a visitor is a robot or not.
 * Call: $yourVar = new getRobots();
 * $robotArray = $yourVar->robots;
 * $yourVar->exclude(mixed $mixed); - send values to be excluded.
 * Accepts either an array of values or a single string vlaue
 * JSON output (if you want to pass to javascript): echo $yourVar;
 *
 *
 * @param array $robots the array list of useragents
 * @return __toString Returns JSON string of Object{"robots":array[{"numericalKey":"useragentText"}]
 */
class getRobots{
    public $robots = array();

    public function __construct($url = "http://www.robotstxt.org/db/all.txt"){
        $fullList = file($url);
        $exlusions = array //default exclusion list
        (
            "Due to a deficiency in Java it's not currently possible to set the User-Agent.",
            "Due to a deficiency in Java it's not currently possible",
        );
        echo "<pre>";
        foreach ($fullList as $line=>$content){
            $delimit = ":";
            $split = explode($delimit, $content);
            if(trim($split['0']) == "robot-useragent"){
                $conCount = count($split);
                $agent = "";
                for($i=0; $i<$conCount; $i++){
                    if($i != 0){
                        $conPart = $i;
                        $agent .= " {$split[$conPart]} ";
                    }
                }
                array_push($this->robots, trim($agent));
            }
        }
        $this->runExclusion($exlusions);
    }

    public function exclude($mixed){
        $this->runExclusion($mixed);
    }

    private function runExclusion($mixed){
        if(is_array($mixed)){
            foreach($this->robots as $key=>$agent){
                if(in_array(trim($agent), $mixed)){
                    unset($this->robots[$key]);
                }
            }
        }
        else{
            foreach($this->robots as $key=>$agent){
                if(trim($agent) == trim($mixed)){
                    unset($this->robots[$key]);
                }
            }
        }
    }

    public function __toString(){
        $json = "{\"robots\":[".json_encode($this->robots)."]}";
        return $json;
    }
}

$robo = new getRobots();
$robo->exclude(array("", "no", "None", "???", "no", "yes"));
var_dump($robo);
?>

Any other/new comments?
-
Robot Detection Class - For Your Use
Muddy_Funster replied to Muddy_Funster's topic in Beta Test Your Stuff!
hmm...there's more than a little bit of truth there, let me revise.
-
I have knocked up this little class for retrieving a list of known robot user agents from the really rather helpful people over at robotstxt.org. It pulls info from their site and builds an array that can be used to compare against the $_SERVER['HTTP_USER_AGENT'] variable. It has an exclusion array that can be altered to suit your personal preferences, and it can be echoed directly to produce a valid JSON string that can be passed as-is to jQuery/JavaScript using AJAX or anything of that like. I am putting no restrictions on this, but the people over at robotstxt.org do request that you give them a mention for accessing their data, so I leave that up to anyone who may want to use it. Anyway, I found the need to be able to ensure bots didn't get free rein of the site I was making and thought that some other people out there may have a use for this. Here it is, enjoy (maybe) - anyway, let me know what you guys think of it. (p.s. - I'm new to the whole DocBlock thing...)

<?php
/**
 * Generates a list of robot useragent deffinitions for use with
 * $_SERVER['HTTP_USER_AGENT'] to identify robots
 *
 * This links into the robotstext.org site to access thier current
 * robot list. It then produces an arrau of these useragents that
 * can be used to check if a visitor is a robot or not.
 * Call: $yourVar = new getRobots();
 * $robotArray = $yourVar->robots;
 *
 * JSON output (if you want to pass to javascript): echo $yourVar;
 *
 *
 * @param string $url Link to robotstxt.org server
 * @param array $robots the array list of useragents
 * @return __toString Returns JSON string of Object{"robots":array[{"numericalKey":"useragentText"}]
 */
class getRobots{
    public $url;
    public $robots = array();

    public function __construct($url = "http://www.robotstxt.org/db/all.txt"){
        $fullList = file($url);
        $exlusions = array //add lines here to include exclusions for any other agents in the list
        (
            "",
            "no",
            "Due to a deficiency in Java it's not currently possible to set the User-Agent.",
            "???",
            "no",
            "yes"
        );
        echo "<pre>";
        foreach ($fullList as $line=>$content){
            $delimit = ":";
            $split = explode($delimit, $content);
            if(trim($split['0']) == "robot-useragent"){
                $conCount = count($split);
                $agent = "";
                for($i=0; $i<$conCount; $i++){
                    if($i != 0){
                        $conPart = $i;
                        $agent .= " {$split[$conPart]} ";
                    }
                }
                array_push($this->robots, trim($agent));
            }
        }
        foreach($this->robots as $key=>$agent){
            if(in_array($agent, $exlusions)){
                unset($this->robots[$key]);
            }
        }
    }

    public function __toString(){
        $json = "{\"robots\":[".json_encode($this->robots)."]}";
        return $json;
    }
}
?>
-
People don't generally blacklist your server just because you have a PHP mail script. They normally do it because your script is insecure or poorly coded and is being either exploited or misfired. Show us your code.
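For reference, the classic hole is header injection through the form fields - something like this (untested, addresses and field names are just examples) is the sort of check that keeps a mail script from being abused:

<?php
$email   = isset($_POST['email'])   ? trim($_POST['email'])   : '';
$subject = isset($_POST['subject']) ? trim($_POST['subject']) : '';
$message = isset($_POST['message']) ? $_POST['message']       : '';

// newlines in user-supplied values let spammers inject extra headers/recipients
if (preg_match('/[\r\n]/', $email . $subject) || !filter_var($email, FILTER_VALIDATE_EMAIL)) {
    die('Invalid input');
}

mail('you@example.com', $subject, $message, "From: noreply@example.com\r\nReply-To: $email");
?>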
-
I get that you think the relationship is a complex one, that's why I asked you to run a DESCRIBE on them and let us see what's there. If it's proving to be this convoluted to perform a simple insert then it's most likely that the design needs some work. One possible way could be to get the max id, but that is well south of being a reliable solution:

INSERT INTO dataRecSpec (J_RefNum, DRS_Name)
VALUES ((SELECT MAX(job_ID) FROM jobTableName), '$text')

But I really recommend against using this. I don't get why mysql_insert_id() is returning 0 if you are sure the last query was the insert into the job table - it's only supposed to return 0 if there was no auto_increment id generated by the last query. As I say - I think this has become such a headache because of a design flaw which we could possibly work to fix.
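For reference, the intended pattern is simply to grab the id straight after the first insert, on the same connection, before any other query runs - something like this (untested, and the column names are guesses based on what you've posted):

<?php
mysql_query("INSERT INTO jobTableName (J_Name) VALUES ('$jobName')") or die(mysql_error());

// grab the auto_increment id from the insert we just ran
$jobId = mysql_insert_id();

mysql_query("INSERT INTO dataRecSpec (J_RefNum, DRS_Name) VALUES ($jobId, '$text')") or die(mysql_error());
?>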