doublebassdanny Posted July 16, 2008

Hey all, I am having some trouble with a small script that processes a HUGE text file line by line. It works perfectly for several thousand lines, but then fails for no apparent reason. Could you take a look and see if there is something blatant I'm missing? I'm writing the output to a MySQL db, and that part works great; the only problem is that it eventually fails. I have text files in the 5 GB+ range. Here is the code:

<?php
$myFile = $_POST["filename"];

// arguments are hostname, username, password
$con = mysql_connect("localhost", "------", "------");
if (!$con) {
    die('Could not connect: ' . mysql_error());
}
mysql_select_db("md5", $con);

$handle = @fopen($myFile . ".txt", "r");
if ($handle) {
    while (!feof($handle)) {
        $str = fgets($handle);
        // collapse runs of whitespace and trim the line
        $str = trim(preg_replace('/\s+/', ' ', $str));
        $hash = md5($str);
        mysql_query("INSERT INTO hashes (word, hash) VALUES ('$str', '$hash')");
        echo $str . " (" . $hash . ") <b> ADDED! </b><br>";
    }
    fclose($handle);
}
?>

Thanks all.
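Editor's note: two likely culprits for a script that dies partway through a long run are PHP's max_execution_time killing the process, and a line containing a quote character breaking the INSERT statement. Below is a minimal hardened sketch, not a verified fix; it assumes the same hashes(word, hash) table and the mysql_* extension that was current in 2008, with placeholder credentials.

<?php
// Sketch: the same loop, hardened against the two most likely failure modes.
set_time_limit(0);               // don't let max_execution_time kill a long run

// placeholder credentials
$con = mysql_connect("localhost", "user", "pass");
if (!$con) {
    die('Could not connect: ' . mysql_error());
}
mysql_select_db("md5", $con);

$myFile = $_POST["filename"];
$handle = fopen($myFile . ".txt", "r");
if ($handle) {
    // fgets() returns false at EOF, so this avoids the extra feof() iteration
    while (($line = fgets($handle)) !== false) {
        $str = trim(preg_replace('/\s+/', ' ', $line));
        if ($str === '') {
            continue;                            // skip blank lines
        }
        $hash = md5($str);
        // escape the word so quotes and backslashes can't break the query
        $safe = mysql_real_escape_string($str, $con);
        if (!mysql_query("INSERT INTO hashes (word, hash) VALUES ('$safe', '$hash')", $con)) {
            // surface the real error instead of failing silently
            die('Insert failed: ' . mysql_error($con));
        }
    }
    fclose($handle);
}
?>

Dropping the per-line echo also helps on multi-gigabyte files: buffering millions of lines of HTML output is a plausible source of memory exhaustion on its own.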
doublebassdanny Posted July 16, 2008 (Author)

Oh, I forgot to add: is there a way to speed up the insertion into the database? I'll be adding millions of records at a time, but they are all very small (8 characters), and I think my database could handle multiple connections. Right now, though, I just want the script to get from the top of the file to the bottom. That's the first and main priority.
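Editor's note: one common way to speed up bulk loading is to batch rows into multi-row INSERT statements, so the per-query overhead is paid once per few hundred rows instead of once per row. A sketch under the same assumptions as above ($con and $handle already set up, hashes(word, hash) table; the batch size of 500 is arbitrary):

<?php
// Sketch: accumulate value tuples and flush them in multi-row INSERTs.
$batch = array();
$batchSize = 500;

while (($line = fgets($handle)) !== false) {
    $str = trim(preg_replace('/\s+/', ' ', $line));
    if ($str === '') {
        continue;
    }
    $safe = mysql_real_escape_string($str, $con);
    $batch[] = "('" . $safe . "', '" . md5($str) . "')";

    if (count($batch) >= $batchSize) {
        mysql_query("INSERT INTO hashes (word, hash) VALUES " . implode(',', $batch), $con);
        $batch = array();
    }
}

// flush the final partial batch
if (!empty($batch)) {
    mysql_query("INSERT INTO hashes (word, hash) VALUES " . implode(',', $batch), $con);
}
?>

For the largest files, MySQL's LOAD DATA INFILE (generating a delimited word/hash file first) is typically faster still, since it bypasses per-statement parsing entirely.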
This topic is now archived and is closed to further replies.