soadlink Posted February 4, 2007

Hello, I'm looking for a way to read large lists word by word (this will be done locally through the command line) from a txt file. The lists will be anywhere from 2,000 to 100,000 words, but it should be able to handle any amount. Something will then be done with each word (specified by me), after which the script moves on to the next word in the list. The list has one word per line, like:

word1
word2
word3

So it would be some kind of loop that runs until the script has gone through the whole list. Also, it should not alter the list at all, such as removing words or re-arranging them. I will change what happens with each word, but for an example you can just echo the word followed by a newline ("\n").

Thanks
soadlink Posted February 4, 2007 (Author)

Nevermind, got it with a little more searching:

$lines = array_map('rtrim', file('file.txt'));
foreach ($lines as $line) {
    echo $line;
}

In case anyone wonders, I wrap file() in array_map('rtrim', ...) because a plain $lines = file('c:\\list.txt'); leaves the trailing newline on each element (which showed up as a blank line at the end for me). It was the only way I knew to get rid of that.
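Since file() loads the entire list into memory at once, very large lists can also be read one line at a time with fgets(), so only one word is held in memory at any moment. A minimal sketch, not the poster's exact setup: the path c:\\list.txt is a hypothetical example, and the echo stands in for whatever processing is done with each word.

<?php
// Hypothetical path to the word list; one word per line.
$path = 'c:\\list.txt';

$handle = fopen($path, 'r');
if ($handle === false) {
    die("Could not open $path\n");
}

// Read one line at a time so the whole list never sits in memory.
while (($line = fgets($handle)) !== false) {
    $word = rtrim($line, "\r\n"); // strip only the trailing line ending
    if ($word === '') {
        continue;                 // skip blank lines, just in case
    }
    // Do something with $word here; for now just echo it back.
    echo $word . "\n";
}

fclose($handle);
?>

Because the file is opened read-only, the list itself is never modified or re-ordered.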