physaux Posted January 2, 2010

Hey guys, I'm trying to open a file and split it into however many files, with $numlines lines each. Can anyone advise me on the most efficient way to do it? I'm thinking of reading it line by line with a counter, then saving the pieces out as separate files. Is that an efficient way of doing it? I'll be dealing with 200k+ lines in a single text file, and I don't want to overload my shared server or hit a timeout on my page. Thanks!
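A minimal sketch of the counter-based approach described above (the file names and $numlines value are placeholders, not from the thread). Reading with fgets keeps memory flat no matter how large the input is, since only one line is buffered at a time, and each output handle is closed as soon as its chunk is done:

```php
<?php
// Hypothetical sample input: 2500 numbered lines, one per line.
file_put_contents('input.txt', implode("\n", range(1, 2500)) . "\n");

$numlines = 1000;               // lines per output file (illustrative)
$in       = fopen('input.txt', 'r');
$count    = 0;
$fileNo   = 0;
$out      = null;

while (($line = fgets($in)) !== false) {
    // Start a new part file every $numlines lines.
    if ($count % $numlines === 0) {
        if ($out !== null) {
            fclose($out);       // release each handle as soon as possible
        }
        $fileNo++;
        $out = fopen('part' . $fileNo . '.txt', 'w');
    }
    fwrite($out, $line);
    $count++;
}

if ($out !== null) {
    fclose($out);
}
fclose($in);
?>
```

With 2500 input lines and $numlines = 1000, this produces part1.txt and part2.txt with 1000 lines each and part3.txt with the remaining 500.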
RussellReal Posted January 2, 2010

You're better off doing it the way you're planning on doing it — otherwise you'd be calling fread 200k+ times instead of just buffering the whole content and working through it. But one thing I will say: close every file resource you create, or that will set off flags at your web host that you're using too much CPU time, and they'll lock your account.
sasa Posted January 2, 2010

On my netbook this code

<?php
$t = microtime(true);
// 65k lines; FILE_IGNORE_NEW_LINES strips the trailing "\n" that
// file() keeps on each element, so implode() below doesn't double them.
$test = file('tolstoy.txt', FILE_IGNORE_NEW_LINES);
$test = array_merge($test, $test);
$test = array_merge($test, $test);   // ~260k lines total
$test = array_chunk($test, 1000);
$i = 0;
foreach ($test as $part) {
    $i++;
    $f = fopen('part' . $i . '.txt', 'w');
    fwrite($f, implode("\n", $part));
    fclose($f);
}
echo microtime(true) - $t, " $i";
?>

outputs

1.6787929534912 262
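One subtlety for anyone reusing this: by default, file() keeps the trailing newline on each array element, so joining the elements with implode("\n", ...) would double every line break. Passing the FILE_IGNORE_NEW_LINES flag avoids that. A small self-contained check (demo.txt is a throwaway name):

```php
<?php
// Demonstrates how file() handles line endings with and without the flag.
file_put_contents('demo.txt', "a\nb\nc\n");

$with    = file('demo.txt');                         // elements keep "\n"
$without = file('demo.txt', FILE_IGNORE_NEW_LINES);  // newlines stripped

assert($with[0] === "a\n");
assert($without[0] === 'a');
// Joining the stripped lines reproduces the original content exactly:
assert(implode("\n", $without) . "\n" === "a\nb\nc\n");
?>
```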
This topic is now archived and is closed to further replies.