
<?php
$orig = file('list.txt');       // loads the entire file into an array of lines
$new = array();
$new[] = 'dmy';                 // dummy entry so output numbering starts at 1
$slice_size = 8000000;          // lines per output file

for ($i = 0; $i < count($orig); $i += $slice_size) {
    $new[] = array_slice($orig, $i, $slice_size);
}

for ($i = 1; $i < count($new); ++$i) {
    file_put_contents('WordList [' . $i . '].txt', implode('', $new[$i]));
}
?>

 

I have text files and wordlists with up to 250 million lines. I need this script to split the main file, which is upwards of 800 MB, into files of any size I specify. The script works for smaller files, but it won't even split a file of 6 million lines (~58 MB).

 

Any help with this is appreciated.

https://forums.phpfreaks.com/topic/240330-this-works-for-smaller-files-but/

The problem is that file() loads every line of the file into memory at once, which likely exceeds PHP's memory_limit for files that large. To avoid this, read the file line by line instead:

 

$fopen = fopen('list.txt', 'rb');

while ($fopen && ($line = fgets($fopen)) !== false) {
    // do something with $line
}

// all lines processed
if ($fopen) {
    fclose($fopen);
}
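Applying that line-by-line approach to the original problem, here is a minimal sketch of a streaming splitter. The function name `splitFile` is illustrative; the output filename pattern follows the original script, and memory use stays constant regardless of input size because only one line is held at a time:

```php
<?php
// Split $input into chunk files of at most $linesPerChunk lines each,
// streaming one line at a time instead of loading the whole file.
// Returns the number of chunk files written.
function splitFile(string $input, int $linesPerChunk): int
{
    $in = fopen($input, 'rb');
    if ($in === false) {
        return 0;
    }

    $chunk = 0;     // chunk files written so far
    $count = 0;     // lines written to the current chunk
    $out   = null;  // handle for the current chunk file

    while (($line = fgets($in)) !== false) {
        if ($out === null) {
            $chunk++;
            $out = fopen('WordList [' . $chunk . '].txt', 'wb');
        }
        fwrite($out, $line);
        if (++$count === $linesPerChunk) {
            fclose($out);   // chunk full, start a new file on the next line
            $out   = null;
            $count = 0;
        }
    }

    if ($out !== null) {
        fclose($out);       // close a final, partially filled chunk
    }
    fclose($in);
    return $chunk;
}
```

For the 800 MB wordlist above, something like `splitFile('list.txt', 8000000);` would reproduce the original script's 8-million-line chunks without ever holding the file in memory.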

