
Reading a ridiculously large file (~2 600 000 lines)


Goldeneye


My objective is to convert a flat file into a MySQL import file.

 

My problem occurs while trying to read the entire contents of a file via the fgets function. I get the usual: "Fatal error: Out of memory (allocated 387973120) (tried to allocate 129212411 bytes)." I tried setting "memory_limit" and "max_execution_time" to unlimited, but it still gave me the error, which leads me to believe the problem I'm encountering is hardware-related.

 

Is there any way to break this file up into sections (using PHP), or even a sort of... "pagination" of it? I could split the file manually, but with ~2,600,000 lines that would take an extremely long time.
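For context, splitting a file into sections can itself be done in PHP by streaming lines into numbered chunk files, so memory use stays constant regardless of file size. This is only a sketch; the function name, chunk size, and file naming scheme are assumptions, not anything from the original post:

```php
<?php
// Sketch (assumed helper, not the poster's code): split a large file into
// numbered chunk files of $linesPerChunk lines each, reading one line at a
// time so memory use stays flat.
function splitFile(string $path, string $outDir, int $linesPerChunk = 100000): int
{
    $in = fopen($path, 'r');
    $chunk = 0;   // number of chunk files opened so far
    $count = 0;   // total lines written
    $out = null;

    while (($line = fgets($in)) !== false) {
        // Start a new chunk file every $linesPerChunk lines.
        if ($count % $linesPerChunk === 0) {
            if ($out) fclose($out);
            $out = fopen(sprintf('%s/chunk_%04d.txt', $outDir, ++$chunk), 'w');
        }
        fwrite($out, $line);
        $count++;
    }
    if ($out) fclose($out);
    fclose($in);
    return $chunk; // number of chunk files written
}
```

Each chunk could then be processed independently, which is effectively the "pagination" described above.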

 

<?php
ini_set('memory_limit', -1);
set_time_limit(0);
$handle = fopen('imports/flatfile.txt', 'r');
echo 'INSERT INTO `districtbase` (`districtkey`, `name`, `countrykey`, `regionkey`) VALUES ';
while($line = fgets($handle, filesize('imports/flatfile.txt'))){
	$cells = explode(',', $line);
	echo '("'.$cells[1].'", "'.$cells[2].'", "'.$cells[0].'", "'.$cells[3].'"),<br/>';
}
?>
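For what it's worth, a memory-flat variant of the loop above would call fgets() with no length argument (one line per call) and write batched INSERT statements to an output file instead of echoing one giant statement. The sketch below keeps the same column mapping as the snippet; the function names, output path, and batch size are assumptions:

```php
<?php
// Sketch (assumed structure, not the poster's final code): stream the flat
// file line by line and emit INSERTs in batches of $batchSize rows.
function convertFlatFile(string $inPath, string $outPath, int $batchSize = 1000): void
{
    $in  = fopen($inPath, 'r');
    $out = fopen($outPath, 'w');
    $batch = [];

    // fgets() with no length argument reads exactly one line per call,
    // so memory use stays flat no matter how large the file is.
    while (($line = fgets($in)) !== false) {
        $cells = explode(',', rtrim($line, "\r\n"));
        if (count($cells) < 4) {
            continue; // skip malformed rows
        }
        // Same column mapping as the original snippet:
        // districtkey <- cells[1], name <- cells[2],
        // countrykey  <- cells[0], regionkey <- cells[3]
        $batch[] = sprintf('("%s", "%s", "%s", "%s")',
            addslashes($cells[1]), addslashes($cells[2]),
            addslashes($cells[0]), addslashes($cells[3]));

        if (count($batch) === $batchSize) {
            flushBatch($out, $batch);
            $batch = [];
        }
    }
    if ($batch !== []) {
        flushBatch($out, $batch); // final partial batch
    }
    fclose($in);
    fclose($out);
}

// Assumed helper: write one multi-row INSERT statement for the batch.
function flushBatch($out, array $batch): void
{
    fwrite($out, 'INSERT INTO `districtbase` (`districtkey`, `name`, `countrykey`, `regionkey`) VALUES '
        . implode(",\n", $batch) . ";\n");
}
```

Batching also keeps each statement comfortably under MySQL's max_allowed_packet when the dump is later loaded.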

 

SAMPLE-DATA:
CountryKey,CityKey,CityName,Region,Latitude,Longitude
ad,aixas,Aixàs,06,,42.4833333,1.4666667

Archived

This topic is now archived and is closed to further replies.

