
reading huge text file help :(


sasori


Hi, I have this text file that is 123 MB in size.

Then I ran my simple script to read it and render it in the browser:

$fh = fopen('worldcitiespop.txt', 'r'); // mode needs to be a quoted string
while (!feof($fh))
{
  $content = fgets($fh); // read one line at a time
  echo $content . "<br />";
}
fclose($fh);

I was expecting the whole output in the browser, but then it didn't show it all, because when I opened the text file in WordPad, it's really huge..

Can you tell me what's wrong with my script? (btw, there's no max execution time error at all)


I would have thought that fread() would be the way to go here, rather than fgets(). The only thing is that I'm unsure about the size of the file; I haven't tried it with one that big before. I purge data every 12 hours, or archive it, to avoid large file sizes.
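
Something along these lines is the rough idea with fread(): read the file in fixed-size chunks instead of line by line. Only a sketch, and the 8 KB chunk size is an arbitrary example, not something I've tested with a file that size:

$fh = fopen('worldcitiespop.txt', 'r');
if ($fh === false) {
  die('could not open the file');
}
while (!feof($fh)) {
  // read a fixed-size chunk instead of a single line
  echo fread($fh, 8192);
}
fclose($fh);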

 

Rw


What do you mean by "purge data every 12 hours or archive"?

Can you explain this further?

 

(btw, nice avatar hehe )

>> I was expecting the whole output in the browser, but then it didn't show it all, because when I opened the text file in WordPad, it's really huge..

 

What did it show? Quite likely you're using up all the memory. Try increasing it with:

 

ini_set('memory_limit', '150M');

 

Although I'll add: displaying 128 MB worth of data in a browser isn't the best idea.
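
For what it's worth, a rough sketch of that in context, with output flushed as you go. This assumes an output buffer (e.g. one started with ob_start()) is part of what's eating the memory:

ini_set('memory_limit', '150M'); // give PHP more headroom

$fh = fopen('worldcitiespop.txt', 'r');
while (!feof($fh)) {
  echo fgets($fh) . "<br />";
  // push output to the browser so it doesn't pile up in memory
  if (ob_get_level() > 0) {
    ob_flush();
  }
  flush();
}
fclose($fh);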


It rendered only about 1/4 of the data from the text file...

I'm currently running it with the ini_set you suggested.. let me see if that helps.

>> What do you mean by "purge data every 12 hours or archive"?

 

I pipe all of my MySQL errors into a text file outside of the server root. To avoid large file sizes (since data keeps being appended to the end of the file), I move the contents into a new file as an archive, usually keep that backup somewhere else on the server for posterity, and then start afresh once that process is complete. I can set the period to whatever I want, but every 12 hours seems OK. I only ever get errors if the server goes down, and then I just show a holding page saying "try again later"..
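
Roughly, the archive step looks something like this (the file names and paths are made up for illustration; this isn't my exact script):

// run from a scheduled task, e.g. every 12 hours
$logFile    = '/path/outside/webroot/mysql_errors.txt';
$archiveDir = '/path/outside/webroot/archive/';

if (file_exists($logFile) && filesize($logFile) > 0) {
  // move the current log into a timestamped archive copy
  rename($logFile, $archiveDir . 'mysql_errors_' . date('Ymd_His') . '.txt');
  // start afresh with an empty log file
  touch($logFile);
}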

 

And yes, 128 MB displayed in a browser isn't a good idea. What are you trying to achieve?

 

PS: Dilbert Rocks!!

 

Rw

Objectives:

1) To view the whole data of the text file and render it in the browser.

2) If step 1 is OK and I can see the complete file, extract one of the comma-delimited columns and insert it into a db table field (see the sketch at the end of this post).

 

By setting memory_limit higher with ini_set, the file's data was rendered about halfway, then the rest halted and I got this error:

Fatal error: Maximum execution time of 60 seconds exceeded in C:\xampp\htdocs\test3\test2.php on line 17
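
Since the end goal is just to get one comma-delimited column into a database table, a rough sketch of doing that directly, without rendering anything in the browser, would sidestep both the memory and the 60-second limits. Everything specific here is an assumption rather than something from the thread: PDO with made-up MySQL credentials, a cities table with a city_name column, and the wanted value being the second field on each line.

set_time_limit(0); // a 123 MB import can legitimately take more than 60 seconds

$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');
$stmt = $pdo->prepare('INSERT INTO cities (city_name) VALUES (?)');

$fh = fopen('worldcitiespop.txt', 'r');
if ($fh === false) {
  die('could not open the file');
}
while (($fields = fgetcsv($fh)) !== false) {
  if (isset($fields[1])) {              // assumed column position
    $stmt->execute(array($fields[1]));
  }
}
fclose($fh);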
