
Dealing with large text file


bljepp69


I have a large text file with roughly 400,000 lines.  The file is tab delimited, so each line is easy to parse.  I'm grabbing certain information from each line and putting it into a MySQL database.  I'm running into a big issue: the script starts working and then simply times out after processing somewhere between 1,000 and 2,000 lines.  I suspect this is a max_execution_time issue, but I don't have access to change that setting, and set_time_limit() doesn't seem to override max_execution_time.
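For reference, one thing worth trying before splitting the file: calling set_time_limit() *inside* the loop restarts the execution clock on every iteration, rather than once at the top of the script. This is only a sketch (the file name and field handling are placeholders, and the call has no effect if the host disables it or runs PHP in safe mode):

```php
<?php
// Sketch: reset the timer on each iteration so the script isn't killed
// mid-run. set_time_limit(30) restarts a fresh 30-second clock each time
// it is called; it silently does nothing if the host has disabled it.
$handle = fopen('data.txt', 'r');   // 'data.txt' is a hypothetical input file
while (($line = fgets($handle)) !== false) {
    set_time_limit(30);             // restart the clock for this line
    $fields = explode("\t", rtrim($line, "\r\n"));
    // ... insert $fields into the MySQL database here ...
}
fclose($handle);
```

If the host ignores set_time_limit() entirely, this approach won't help and chunking (below in the post) is the fallback.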

My question is, what is the best way to deal with this large file?  The best solution I've come up with is to break the file into smaller chunks and then write the script to process many files.  Ideally, I would use a cron job to launch the script every few minutes and just come back later when it's, hopefully, done.
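A variation on that idea that avoids physically splitting the file: have each cron run process a fixed batch of lines and record the byte offset where it stopped, so the next run resumes from there. This is a minimal sketch; the file names ('data.txt', 'offset.txt') and the 5,000-line batch size are illustrative assumptions:

```php
<?php
// Resumable batch processing, assuming a cron job runs this script
// every few minutes. The byte offset of the last processed line is
// saved to 'offset.txt' so the next run picks up where this one stopped.
$offset = file_exists('offset.txt') ? (int) file_get_contents('offset.txt') : 0;
$handle = fopen('data.txt', 'r');
fseek($handle, $offset);            // jump past everything already processed

$processed = 0;
while ($processed < 5000 && ($line = fgets($handle)) !== false) {
    $fields = explode("\t", rtrim($line, "\r\n"));
    // ... insert $fields into the MySQL database here ...
    $processed++;
}

file_put_contents('offset.txt', ftell($handle)); // remember where we stopped
fclose($handle);
```

Each run then does a bounded amount of work well under the execution limit, and the whole file is eventually consumed without ever splitting it on disk.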

How do people normally deal with data files like this?

Thanks for the discussion.
https://forums.phpfreaks.com/topic/25837-dealing-with-large-text-file/

