Dealing with large text file


bljepp69

I have a large text file with approx. 400,000 lines.  The file is tab-delimited and each line is easy to parse.  I'm pulling certain fields out of each line and inserting them into a MySQL database.  The problem is that the script starts working and then times out after processing somewhere between 1,000 and 2,000 lines.  I suspect the culprit is max_execution_time, but I don't have access to change that setting, and set_time_limit() doesn't seem to override max_execution_time.
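For reference, here is a minimal sketch of the kind of parse-and-insert loop I mean (the file name, connection details, table, and column layout are all placeholders, not my actual schema).  Streaming with fgets() keeps memory flat even at 400,000 lines, and a prepared statement avoids re-parsing the SQL on every row:

<?php
// Minimal sketch: stream a tab-delimited file line by line and insert
// each row into MySQL. Connection details, table name, and column
// positions are placeholders -- adjust to the real schema.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO records (col_a, col_b) VALUES (?, ?)');

$fh = fopen('data.txt', 'r');
while (($line = fgets($fh)) !== false) {
    $fields = explode("\t", rtrim($line, "\r\n"));
    $stmt->execute(array($fields[0], $fields[1])); // assumed column order
}
fclose($fh);
?>

Even written this way, though, the script dies partway through because of the time limit.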

My question is, what is the best way to deal with a file this large?  The best solution I've come up with is to break the work into smaller chunks and have the script process one chunk per run.  Ideally, I would use a cron job to launch the script every few minutes (see the sketch below) and just come back later when it's, hopefully, done.
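Here is a rough sketch of that chunked approach, assuming the script can keep a small state file next to the data (the file names, batch size, and schema are again made up for illustration).  Each cron run resumes from the byte offset the previous run recorded, processes a fixed batch of lines, and saves the new offset, so no single invocation comes anywhere near the execution limit:

<?php
// Resumable importer: each run handles one batch of lines, then records
// its byte offset so the next cron invocation picks up where it left off.
$dataFile  = 'data.txt';       // input file (placeholder name)
$stateFile = 'import.offset';  // where the current byte offset is kept
$batchSize = 1000;             // lines per run; tune to stay under the limit

$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO records (col_a, col_b) VALUES (?, ?)');

// Resume from the last recorded offset, or start at the beginning.
$offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

$fh = fopen($dataFile, 'r');
fseek($fh, $offset);

$processed = 0;
while ($processed < $batchSize && ($line = fgets($fh)) !== false) {
    $fields = explode("\t", rtrim($line, "\r\n"));
    $stmt->execute(array($fields[0], $fields[1])); // assumed column order
    $processed++;
}

// Remember how far we got; the next run seeks straight to this point.
file_put_contents($stateFile, (string) ftell($fh));

$done = feof($fh);
fclose($fh);
echo $done ? "Import finished.\n" : "Processed $processed lines this run.\n";
?>

Seeking to a saved byte offset rather than re-reading from the top keeps each run proportional to the batch size, not to how far into the file we are, which matters once you're hundreds of thousands of lines in.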

How do people normally deal with data files like this?

Thanks for the discussion.