bljepp69 Posted November 1, 2006

I have a large text file with roughly 400,000 lines. The file is tab-delimited, so parsing each line is easy. I'm grabbing certain fields from each line and inserting them into a MySQL database. The problem is that the script starts working and then times out after processing somewhere between 1,000 and 2,000 lines. I suspect this is a max_execution_time issue, but I don't have access to change that setting, and set_time_limit() doesn't seem to override max_execution_time.

My question is: what is the best way to deal with a file this large? The best solution I've come up with is to break the file into smaller chunks and then write the script to work through the resulting files. Ideally, I would use a cron job to launch the script every few minutes and just come back later when it's, hopefully, done.

How do people normally deal with data files like this?

Thanks for the discussion.
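One variation on the chunking idea that avoids physically splitting the file is to persist a byte offset between runs: each cron invocation resumes where the last one stopped, processes a bounded batch of lines, saves its position, and exits well before the time limit. Here's a minimal sketch of that approach; the filenames (data.txt, offset.txt), the batch size, and the insertRow() helper are all placeholders, and the actual database insert is left as a comment:

```php
<?php
// Resumable batch importer (sketch). Assumes the input is data.txt and
// the byte offset reached so far is persisted in offset.txt -- both
// hypothetical names. Run this from cron every few minutes; no single
// run needs to outlive max_execution_time.

$dataFile  = 'data.txt';   // hypothetical tab-delimited input file
$stateFile = 'offset.txt'; // stores the byte offset reached so far
$batchSize = 1000;         // lines per run; tune to your time limit

// Resume from the last saved byte offset, or start at the beginning.
$offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

$fh = fopen($dataFile, 'r');
if ($fh === false) {
    die("Could not open $dataFile\n");
}
fseek($fh, $offset);

$processed = 0;
while ($processed < $batchSize && ($line = fgets($fh)) !== false) {
    // Split the tab-delimited line into fields.
    $fields = explode("\t", rtrim($line, "\r\n"));

    // Insert the columns you need into MySQL here.
    // insertRow($fields); // hypothetical helper

    $processed++;
}

// Persist how far we got so the next cron run picks up here.
file_put_contents($stateFile, ftell($fh));
fclose($fh);

echo "Processed $processed lines, stopped at byte " . ftell($fh) . "\n";
```

Once ftell() reaches the file's size, the import is complete and the cron entry can be removed. Batching the inserts (or wrapping each run in a transaction) would also cut down on per-row overhead, which is often a bigger factor than the parsing itself.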