I am developing a database app for a client who needs to import hundreds of thousands of codes into the DB to check against. The codes are in four text files of about 30MB each.

 

The codes are three per line, then a line break, then three more. I've written a script to parse out the line breaks, turn the data into an array, then loop over that array and insert into the DB. The problem is that these scripts take minutes to run using file_get_contents, and by the time the data is ready the MySQL connection has timed out. Even then, the scripts only work after I've cut the files into pieces of about 1MB each, so each file becomes roughly 30 smaller ones.
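
For reference, the usual fix is to stream the file with fgets() instead of slurping it all at once with file_get_contents(), inserting in batches as you go so memory stays flat and the connection never sits idle. A minimal sketch, assuming a `codes` table with a single `code` column and whitespace-separated codes on each line (the table, column, and file names are placeholders):

<?php
// Stream the file line by line and insert in batches, assuming a
// `codes` table with one `code` column (placeholder names).
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

function insertBatch(PDO $pdo, array $codes)
{
    // One multi-row INSERT per batch instead of one query per code.
    $placeholders = implode(',', array_fill(0, count($codes), '(?)'));
    $stmt = $pdo->prepare("INSERT INTO codes (code) VALUES $placeholders");
    $stmt->execute($codes);
}

$handle = fopen('codes.txt', 'r');
$batch = array();

// fgets() reads one line at a time, so memory use stays flat no
// matter how large the file is -- no need to cut it into 1MB pieces.
while (($line = fgets($handle)) !== false) {
    $line = trim($line);
    if ($line === '') {
        continue; // skip the blank separator lines
    }
    // Assuming the three codes per line are whitespace-separated.
    foreach (preg_split('/\s+/', $line) as $code) {
        $batch[] = $code;
    }
    if (count($batch) >= 1000) {
        insertBatch($pdo, $batch);
        $batch = array();
    }
}
if ($batch) {
    insertBatch($pdo, $batch);
}
fclose($handle);

If the file format is regular enough, MySQL's LOAD DATA INFILE statement can also load an entire text file server-side in one go, which is typically faster still than any row-by-row insert loop.
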

 

Is there a way to just put the text file on the server and have PHP search it with a grep-like function that won't be such a burden to work with? Any advice helps.
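
For the grep-like route, the file can be scanned in PHP without ever loading it whole. A minimal sketch, where 'codes.txt' and the sample code value are placeholders:

<?php
// Grep-style lookup that streams the file line by line instead of
// loading all 30MB into memory at once.
function codeExists($file, $needle)
{
    $handle = fopen($file, 'r');
    if ($handle === false) {
        return false;
    }
    while (($line = fgets($handle)) !== false) {
        // strpos() is enough for a plain substring match.
        if (strpos($line, $needle) !== false) {
            fclose($handle);
            return true;
        }
    }
    fclose($handle);
    return false;
}

var_dump(codeExists('codes.txt', 'ABC123')); // hypothetical code value

Note that a sequential scan reads up to 30MB per lookup, so for repeated checks an indexed database column will be far faster than grepping the raw file each time.
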

 

James

 
