jwwceo Posted September 29, 2010

I am developing a database app for a client who needs to import hundreds of thousands of codes into the DB to check against. The codes are in 4 text files of about 30MB each, laid out 3 codes per line, then a line break, then 3 more. I've written a script to parse out the line breaks, turn the data into an array, then loop over that array and insert into the DB. The problem is these scripts take minutes to run using file_get_contents, and by the time the data is ready the MySQL connection is gone. Even then it only works after I've cut each file into pieces of about 1MB, so each file becomes roughly 30 smaller ones. Is there a way to just put the text file on the server and have PHP search it with a grep-like function that won't be such a burden to work with? Any advice helps. James
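One way to avoid both problems is to stream the file with fgets() instead of slurping it with file_get_contents(), inserting as you go so the connection never sits idle. A rough, untested sketch — the table name, column name, credentials, and the assumption that the three codes on a line are whitespace-separated are all mine, not from the original post:

```php
<?php
// Stream a 30MB code file line by line instead of loading it all at once.
// Assumes a "codes" table with a single "code" column -- adjust to taste.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

$handle = fopen('codes1.txt', 'r');
if ($handle === false) {
    die('Could not open file');
}

$stmt = $mysqli->prepare('INSERT INTO codes (code) VALUES (?)');
$stmt->bind_param('s', $code);

while (($line = fgets($handle)) !== false) {
    $line = trim($line);
    if ($line === '') {
        continue; // skip the blank separator lines
    }
    // 3 codes per line, assumed whitespace-separated
    foreach (preg_split('/\s+/', $line) as $code) {
        $stmt->execute(); // bind_param binds $code by reference
    }
}

fclose($handle);
$stmt->close();
$mysqli->close();
```

Because fgets() only ever holds one line in memory, there's no need to split the files into 1MB chunks first.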
jwwceo Posted September 29, 2010 (Author)

UPDATE: there are about 18 million codes, and they are all 4 characters each, if that helps convey the scale of the problem.
petroz Posted September 29, 2010

If you can get it into a CSV file, you can import it straight into MySQL without PHP at all. Take a look at http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html
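For example, after reshaping the files into one code per line, something like this should do it — file names, database name, and credentials below are placeholders:

```shell
# Turn "3 codes per line plus blank lines" into one code per line.
# tr -s squeezes every run of whitespace (spaces, newlines) into a single newline.
tr -s '[:space:]' '\n' < codes1.txt > codes.csv

# mysqlimport loads the file into a table named after the file (here: "codes")
# in database "mydb". --local reads the file from the client machine.
mysqlimport --local --user=myuser --password mydb codes.csv
```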
mikosiko Posted September 29, 2010

LOAD DATA INFILE would be my choice: http://dev.mysql.com/doc/refman/5.1/en/load-data.html Loading into a table without indexes improves performance even further.
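A minimal sketch of that approach, assuming the file has been reshaped to one 4-character code per line — the table, column, and file names here are illustrative, not from the original post:

```sql
-- Create the table with no indexes so the bulk load is fast.
CREATE TABLE codes (code CHAR(4));

-- Bulk-load the file; LOCAL reads it from the client machine.
LOAD DATA LOCAL INFILE '/path/to/codes.csv'
INTO TABLE codes
LINES TERMINATED BY '\n'
(code);

-- Add the index after the load, per the advice above.
ALTER TABLE codes ADD INDEX idx_code (code);
```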
Kryptix Posted September 29, 2010

Why not run PHP from the command prompt and open a new connection for each file you import, rather than just one for the whole job?
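Run from the CLI, that could look roughly like this (untested; the file pattern and credentials are placeholders):

```php
<?php
// Invoke as: php import.php
// A fresh connection per file means a slow parse can't outlive the link,
// and CLI scripts aren't subject to the web server's execution time limit.
foreach (glob('codes*.txt') as $file) {
    $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
    // ... parse $file and insert its codes here ...
    $mysqli->close();
}
```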
roopurt18 Posted September 29, 2010

petroz and mikosiko are proposing the correct solutions here.