enveetee Posted February 15, 2015

Hi

I have a desktop system which uses plain text files for its storage, and I need to do a one-off import of these files (tables) into a SQL table.

The largest table has just under 2 million records, each 850 bytes in length.

I have written a PHP script and set set_time_limit to zero. The server ran for about 8 hours but did not import the entire table.

Maybe I should be uploading a CSV? What is the best way to go about this - best practice/suggestions please.
mac_gyver Posted February 15, 2015

the following methods for inserting bulk data, from most to least efficient -

1) use a LOAD DATA [LOCAL] INFILE query.

2) use a multi-value INSERT query, with as many rows as possible in one query. the length of each query is limited by the max_allowed_packet mysql setting, default 1M bytes, which can be changed, but the default should allow you to insert at least 1000 rows in each query. see the sketch below.

3) use a prepared query in a loop. this is only slightly more efficient than method #4, since the data for each query must still be communicated to the database server, and it's the hand-shaking and communication between php and the database server where most of the time is spent (queries and data are sent as character strings.)

4) use a non-prepared query in a loop. this is likely what you are doing now.

turning off indexes, auto-commits, and unique constraints on the tables will also speed up the process; see the following link -

http://dev.mysql.com/doc/refman/5.6/en/insert-speed.html
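for example, a minimal sketch of method #2 using pdo - the table name, columns, and record offsets below are made-up placeholders, not your actual schema:

<?php
// minimal sketch of method #2 (multi-value INSERT). the table
// legacy_table (id, name, value) and the substr() offsets are
// assumptions - substitute your real schema and record layout.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

function insertBatch(PDO $pdo, array $rows) {
    // build one "(?,?,?)" placeholder group per row, then bind all values flat
    $groups = implode(',', array_fill(0, count($rows), '(?,?,?)'));
    $stmt = $pdo->prepare("INSERT INTO legacy_table (id, name, value) VALUES $groups");
    $stmt->execute(call_user_func_array('array_merge', $rows));
}

$batchSize = 1000; // rows per query; keeps each query under the default max_allowed_packet
$rows = [];
$fh = fopen('/path/to/legacy_table.txt', 'rb');
while (($line = fgets($fh)) !== false) {
    // slice one fixed-length record into columns
    $rows[] = [substr($line, 0, 10), trim(substr($line, 10, 40)), trim(substr($line, 50, 20))];
    if (count($rows) === $batchSize) {
        insertBatch($pdo, $rows);
        $rows = [];
    }
}
if ($rows) {
    insertBatch($pdo, $rows); // flush the final partial batch
}
fclose($fh);

at 1000 rows per query, a 2 million row import becomes roughly 2000 queries instead of 2 million round trips to the server.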
enveetee Posted February 16, 2015 (Author)

I am trying to upload a large dataset in CSV format for batch loading into a MySQL table using LOAD DATA INFILE.

The CSV file (test.csv) is in /var/services/web/php and my table is in /var/services/mysql/TABLENAME.

When I run the SQL command from a PHP script I get an error:

ERROR: Can't get stat of '/var/services/mysql/TABLENAME/test.csv' (Errcode: 2)

which I believe means the file needs to be in /var/services/mysql/TABLENAME. Indeed, if I move it there, it works fine.

If I try to use the LOCAL switch (LOAD DATA LOCAL INFILE) I get an error stating that the feature is not available. Research shows it can open a security hole and needs to be enabled separately, something I am not prepared to do.

My script is in /var/services/web/php. I can FTP the CSV file to the server, but I cannot copy it to my table's directory from my PHP script, since /var/services/mysql/TABLENAME is outside of /var/services/web, and LOAD DATA INFILE cannot see the FTP folder.

So, short of manually copying the FTP'd file to my table directory, how could I copy it from my PHP script? This also leads to 'how do I access files outside my web root?'

Thanks
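One note on the error itself: without LOCAL, MySQL resolves a relative filename against the current database's directory, which is exactly the path shown in the error message. Supplying an absolute path in the query may avoid the problem entirely, provided the mysqld process has read permission on the web directory (an assumption worth testing) - a minimal sketch:

<?php
// minimal sketch: pass LOAD DATA INFILE an absolute path so the server
// does not resolve the filename against the database's data directory.
// assumes mysqld can read /var/services/web/php, and that your_table
// (a placeholder name) matches the CSV's column layout.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$rows = $pdo->exec(
    "LOAD DATA INFILE '/var/services/web/php/test.csv'
     INTO TABLE your_table
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'"
);
echo $rows . " rows loaded\n";

If the server still refuses, check the secure_file_priv setting, which can restrict LOAD DATA INFILE to a specific directory.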