chilly98 Posted June 10, 2009

Hi all, I have a CSV file full of data that I want to import periodically into my database. As I understand it, I can do this with this MySQL command:

```sql
LOAD DATA LOCAL INFILE '/my_file.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
```

I could also create a PHP script to do the same thing. Something like this:

```php
<?php
// Expects the CSV file to be in the same directory as this script.
$fcontents = file('./myfile.csv');
for ($i = 0; $i < sizeof($fcontents); $i++) {
    $line = trim($fcontents[$i]);
    // If your data is comma-separated instead of tab-separated,
    // change the "\t" below to ",".
    $arr = explode("\t", $line);
    $sql = "INSERT INTO some_table VALUES ('" . implode("','", $arr) . "')";
    mysql_query($sql);
    echo $sql . "<br>\n";
    if (mysql_error()) {
        echo mysql_error() . "<br>\n";
    }
}
?>
```

However, I intend to modify the data in the CSV file from time to time. What I'd like to do periodically is re-import the CSV file and update the database records without creating duplicate entries. What would you recommend as a good way to do this? Many thanks for your help!
fenway Posted June 11, 2009

Update how?
chilly98 Posted June 11, 2009 (Author)

Here's the exact structure of one of my database tables:

```sql
CREATE TABLE IF NOT EXISTS `people` (
  `id` int(5) NOT NULL auto_increment,
  `lastname` varchar(50) collate latin1_general_ci default NULL,
  `firstname` varchar(50) collate latin1_general_ci default NULL,
  `salary` double unsigned default NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 COLLATE=latin1_general_ci AUTO_INCREMENT=78;
```

I have an Excel spreadsheet, maintained by someone else, containing individual records of people's first names, last names, and salaries. For the purposes of this question, let's say there are 100 records in the spreadsheet. First I save the Excel file as a CSV file, then I use the LOAD DATA LOCAL INFILE command to import the 100 records into my database.

Now say that a month down the road the salary for one of those people changes and another person is added (record 101). My colleague makes the changes in the Excel spreadsheet. After this spreadsheet is once again saved as a CSV file, how can I re-import the data so that the information in my database is accurate? What I'd like to see in the database is the modified salary as well as the 101st person. I'm sorry if this is confusing. I'm new to MySQL but would certainly appreciate some help. Thanks again!
fenway Posted June 15, 2009

I think LOAD DATA INFILE has an equivalent of "ON DUPLICATE KEY UPDATE".
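For a plain INSERT, the statement fenway names looks like this. A minimal sketch against the `people` table above; the row values here are hypothetical, and it assumes the CSV row carries the `id` so MySQL has a unique key to detect the duplicate on:

```sql
-- Hypothetical CSV row: person 42's salary changed.
INSERT INTO people (id, lastname, firstname, salary)
VALUES (42, 'Smith', 'Jane', 55000)
ON DUPLICATE KEY UPDATE
  lastname  = VALUES(lastname),
  firstname = VALUES(firstname),
  salary    = VALUES(salary);
```

If a row with `id` 42 already exists, it is updated in place; otherwise a new row is inserted.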
prophecym Posted October 8, 2012 (edited)

I have the same question. Can anyone please elaborate on this?

"I think LOAD DATA INFILE has an equivalent of 'ON DUPLICATE KEY UPDATE'."

While importing a .csv file, how do you check for duplicate rows (or even cells) and update them if necessary?

Edited October 8, 2012 by prophecym
Barand Posted October 11, 2012

From the MySQL manual, LOAD DATA:

The REPLACE and IGNORE keywords control handling of input rows that duplicate existing rows on unique key values:

- If you specify REPLACE, input rows replace existing rows. In other words, rows that have the same value for a primary key or unique index as an existing row. See Section 13.2.8, "REPLACE Syntax".
- If you specify IGNORE, input rows that duplicate an existing row on a unique key value are skipped.
- If you do not specify either option, the behavior depends on whether the LOCAL keyword is specified. Without LOCAL, an error occurs when a duplicate key value is found, and the rest of the text file is ignored. With LOCAL, the default behavior is the same as if IGNORE is specified; this is because the server has no way to stop transmission of the file in the middle of the operation.
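Applied to the `people` table from earlier in the thread, the REPLACE option would look something like this. A sketch, assuming the exported CSV includes the `id` column as its first field; without a matching primary or unique key value in the file, REPLACE has nothing to match on and every row is simply inserted as new:

```sql
LOAD DATA LOCAL INFILE '/my_file.csv'
REPLACE INTO TABLE people
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, lastname, firstname, salary);
```

Note that REPLACE deletes the existing row and inserts the incoming one, rather than updating it in place, which is fine for a full re-import like the one described above.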