
esiason14

csv import question


I'm trying to update an existing table with CSV data. I want to dump it into the playerstats table, which is linked to the players table by player_id. So, before I can import, I first need to find the player_id that matches the first and last name of each record in the file. Make sense? Anyone have any suggestions on how to go about this? It would be much appreciated.

Here's a sample record that I want to import:

[code]Lance Niekro,SF,9,1,2,0,0,0,2,2,1,0,0,0.222,0.222,0.364,0.586[/code]

Here is my playerstats table:

[code]CREATE TABLE `playerstats` (
  `stat_id` int(10) unsigned NOT NULL auto_increment,
  `player_id` smallint(5) unsigned default '0',
  `year` smallint(5) unsigned default NULL,
  `mlbteam_id` char(3) binary default NULL,
  `ab` int(11) default '0',
  `runs` int(11) default '0',
  `hits` int(11) default '0',
  `doubles` int(11) default '0',
  `triples` int(11) default '0',
  `hr` int(11) default '0',
  `rbi` int(11) default '0',
  `walks` int(11) default '0',
  `so` int(11) default '0',
  `sb` int(11) default '0',
  `cs` int(11) default '0',
  `avg` decimal(4,3) default '0.000',
  `slg` decimal(4,3) default '0.000',
  `obp` decimal(4,3) default '0.000',
  `ops` decimal(4,3) default '0.000',
  `wins` int(3) default '0',
  `losses` int(3) default '0',
  `ip` decimal(4,2) default '0.00',
  `ha` int(4) default '0',
  `bb` int(4) default '0',
  `ko` int(4) default '0',
  `era` decimal(3,2) default '0.00',
  `k9` decimal(3,2) default '0.00',
  `sobb` decimal(3,2) default '0.00',
  `saves` int(4) default '0',
  `er` int(4) default '0',
  `gs` int(4) default '0',
  `cg` int(4) default '0',
  `whip` decimal(3,2) default '0.00',
  PRIMARY KEY  (`stat_id`)
) TYPE=MyISAM PACK_KEYS=0 AUTO_INCREMENT=2962;

[/code]

PLAYERS Table
[code]CREATE TABLE `players` (
  `player_id` smallint(5) unsigned NOT NULL auto_increment,
  `lname` varchar(50) default NULL,
  `fname` varchar(50) default NULL,
  `picture_loc` varchar(25) default NULL,
  `mlbteam_id` char(3) binary default NULL,
  `league_abbv` char(2) binary default NULL,
  `position_id` char(3) binary default NULL,
  `height` varchar(6) NOT NULL default '',
  `weight` char(3) NOT NULL default '',
  `birth` varchar(10) default NULL,
  `college` varchar(50) NOT NULL default '',
  `jersey` smallint(2) default NULL,
  PRIMARY KEY  (`player_id`)
) TYPE=MyISAM AUTO_INCREMENT=1350;[/code]
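Since the file stores the name as a single field ("Lance Niekro") while players splits it into fname and lname, the lookup has to compare against both columns together. A sketch for one parsed record, using the sample line above (the year isn't in the file, so 2005 here is only a placeholder):

[code]-- Sketch only: resolve player_id by name while inserting one CSV record.
INSERT INTO playerstats
  (player_id, year, mlbteam_id, ab, runs, hits, doubles, triples, hr,
   rbi, walks, so, sb, cs, avg, slg, obp, ops)
SELECT player_id, 2005, 'SF', 9, 1, 2, 0, 0, 0,
       2, 2, 1, 0, 0, 0.222, 0.222, 0.364, 0.586
FROM players
WHERE CONCAT(fname, ' ', lname) = 'Lance Niekro';[/code]

This assumes first name + last name is unique in players; if two players share a name, you'd need an extra condition (team, for instance) to pick the right row.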

Well, the easy way is to use file() and explode() to parse the csv file, and then for each row send an update query to the database.

If the CSV is a big one, you may want to do it in fewer queries. You can do that by creating and loading a temp table (one query), and updating the real table by joining with the temp table (one query). I wrote myself a function called massupdate() to do this automatically, but it's part of a package and doesn't stand alone very well, so I'm afraid I can't easily post it for you. You may want to try your hand at building one though, it comes in handy surprisingly often.
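In outline, the temp-table version of that idea might look like this (csv_stats is a made-up staging name, and the SET list is trimmed for brevity; the join matches the file's single name field against fname and lname combined):

[code]-- Hypothetical staging table mirroring the CSV layout.
CREATE TEMPORARY TABLE csv_stats (
  name varchar(101), mlbteam_id char(3),
  ab int, runs int, hits int, doubles int, triples int, hr int,
  rbi int, walks int, so int, sb int, cs int,
  avg decimal(4,3), slg decimal(4,3), obp decimal(4,3), ops decimal(4,3)
);

-- ...load the file into csv_stats here (one query)...

-- Then update the real table in a single multi-table UPDATE:
UPDATE playerstats ps, players p, csv_stats c
SET ps.ab = c.ab, ps.runs = c.runs, ps.hits = c.hits
    -- (remaining stat columns omitted)
WHERE ps.player_id = p.player_id
  AND CONCAT(p.fname, ' ', p.lname) = c.name;[/code]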

LOAD DATA LOCAL INFILE with appropriate field/line terminators is also an option.
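For reference, the terminator syntax looks something like this. Note that LOAD DATA can't do the name-to-id lookup by itself, so you'd point it at a staging table (call it csv_stats, with columns mirroring the file) rather than at playerstats directly; the file name is illustrative:

[code]LOAD DATA LOCAL INFILE 'stats.csv'
INTO TABLE csv_stats
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';[/code]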

