
Bulk insert



#1 alpine

  • Members
  • Advanced Member
  • 756 posts
  • Location: Norway

Posted 08 September 2006 - 07:13 PM

I need to insert approx. 400,000 rows once a month, and this is done by parsing 4 text files. Once this is done, I dump the old tables and rename my new ones to take their place.

However - my ISP has an insert/update limit of 36,000 rows per hour - so it's a real time sink  :o

Then I was tipped off about the multi-row bulk insert syntax from the MySQL manual:

INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

Now, I haven't started rebuilding my update/insert script for this yet, as I'm not sure how many groups of values I can attach in one query... any opinions on this? I can hardly imagine that I can insert 100,000 rows in one query like this... or can I???
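For reference, a rough sketch of how that syntax could fit the monthly load-and-rename routine described above - the table names import / import_new and the columns (a,b,c) are made up, and a real statement would carry far more value groups:

CREATE TABLE import_new LIKE import;   -- empty staging table with the same structure

INSERT INTO import_new (a,b,c) VALUES
(1,2,3),
(4,5,6),
(7,8,9);   -- ...and so on, many rows per statement

-- once all files are parsed, swap the tables in one atomic step
RENAME TABLE import TO import_old, import_new TO import;
DROP TABLE import_old;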


#2 fenway

  • Staff Alumni
  • MySQL Si-Fu / PHP Resident Alien
  • 16,199 posts
  • Location: Toronto, ON

Posted 08 September 2006 - 07:38 PM

It'll come down to your server's max_allowed_packet setting, which usually defaults to ~1MB, so you'd be surprised how much text that is.  You can, of course, determine this dynamically; and with bulk inserts you should probably be using INSERT IGNORE, if appropriate, and potentially you should disable the keys and/or lock the table, if possible.
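For illustration, a minimal sketch of those suggestions - the table name import_new and columns (a,b,c) are hypothetical, so adapt them to your own schema:

SHOW VARIABLES LIKE 'max_allowed_packet';   -- how many bytes a single statement may be

-- skip rows that collide with an existing unique key instead of aborting the whole batch
INSERT IGNORE INTO import_new (a,b,c) VALUES (1,2,3),(4,5,6),(7,8,9);

-- optionally hold a write lock for the duration of the load
LOCK TABLES import_new WRITE;
-- ... run the bulk INSERTs here ...
UNLOCK TABLES;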

#3 alpine

  • Members
  • Advanced Member
  • 756 posts
  • Location: Norway

Posted 09 September 2006 - 09:19 AM

Thanks.
Locking tables isn't necessary, since this is a new table that will later be renamed to replace another one, so no queries will be run against it during the insert.
I'll have to look up what INSERT IGNORE means.
But why should I disable keys??

#4 fenway

  • Staff Alumni
  • MySQL Si-Fu / PHP Resident Alien
  • 16,199 posts
  • Location: Toronto, ON

Posted 09 September 2006 - 03:54 PM

You don't have to have the DB update the indexes all the time for no reason... it'll just slow things down.
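As a rough sketch (MyISAM tables; the table name is made up), disabling the non-unique indexes around the load and rebuilding them once at the end might look like:

ALTER TABLE import_new DISABLE KEYS;
-- ... run all the bulk INSERTs ...
ALTER TABLE import_new ENABLE KEYS;   -- the indexes are rebuilt once, here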

#5 alpine

  • Members
  • Advanced Member
  • 756 posts
  • Location: Norway

Posted 09 September 2006 - 08:16 PM

Ah, I see.

Thanks.

I just rebuilt my insert parser and now it inserts 10,000 new rows in one query without any problems (I haven't found the max_allowed_packet setting) - I also made a meta-refresh link at 5-second intervals to keep parsing the text file until all the rows are inserted.





