
max_allowed_packet and size of query


abazoskib


My max_allowed_packet is set to 32MB. If my queries consist of a format like:

 

INSERT INTO table SELECT ... FROM ... WHERE customer_id IN (213232, 23232, 2323, ...)

 

how long a list would I be able to manage successfully within the IN clause? I'm not looking for an exact answer, but it would be awesome if I could put 50,000 or more ID numbers in there.
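
For a rough sense of scale, here is a back-of-envelope sketch (sample values, not from the thread, assuming seven-digit IDs): each ID costs about 8 bytes in the list, so 50,000 of them come to roughly 400KB, a little over 1% of a 32MB packet.

<?php
  // Rough size check: how big is an IN (...) list of 50,000 seven-digit IDs
  // compared to a 32MB max_allowed_packet? (Sample values, not real data.)
  $ids    = range(1000000, 1049999);   // 50,000 seven-digit IDs
  $list   = implode(',', $ids);        // the text that lands inside IN (...)
  $packet = 32 * 1024 * 1024;          // max_allowed_packet = 32MB
  printf("IN list: %d bytes (%.2f%% of the packet)\n",
         strlen($list), 100 * strlen($list) / $packet);
  // Prints roughly 400,000 bytes, about 1.2% of the packet, so 50,000 IDs
  // fit with plenty of headroom.
?>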


OK, here it goes. There is data in a stage 1 table that needs to be cleaned/processed. Instead of performing inserts and updates on each single record, I save the ID numbers of each row into an array after they have been processed: one array for the good ones, one for the bad.

 

Now, at the end of the processing, I perform a bulk insert using:

 

INSERT INTO final_tbl SELECT xxx FROM first_tbl ... WHERE xx IN (id#, id#, id#, ...)

 

And the same for the deletion of the bad numbers. This cut my queries down from three per record to about four total. The speed gain was UNBELIEVABLE. If there is an even faster way to do this, I would love to hear it.
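
As a sketch of that workflow (the table and column names first_tbl, final_tbl, and id are placeholders here, and clean_record() is a hypothetical validation function), the processing loop collects IDs and the bulk queries run once at the end:

<?php
  // Sketch of the batch workflow described above; clean_record() is a
  // hypothetical per-record check, and table/column names are placeholders.
  $good = array();
  $bad  = array();
  $result = mysql_query("SELECT id, data FROM first_tbl");
  while ($row = mysql_fetch_assoc($result)) {
    if (clean_record($row)) {
      $good[] = (int)$row['id'];
    } else {
      $bad[] = (int)$row['id'];
    }
  }
  // Two bulk queries instead of three queries per record
  if ($good) {
    mysql_query("INSERT INTO final_tbl SELECT * FROM first_tbl WHERE id IN (" . implode(',', $good) . ")");
  }
  if ($bad) {
    mysql_query("DELETE FROM first_tbl WHERE id IN (" . implode(',', $bad) . ")");
  }
?>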


Hum...wow...that is a tough one. Personally I would still go with a per-record query; that way, if the script fails for some reason halfway through, you don't lose your place.

 

If it helps, you can always add a column and flag it with an UPDATE query as you go. Then, at the end, you can do:

INSERT INTO table SELECT ... FROM ... WHERE processed = 1
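
A minimal sketch of that flag-column approach (only the processed column comes from the post; the other names are hypothetical placeholders):

<?php
  // Flag each row as it is cleaned, then bulk-copy the flagged rows at the
  // end. A re-run after a crash resumes from whatever was already flagged.
  $result = mysql_query("SELECT id, data FROM first_tbl WHERE processed = 0");
  while ($row = mysql_fetch_assoc($result)) {
    if (clean_record($row)) {   // hypothetical per-record cleanup
      mysql_query("UPDATE first_tbl SET processed = 1 WHERE id = " . (int)$row['id']);
    }
  }
  // One bulk copy once everything is flagged
  mysql_query("INSERT INTO final_tbl SELECT * FROM first_tbl WHERE processed = 1");
?>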

 

Worst case, if you are worried about it getting truncated, just split the array up into 1,000-record chunks with array_chunk():

<?php
  // Run one bulk INSERT per 1,000 IDs so no single query approaches max_allowed_packet
  foreach (array_chunk($array, 1000) as $list) {
    $sql = "INSERT INTO table SELECT ... FROM ... WHERE customer_id IN (" . implode(',', $list) . ")";
    mysql_query($sql);
  }
?>


One step ahead of you: before the query goes through, I serialize the arrays of good and bad IDs and save them to a file. If the script fails, I simply load up the arrays and place them into the proper table.
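
A minimal sketch of that checkpoint step, assuming the $good and $bad arrays from earlier and hypothetical file names:

<?php
  // Persist the ID arrays before the bulk queries run (hypothetical file names)
  file_put_contents('good_ids.ser', serialize($good));
  file_put_contents('bad_ids.ser',  serialize($bad));

  // On recovery, load the arrays back and re-run the bulk queries
  $good = unserialize(file_get_contents('good_ids.ser'));
  $bad  = unserialize(file_get_contents('bad_ids.ser'));
?>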

 

Nice point with the array_chunk() function.
