max_allowed_packet and size of query


abazoskib

My max_allowed_packet is set to 32MB. If my queries use a format like:

 

INSERT INTO table SELECT ... WHERE customer_id IN (213232, 23232, 2323, ...)

 

how long a list would I be able to manage successfully within the IN section? I'm not looking for an exact answer, but it would be awesome if I could put 50,000 or more ID numbers in there.
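For a rough sense of scale (an illustrative back-of-envelope sketch, not an exact limit — the packet also has to hold the rest of the statement), at about 8 bytes per ID (7 digits plus a comma), 50,000 IDs only use a small fraction of a 32MB packet:

```php
<?php
// Back-of-envelope check: how much of a 32MB max_allowed_packet
// would a 50,000-element IN() list of 7-digit IDs consume?
$ids    = range(1000000, 1049999);   // 50,000 seven-digit IDs
$inList = implode(',', $ids);        // "1000000,1000001,..."
$bytes  = strlen($inList);           // 50,000 * 7 digits + 49,999 commas
$limit  = 32 * 1024 * 1024;          // max_allowed_packet = 32MB

printf("%d bytes, %.2f%% of the packet\n", $bytes, 100 * $bytes / $limit);
?>
```

That comes to roughly 400KB, so a 50,000-ID list is nowhere near the 32MB ceiling on its own.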

OK, here it goes. There is data in a stage 1 table that needs to be "cleaned/processed". Instead of performing inserts and updates on each single record, I save the ID number of each row into an array after it has been processed: one array for the good ones, one for the bad.

 

Now, at the end of the processing, I perform a bulk insert using

 

INSERT INTO final_tbl SELECT xxx FROM first_tbl ... WHERE xx IN (id#, id#, id#, ...)

 

and the same for the deletion of the bad numbers. This cut my queries down from 3 per record to about 4 total. The speed gain was UNBELIEVABLE. If there is an even faster way to do this, I would love to hear it.
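A minimal sketch of that workflow, assuming a hypothetical isValid() stand-in for the cleaning logic (table and column names are illustrative, not from the original post):

```php
<?php
// Classify each stage-1 row in memory, then run two bulk statements
// instead of 3 queries per record. isValid() is a hypothetical
// placeholder for whatever cleaning/processing check is applied.
$good = array();
$bad  = array();
foreach ($rows as $row) {
    if (isValid($row)) {
        $good[] = $row['id'];
    } else {
        $bad[] = $row['id'];
    }
}

// Bulk-move the good rows into the final table...
mysql_query("INSERT INTO final_tbl SELECT * FROM first_tbl
             WHERE id IN (" . implode(',', $good) . ")");
// ...and bulk-delete the bad ones from the staging table.
mysql_query("DELETE FROM first_tbl WHERE id IN (" . implode(',', $bad) . ")");
?>
```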

Hmm... wow... that is a tough one. Personally I would still go with a per-record query; that way, if the script fails for some reason halfway through, you don't lose your place.

 

If it helps, you can always add a column and flag it with an UPDATE query as you go. Then, at the end, you can do:

INSERT INTO table SELECT ... WHERE processed = 1

 

Worst case, if you are worried about the query getting truncated, just split the array up into 1,000-record chunks with array_chunk():

<?php
// Run the bulk statement in chunks of 1,000 IDs so no single
// query comes anywhere near max_allowed_packet.
foreach (array_chunk($array, 1000) as $list) {
    $sql = "INSERT INTO table SELECT ... WHERE customer_id IN (" . implode(',', $list) . ")";
    mysql_query($sql);
}
?>


One step ahead of you: before the query goes through, I serialize the arrays of good and bad IDs and save them to a file. If the source fails, I simply load up the arrays and place them into the proper table.
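That checkpointing step might look something like this (the file names are illustrative assumptions):

```php
<?php
// Persist both ID arrays to disk before the bulk queries run,
// so a mid-run failure can be recovered by reloading them.
file_put_contents('good_ids.ser', serialize($good));
file_put_contents('bad_ids.ser', serialize($bad));

// On recovery, restore the arrays and re-run the bulk INSERT/DELETE:
$good = unserialize(file_get_contents('good_ids.ser'));
$bad  = unserialize(file_get_contents('bad_ids.ser'));
?>
```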

 

Nice point with the array_chunk() function.

Archived

This topic is now archived and is closed to further replies.
