abazoskib Posted July 30, 2009

My max_allowed_packet is set to 32MB. If my queries consist of a format like:

INSERT INTO table SELECT ... FROM ... WHERE customer_id IN (213232, 23232, 2323, etc)

how long of a list would I be able to manage successfully within the IN section? I'm not looking for an exact answer, but it would be awesome if I could put 50,000 or more ID numbers in there.
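For a rough sense of scale (assuming IDs of about 6-7 digits each, plus a comma), 50,000 IDs comes to roughly 300 KB of SQL text, nowhere near a 32MB packet limit. A quick sketch to measure it, with range() standing in for real customer IDs:

<?php
// Assumes an open mysql_connect() connection.
// Check the server's limit (value is in bytes).
$result = mysql_query("SHOW VARIABLES LIKE 'max_allowed_packet'");
$row = mysql_fetch_assoc($result);
echo $row['Value']."\n";

// Estimate the size of the query string itself (built, not executed).
$ids = range(1, 50000); // stand-in for real customer IDs
$sql = "INSERT INTO table SELECT ... FROM ... WHERE customer_id IN (".implode(',', $ids).")";
echo strlen($sql)." bytes\n"; // roughly 300 KB here, far below 32MB
?>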
rhodesa Posted July 30, 2009

If you put 50,000 ID numbers in an IN statement, your query will be pretty slow. Most likely there is a MUCH better way to do this with a JOIN. What are you trying to do exactly?
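For example (just a sketch, with made-up table names), you can load the IDs into a temporary table once and JOIN against it instead of building a giant IN() list:

<?php
// Sketch only: id_list, source_tbl, and dest_tbl are made-up names.
// Assumes an open mysql_connect() connection and $ids holding the ID array.
mysql_query("CREATE TEMPORARY TABLE id_list (customer_id INT PRIMARY KEY)");
foreach (array_chunk($ids, 1000) as $chunk) {
    // Multi-row insert: VALUES (1),(2),(3),...
    mysql_query("INSERT INTO id_list VALUES (".implode('),(', $chunk).")");
}
mysql_query("INSERT INTO dest_tbl
             SELECT s.* FROM source_tbl s
             JOIN id_list i ON i.customer_id = s.customer_id");
?>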
abazoskib Posted July 30, 2009 Author

OK, here it goes. There is data in a stage 1 table, and it needs to be "cleaned/processed". Instead of performing inserts and updates on each single record, I save the ID number of each row into an array after it has been processed -> one array for the good ones, one for the bad. Then, at the end of the processing, I perform a bulk insert:

INSERT INTO final_tbl SELECT xxx FROM first_tbl ... WHERE xx IN (id#, id#, id#, ...)

and the same for the deletion of the bad numbers. This cut my queries down from 3 per record to about 4 total, and the speed gain was UNBELIEVABLE. If there is an even faster way to do this, I would love to hear it.
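In code, the flow looks roughly like this (a sketch; the id column and the clean_record() function are stand-ins for the real processing):

<?php
// Sketch of the two-array approach described above.
$good = array();
$bad  = array();
$result = mysql_query("SELECT * FROM first_tbl");
while ($row = mysql_fetch_assoc($result)) {
    if (clean_record($row)) { // clean_record() is hypothetical
        $good[] = (int)$row['id'];
    } else {
        $bad[] = (int)$row['id'];
    }
}
// Two bulk statements instead of three queries per record.
if ($good) {
    mysql_query("INSERT INTO final_tbl SELECT * FROM first_tbl WHERE id IN (".implode(',', $good).")");
}
if ($bad) {
    mysql_query("DELETE FROM first_tbl WHERE id IN (".implode(',', $bad).")");
}
?>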
rhodesa Posted July 30, 2009

Hmm... wow... that is a tough one. Personally I would still go with a per-record query; that way, if the script fails for some reason halfway through, you don't lose your place. If it helps, you can always add a column and flag it with an UPDATE query as you go. Then, at the end, you can do:

INSERT INTO table SELECT ... WHERE processed = 1

Worst case, if you are worried about the query getting truncated, just split the array up into 1,000-record chunks with array_chunk():

<?php
foreach (array_chunk($array, 1000) as $list) {
    $sql = "INSERT INTO table SELECT ... WHERE customer_id IN (".implode(',', $list).")";
    mysql_query($sql);
}
?>
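Put together, the flag approach might look like this (a sketch; the processed column, the table names, and clean_record() are all assumptions):

<?php
// Assumes the column was added once, e.g.:
//   ALTER TABLE first_tbl ADD COLUMN processed TINYINT NOT NULL DEFAULT 0
$result = mysql_query("SELECT * FROM first_tbl WHERE processed = 0");
while ($row = mysql_fetch_assoc($result)) {
    $flag = clean_record($row) ? 1 : -1; // 1 = good, -1 = bad; clean_record() is hypothetical
    mysql_query("UPDATE first_tbl SET processed = $flag WHERE id = ".(int)$row['id']);
}
// A crash partway through loses nothing, because each flag is persisted as it is set.
// Assumes final_tbl's columns match first_tbl's.
mysql_query("INSERT INTO final_tbl SELECT * FROM first_tbl WHERE processed = 1");
mysql_query("DELETE FROM first_tbl WHERE processed = -1");
?>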
abazoskib Posted July 30, 2009 Author

> Personally I would still go with a per-record query; that way, if the script fails for some reason halfway through, you don't lose your place.

One step ahead of you: before the query goes through, I serialize the arrays of good and bad IDs and save them to a file. If the source fails, I simply load up the arrays and place them into the proper tables. Nice point with the array_chunk() function.
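For anyone curious, the checkpoint part is just a few lines (a sketch; the file paths are made up):

<?php
// Persist the ID arrays before running the bulk queries.
file_put_contents('/tmp/good_ids.ser', serialize($good));
file_put_contents('/tmp/bad_ids.ser', serialize($bad));

// Recovery: if the bulk step failed, reload the arrays and retry.
$good = unserialize(file_get_contents('/tmp/good_ids.ser'));
$bad  = unserialize(file_get_contents('/tmp/bad_ids.ser'));
?>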