hyeteck Posted March 16, 2008

Hi, is there a way to combine 17,000 UPDATE queries into one query? I have a PHP script that runs through a while loop and updates a price field for 17,000 rows, hence 17,000 queries. Since I'm on a shared host, the MySQL server is on a different machine, so sending 17,000 queries takes time. Is there a way to combine them together? Here is my script:

function update_prices() {
    $pri = array(0, 10, 25, 50, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000);
    $f   = array(1.8, 1.7, 1.6, 1.5, 1.3, 1.09, 1.08, 1.13, 1.12, 1.11, 1.1, 1.09, 1.08, 1.07);

    $file = "items.txt";
    $contents = file($file);

    if (count($contents) > 1000) {
        foreach ($contents as $buffer) {
            $chunks     = explode(" ", $buffer);
            $prodId     = $chunks[0];
            $prodPrice  = $chunks[1];
            $prodPriceW = $prodPrice;
            $prodMap    = $chunks[2];
            $prodQuant  = $chunks[3];

            if ($prodPrice == 0) {
                $result = mysql_query("UPDATE products SET pSell=0, pDisplay=0 WHERE pID='$prodId'")
                    or die(mysql_error());
            } else {
                // Walk down the price brackets to find the markup factor.
                $c = 14;
                while ($pri[--$c] > $prodPrice);
                $prodPrice *= $f[$c];
                $prodPrice = floor($prodPrice);
                $prodPrice += 0.99;
                print $prodId . "\n";

                // Note: the + 90 freight surcharge must stay outside the quotes;
                // quoting the whole expression sends MySQL the string
                // '15.99 + 90' instead of the number 105.99.
                $query = "UPDATE products SET
                              pPrice = IF(pFreight = 1, '$prodPrice' + 90, '$prodPrice'),
                              pInStock = '$prodQuant',
                              pMapprice = '$prodMap',
                              pWholesalePrice = '$prodPriceW'
                          WHERE pID='$prodId'";
                $result = mysql_query($query) or die(mysql_error());
            } // end of prodPrice == 0 else
        } // end of foreach loop
    } // end of count > 1000
} // end of function

thanks
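[Editor's sketch] One pattern that can collapse many per-row UPDATEs into a single statement is a multi-row INSERT ... ON DUPLICATE KEY UPDATE, with all prices pre-computed in PHP first. This is not from the thread, and it assumes pID is the PRIMARY KEY (or has a UNIQUE index) on products; the IDs and values below are made up:

```sql
-- Sketch only: column names come from the script above, values are
-- hypothetical. Any pID not already present would be INSERTed as a
-- new row, so this only behaves as a pure update if every ID exists.
INSERT INTO products (pID, pPrice, pInStock, pMapprice, pWholesalePrice)
VALUES
    (101, 15.99,  22, 14.50, 12.00),
    (102, 27.99,   5, 25.00, 20.00),
    (103,  9.99, 140,  9.00,  7.25)
ON DUPLICATE KEY UPDATE
    pPrice          = VALUES(pPrice),
    pInStock        = VALUES(pInStock),
    pMapprice       = VALUES(pMapprice),
    pWholesalePrice = VALUES(pWholesalePrice);
```

With 17,000 rows you would likely split this into chunks of a few hundred rows per statement to stay under max_allowed_packet, which still cuts the round trips by two orders of magnitude.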
PFMaBiSmAd Posted March 17, 2008

Just in general, each different value would need a different UPDATE query. If you could pre-process the data so that you know which IDs will be updated to the same values, you could produce an UPDATE query with a WHERE clause that lists all of those IDs, updating them at once.

Also, does a lot of your data change? The way the code is now, an UPDATE query is executed for each row. If the data values are the same, MySQL does not actually do the write part of the UPDATE, but it still must do the read part. You might be able to read all the rows from the database into an array, compare to find out which rows need to be updated, and then execute the UPDATE only for the rows you actually need to change. This could reduce the time if only a fraction of the rows change, but if the quantity changes in most rows so that they all need updating anyway, it won't help.

Edit: You might actually be able to do a LOAD DATA LOCAL INFILE statement with the correct ignore/replace options to get this done in batch mode. This might require that the data file be pre-processed by PHP and written out in a different format first, with the price already figured.
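[Editor's sketch] The grouping idea above can be illustrated with the zero-price branch of the original script (the IDs here are hypothetical): if several products all get the same pSell=0, pDisplay=0 values, one statement covers the whole group in a single round trip:

```sql
-- All out-of-stock products disabled at once instead of one query each.
UPDATE products
SET pSell = 0,
    pDisplay = 0
WHERE pID IN (101, 102, 103, 104);
```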
hyeteck (Author) Posted March 17, 2008

All the IDs have different values, so grouping the IDs together would not work. I like the idea of checking which values need to be changed to limit the number of queries that way. What I really like most is your idea of LOAD DATA INFILE. Pre-processing the data file would not be an issue; I just have to figure out the format the data needs to be in. Looking at http://dev.mysql.com/doc/refman/5.0/en/load-data.html I don't see information about using UPDATE in the data infile.
fenway Posted March 17, 2008

There is an optional SET clause.
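[Editor's sketch] For reference, that SET clause lets LOAD DATA read file columns into user variables and run an expression on them before storing (a minimal sketch; the @price variable and the file layout are assumptions):

```sql
LOAD DATA LOCAL INFILE 'items.txt'
REPLACE INTO TABLE products
FIELDS TERMINATED BY '\t'
(pID, pInStock, @price)
SET pPrice = @price + 0.99;  -- any expression of the file columns
```

One caveat worth knowing: REPLACE works by deleting and re-inserting each matched row, so any column not supplied by the file or the SET clause is reset to its default. Whether that is acceptable depends on the products schema.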
hyeteck (Author) Posted March 17, 2008

Hmm, I'm a little confused. Let's say I have the following data:

ProdID  Quantity  Price
5       22        15.99

Can you show me a sample datafile that would update the above data for that ProdID?
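[Editor's sketch] One possible answer, assuming pID is the PRIMARY KEY of products: the datafile itself contains no UPDATE keyword at all, just delimited values. It is the REPLACE keyword on the LOAD DATA statement that makes existing rows get overwritten:

```sql
-- items.txt (tab-delimited, one product per line), e.g. for the row above:
--   5	22	15.99
--
-- The whole file is then applied in a single statement:
LOAD DATA LOCAL INFILE 'items.txt'
REPLACE INTO TABLE products
FIELDS TERMINATED BY '\t'
(pID, pInStock, pPrice);
```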
hyeteck (Author) Posted March 17, 2008

Never mind, I got it, thanks. I'm going to try this out now and see what kind of performance I get compared to running 17,000 separate queries.
aschk Posted March 17, 2008

LOAD DATA INFILE will be way quicker, I would imagine. The other option is to use prepared statements, although I don't believe the old mysql_* functions support them, so you'd have to use PDO or mysqli_* instead.

Curiously, however, I would ask why you are getting a file with ALL prices in it. Surely the supplied file should contain only the latest price changes? Or is this exported from some other system? Either way it seems like a nightly batch job of a sort, for which LOAD DATA INFILE is going to be your best option.
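[Editor's sketch] The prepared-statement route with PDO might look like this (connection details are placeholders, and the freight logic is carried over from the original script). Note that each execute() is still one round trip to the server, so this saves parse time per query but not network latency:

```php
<?php
// Hypothetical PDO version of the update loop. The statement is
// parsed once; only the bound values are sent for each row.
$pdo = new PDO('mysql:host=dbhost;dbname=shop', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare(
    'UPDATE products
        SET pPrice   = :price + IF(pFreight = 1, 90, 0),
            pInStock = :quant
      WHERE pID = :id'
);

foreach (file('items.txt') as $buffer) {
    // Same field order as the original script: id, price, map, quantity.
    list($id, $price, $map, $quant) = explode(' ', trim($buffer));
    $stmt->execute(array(':price' => $price, ':quant' => $quant, ':id' => $id));
}
```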
keeB Posted March 18, 2008

Or you can do it the proper way, which is to put it inside a transaction. http://dev.mysql.com/books/mysqlpress/mysql-tutorial/ch10.html This is what they were made for.
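[Editor's sketch] Wrapping the existing loop in a transaction is a small change, shown here against the original script's mysql_* style. Two assumptions: the products table must be InnoDB (MyISAM ignores transactions), and the per-query round trips to the remote server remain, so this batches the disk writes rather than the network traffic:

```php
<?php
// Sketch: same per-row UPDATEs, but committed as one unit so the
// writes are flushed in a single commit instead of 17,000.
mysql_query("SET autocommit = 0") or die(mysql_error());
mysql_query("START TRANSACTION") or die(mysql_error());

foreach (file("items.txt") as $buffer) {
    list($prodId, $prodPrice, $prodMap, $prodQuant) = explode(" ", trim($buffer));
    mysql_query("UPDATE products
                    SET pInStock = '$prodQuant', pMapprice = '$prodMap'
                  WHERE pID = '$prodId'") or die(mysql_error());
}

mysql_query("COMMIT") or die(mysql_error());
```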
aschk Posted March 18, 2008

To the best of my knowledge, MySQL doesn't offer file reading and parsing as an option inside transactions... and you can't have a MySQL procedure take an array of information.
keeB Posted March 18, 2008

START TRANSACTION;
INSERT INTO ...
INSERT INTO ...
INSERT INTO ...
INSERT INTO ...
-- however many inserts
COMMIT;

^ the only way to do it.