dflow Posted December 28, 2011

I have tables `worldcitylist` and `worldregionlist`. Structure:

```sql
CREATE TABLE `worldcitylist` (
  `country_id` int(11) NOT NULL,
  `countrycode` varchar(2) NOT NULL,
  `City` varchar(100) NOT NULL,
  `AccentCity` varchar(255) NOT NULL,
  `region_id` varchar(100) NOT NULL,
  `Lat` float(17,15) NOT NULL,
  `Lng` float(17,15) NOT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1;

CREATE TABLE `worldregionlist` (
  `country_id` int(11) NOT NULL,
  `countrycode` varchar(2) NOT NULL,
  `region_id` varchar(11) NOT NULL,
  `regionname` varchar(255) NOT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1;
```

In `worldcitylist`, `country_id` is 0 and `region_id` is set; in `worldregionlist`, both `country_id` and `region_id` are set. I want to run the following query:

```sql
UPDATE worldcitylist
SET country_id = (
    SELECT country_id
    FROM worldregionlist
    WHERE worldcitylist.countrycode = worldregionlist.countrycode
      AND worldcitylist.region_id = worldregionlist.region_id
)
WHERE 1
```

`worldcitylist` has 7,302,121 rows in total; my MacBook should be able to handle it. It is running, with no error and no result. How long should this take? What's wrong? Thanks.
trq Posted December 28, 2011

Where are you executing this query from?
dflow Posted December 28, 2011

> Where are you executing this query from?

phpMyAdmin
trq Posted December 28, 2011

Why? phpMyAdmin is a web application and will time out trying to process that much data. No one wants to look at millions of rows of data in one hit anyway. At least try to be realistic about what your application should display.
dflow Posted December 28, 2011

> phpMyAdmin is a web application and will time out trying to process that much data.

I'm trying to update the world city list table with new country IDs I have from a legacy website. I want to update the table first; then I want to offer the list as an autocomplete for {city_name, region_name, country_name} in a form for adding addresses when signing up for a worldwide service. What do you suggest? How can I make it work?
trq Posted December 28, 2011

> What do you suggest? How can I make it work?

Use MySQL's CLI interface.
dflow Posted December 28, 2011

> Use MySQL's CLI interface.

OK, here is what I get:

```
mysql> UPDATE worldcitylist SET country_id = (SELECT country_id FROM worldregionlist WHERE worldcitylist.countrycode = worldregionlist.countrycode AND worldcitylist.region_id = worldregionlist.region_id) WHERE 1 LIMIT 30;
Query OK, 0 rows affected, 30 warnings (0.67 sec)
Rows matched: 30  Changed: 0  Warnings: 30
```
mikosiko Posted December 28, 2011

> Query OK, 0 rows affected, 30 warnings (0.67 sec)
> Rows matched: 30  Changed: 0  Warnings: 30

And the warnings are?
The Little Guy Posted December 28, 2011

You should add a LIMIT to your sub-query; it should only return one result. I assume your warnings will be telling you that.
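Written out against the schema from the first post, the capped sub-query might look like this (a sketch, not run against the real data):

```sql
UPDATE worldcitylist
SET country_id = (
    SELECT country_id
    FROM worldregionlist
    WHERE worldcitylist.countrycode = worldregionlist.countrycode
      AND worldcitylist.region_id = worldregionlist.region_id
    LIMIT 1   -- cap the correlated sub-query at one row
);
```

Note that if no region matches a city row, the sub-query yields NULL, which cannot go into the NOT NULL `country_id` column; that is one plausible source of the 30 warnings.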
dflow Posted December 29, 2011

> You should add a LIMIT to your sub-query; it should only return one result.

I'm doing this through the OS terminal, so I don't get details for the warnings. I added a LIMIT to the sub-query: same 30 warnings. I tried this kind of UPDATE query instead:

```sql
UPDATE worldcitylist, worldregionlist
SET worldcitylist.country_id = worldregionlist.country_id
WHERE worldregionlist.Enable = 1
  AND worldregionlist.countrycode = worldcitylist.countrycode
  AND worldregionlist.region_id = worldcitylist.region_id
LIMIT 30
```

`worldcitylist` was updated, but not correctly.
Muddy_Funster Posted December 29, 2011

Don't know if this will work, but it could be worth a try:

```sql
UPDATE worldcitylist
INNER JOIN worldregionlist AS country ON (worldcitylist.countrycode = country.countrycode)
INNER JOIN worldregionlist AS region ON (worldcitylist.region_id = region.region_id)
SET worldcitylist.country_id = country.country_id
WHERE country.enabled = 1
LIMIT 30
```

Curious, though: if you already have a matching countrycode field in both tables, what's the point of all this?
dflow Posted December 29, 2011

> Curious, though: if you already have a matching countrycode field in both tables, what's the point of all this?

Thanks, I'll try it. This is a DB of world cities I found; I need to integrate it with a DB I'm already using that has the country_ids set (legacy).
dflow Posted December 29, 2011

BTW, apparently you can't use LIMIT with UPDATE and JOINs. Well, I'm running your query now; no errors yet, and it is still running. How long should this take?
Muddy_Funster Posted December 29, 2011

I would guess a couple of minutes... I have never tried a multi-million-row update before.
dflow Posted December 29, 2011

Well, if you have any idea how to deal with this:

```
Lost connection to MySQL server during query. Read timeout (600 seconds) reached. – 600668 ms
```

I'm running on a local machine.
fenway Posted December 29, 2011

You'll have to batch your updates -- based on UIDs, not random DB order and LIMIT. But you really should check out those warnings.
Muddy_Funster Posted December 29, 2011

Eek, never had that happen to me. Got this from the MySQL website, though:

> You can also get these errors if you send a query to the server that is incorrect or too large. If mysqld receives a packet that is too large or out of order, it assumes that something has gone wrong with the client and closes the connection. If you need big queries (for example, if you are working with big BLOB columns), you can increase the query limit by setting the server's max_allowed_packet variable, which has a default value of 1MB. You may also need to increase the maximum packet size on the client end. More information on setting the packet size is given in Section C.5.2.10, "Packet too large".
>
> An INSERT or REPLACE statement that inserts a great many rows can also cause these sorts of errors. Either one of these statements sends a single request to the server irrespective of the number of rows to be inserted; thus, you can often avoid the error by reducing the number of rows sent per INSERT or REPLACE.
dflow Posted December 29, 2011

> You can also get these errors if you send a query to the server that is incorrect or too large. [...]

OK. I use a GUI interface for the terminal, and I hope I'm not going to damage my server. How would I access the shell and run:

```
mysql -u root -p mysql
```

and then:

```
shell> mysql --max_allowed_packet=32M
```
The Little Guy Posted December 29, 2011

Like this:

```
mysql -u root --password=mypassword --max_allowed_packet=32M
```
dflow Posted December 29, 2011

> You'll have to batch your updates -- based on UIDs, not random DB order and LIMIT.

How would I do that?
Muddy_Funster Posted December 29, 2011

Batching will require that you have a Unique IDentifier (UID) column (sorry if that sounds patronising, just covering the bases). Then just append "AND UID BETWEEN 1 AND 100000" to the WHERE clause, and bump the range up each time; you should be able to get away with 100,000 records at a time. You could even write a script to go through it for you, using a variable and a multiple of that variable, until the number of rows affected is less than the multiplier (or until variable * multiplier hits your known table top of 7-something million). This may need smaller increments, though. Also make sure to close the connection after each batch, or else set a persistent connection prior to running any of the batches and close it at the end of them all (should work nicely on a localhost/LAN setup).
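The loop described above, sketched as SQL against the thread's tables. It assumes a unique key has been added to `worldcitylist` first (the schema in the first post has none), and the column name `id` is hypothetical:

```sql
-- One-off: give the big table a unique identifier.
ALTER TABLE worldcitylist
  ADD COLUMN id INT NOT NULL AUTO_INCREMENT PRIMARY KEY;

-- Then run the join-update one slice at a time, bumping the range each pass:
UPDATE worldcitylist
INNER JOIN worldregionlist
        ON worldcitylist.countrycode = worldregionlist.countrycode
       AND worldcitylist.region_id = worldregionlist.region_id
SET worldcitylist.country_id = worldregionlist.country_id
WHERE worldcitylist.id BETWEEN 1 AND 100000;
-- next pass: BETWEEN 100001 AND 200000, and so on up past 7.3 million
```

Indexes on `(countrycode, region_id)` in both tables are likely to matter as much as the batching itself; without them, each slice still has to scan `worldregionlist` for every city row.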
dflow Posted December 29, 2011

> Batching will require that you have a Unique IDentifier (UID) column [...]

Gotcha: index the 7 million records.
dflow Posted January 4, 2012

> You'll have to batch your updates -- based on UIDs, not random DB order and LIMIT. But you really should check out those warnings.

How do I do that?
fenway Posted January 5, 2012

SHOW WARNINGS.
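One detail worth knowing: `SHOW WARNINGS` reports diagnostics only for the most recent statement in the same session, so it has to be issued immediately after the UPDATE that reported "Warnings: 30", before running anything else:

```sql
-- in the same mysql session, right after the UPDATE:
SHOW WARNINGS;
```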
dflow Posted January 5, 2012

> SHOW WARNINGS.

I have no warnings now. I had an issue with MySQL server timeouts, which I fixed. When I run this query it doesn't execute; maybe the size of the table isn't letting it finish. How do I use the UID method to split the query into parts?

```sql
UPDATE worldcitylist
INNER JOIN worldregionlist AS country ON (worldcitylist.countrycode = country.countrycode)
INNER JOIN worldregionlist AS region ON (worldcitylist.region_id = region.region_id)
SET worldcitylist.country_id = country.country_id
```