eric1235711 Posted October 3, 2006

Hello, I have a table with more than 500,000 records. It's a list of addresses and postal codes (I don't know if that's the right word; here it's called CEP). I found that this table was overloading the server: a simple query against it took a long time to process, and it happened when a lot of people accessed it at the same time. Do you think this table is too large? In the whole system there is only one simple query that uses this table, looking up the address for a CEP number. The CEP field is a key. Right now the server seems to be running normally, but I'm thinking about splitting this table up, into ten or a hundred tables. What do you think?

Link to comment https://forums.phpfreaks.com/topic/22864-help-with-huge-table/
obsidian Posted October 3, 2006

500,000 records shouldn't be too much of a load for a simple query if both the table and the query are properly optimized (on a decent server, that is). If your table can be normalized further into wider categorization, that would be good. If you need some input on that aspect, just post your table schema and we may be able to help with that, too.
eric1235711 Posted October 3, 2006 (Author)

The query is very simple, e.g.:

SELECT * FROM addresses WHERE cep = '01001000'

It always returns a single row. Is this the schema you mean?

Field     Type         Null  Default
ID        int(11)      Yes   NULL
TYPE      varchar(20)  Yes   NULL
ADDRESS   varchar(60)  Yes   NULL
NUMBSTRT  varchar(8)   Yes   NULL
NUMBEND   varchar(8)   Yes   NULL
SIDE      varchar(1)   Yes   NULL
SUBURB    varchar(50)  Yes   NULL
CEP       int(8)       Yes   NULL
CITY      varchar(38)  Yes   NULL
CITYID    varchar(5)   Yes   NULL
STATE     varchar(2)   Yes   NULL
ESPECIAL  varchar(70)  Yes   NULL

Indexes:
Keyname  Type     Cardinality  Field
PRIMARY  PRIMARY  559,819      ID
CEP      INDEX    559,819      CEP

Space usage: Data 39,707 KB; Index 11,797 KB; Total 51,504 KB

Row statistics: Format dynamic; Rows 559,819; Row length ø 72; Row size ø 94 bytes; Next autoindex 559,820; Created Oct 02, 2006 at 06:30 PM; Last update Oct 02, 2006 at 06:30 PM; Last check Oct 02, 2006 at 06:30 PM
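Given that schema, the normalization obsidian mentions could be sketched as follows. This is only an illustration: it reuses the posted column names, assumes the existing CITYID uniquely identifies a CITY/STATE pair, and has not been run against this database.

```sql
-- Hypothetical sketch: repeated CITY/STATE text moves into a lookup
-- table keyed by the existing CITYID, shrinking every address row.
CREATE TABLE cities (
  CITYID varchar(5)  NOT NULL,
  CITY   varchar(38) NOT NULL,
  STATE  varchar(2)  NOT NULL,
  PRIMARY KEY (CITYID)
);

-- The CEP lookup then joins back to the lookup table when city
-- details are actually needed:
SELECT a.ADDRESS, a.SUBURB, c.CITY, c.STATE
FROM addresses a
JOIN cities c ON c.CITYID = a.CITYID
WHERE a.CEP = '01001000';
```

Whether this helps depends on how often CITY/STATE are read back; the main win is a narrower row and a smaller data file to scan and cache.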
fenway Posted October 3, 2006

500K is not a big deal... there are a few things you can do to speed things up, though. First, fixed-width tables are faster to query: consider changing your VARCHARs to CHARs. Second, the index is very large. You can make it smaller by (A) having CEP serve as the primary key (especially if you ZEROFILL it, it can stay an INT) and (B) using a short prefix. If you can load the index into memory, it will be much faster to query. Third, if you don't always need back *, you can make a multi-column index and never have to read the table for the one column you need back, which will be really fast. Of course, splitting the table horizontally may also be required eventually, but I wouldn't suggest it just yet.
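A hedged sketch of what those suggestions might look like as DDL, using the column names from the posted schema. This is untested against the actual server, and ALTER TABLE rewrites the whole 500K-row table, so it should be tried on a copy first:

```sql
-- (1) Fixed-width rows: the wide VARCHARs become CHARs, which makes
-- the row format static and row seeks cheaper (at the cost of space).
ALTER TABLE addresses
  MODIFY ADDRESS char(60),
  MODIFY SUBURB  char(50),
  MODIFY CITY    char(38);

-- (2) CEP as the primary key. ZEROFILL keeps leading zeros of the
-- 8-digit code displayable while the column stays a compact INT.
-- ID's AUTO_INCREMENT (if any) must be removed before dropping the PK.
ALTER TABLE addresses
  MODIFY ID int(11) NOT NULL,
  DROP PRIMARY KEY,
  MODIFY CEP int(8) unsigned zerofill NOT NULL,
  ADD PRIMARY KEY (CEP);

-- (3) A covering index: if only ADDRESS is ever read back, an index on
-- (CEP, ADDRESS) lets MySQL answer this query from the index alone,
-- never touching the data file.
ALTER TABLE addresses ADD INDEX cep_addr (CEP, ADDRESS);

SELECT ADDRESS FROM addresses WHERE CEP = 1001000;
```

Note that (2) and (3) overlap: if CEP is the primary key, the single-row lookup is already one index dive, and the covering index only matters if the PK route still has to read wide rows from disk.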
eric1235711 Posted October 3, 2006 (Author)

Fenway, you said to (B) use a short prefix. Do you mean that I should make a column of, e.g., the first two (or three) digits of the CEP?
fenway Posted October 3, 2006

Well, since it's 5.4 postal codes, I'd say 5 would be a good length to choose ;-)
eric1235711 Posted October 4, 2006 (Author)

5.3 ...
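For what it's worth, a prefix index like the one discussed above might look like this. One caveat not mentioned in the thread: MySQL index prefixes apply only to string columns, so this sketch assumes CEP is stored as CHAR(8) rather than INT; the 5-character prefix corresponds to the region part of a 5-3 format CEP. Untested and purely illustrative:

```sql
-- Assumes CEP held as a fixed-width string, not an INT.
ALTER TABLE addresses MODIFY CEP char(8) NOT NULL;

-- Index only the first 5 characters: a much smaller index, at the cost
-- of MySQL checking the remaining 3 digits against the rows it finds.
ALTER TABLE addresses ADD INDEX cep_prefix (CEP(5));
```

Since CEPs cluster by region, a 5-character prefix should still be fairly selective, but it trades a few extra row reads per lookup for a smaller, more cacheable index.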
eric1235711 Posted October 4, 2006 (Author)

I saw that what was overloading the server was that list of sleeping processes...
Vikas Jayna Posted October 4, 2006

If the number of sleeping processes is large, then you may have to check the wait_timeout variable in MySQL and reduce it if it's set too high. Actually, if there are too many waiting processes, the problem should not be this query but establishing the MySQL connection. Are you getting the 'Too many connections' error?
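The check described above can be done from any MySQL client with sufficient privileges. The value 60 below is illustrative, not a recommendation:

```sql
-- How long (in seconds) the server lets an idle connection sleep
-- before dropping it; the MySQL default is 28800 (8 hours).
SHOW VARIABLES LIKE 'wait_timeout';

-- Lower it so abandoned connections are reaped sooner.
-- Affects new connections only; existing ones keep the old value.
SET GLOBAL wait_timeout = 60;

-- And see how close the server is running to its connection cap,
-- which is what triggers 'Too many connections':
SHOW VARIABLES LIKE 'max_connections';
SHOW STATUS LIKE 'Threads_connected';
```

If the application uses persistent connections (e.g. PHP's mysql_pconnect), that is a common source of piles of sleeping threads, and reducing wait_timeout is the usual mitigation.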
eric1235711 Posted October 5, 2006 (Author)

[quote author=Vikas Jayna link=topic=110362.msg446698#msg446698 date=1159976888]Are you getting the 'Too many connections' error?[/quote]

Yes, sometimes...
eric1235711 Posted October 5, 2006 (Author)

I made a "serial-killer" program... lol. It kills the processes that sleep for too long.
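The manual version of what such a script presumably automates (the details of eric's program aren't shown, and the process id below is made up):

```sql
-- List every connection with its state and idle time.
SHOW FULL PROCESSLIST;

-- For any row where Command = 'Sleep' and Time is large,
-- kill it by the Id column from the processlist output:
KILL 12345;
```

Killing sleepers treats the symptom; fixing wait_timeout (or whatever is leaving connections open) treats the cause.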
Archived
This topic is now archived and is closed to further replies.