scarhand Posted March 27, 2009

I have a table for guitar tabs. This table has about 270,000 rows, totalling about 1,200 megabytes. It's causing page load times of up to 5 seconds or more. The structure is as follows:

CREATE TABLE IF NOT EXISTS `tabs` (
  `id` int(10) unsigned NOT NULL auto_increment,
  `song` text NOT NULL,
  `song_slug` text NOT NULL,
  `band` text NOT NULL,
  `band_slug` text NOT NULL,
  `tab` longtext NOT NULL,
  `version` int(10) unsigned NOT NULL,
  `rating` int(11) NOT NULL default '0',
  `views` int(10) unsigned NOT NULL default '0',
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=267636;

In my select query I am using mysql_fetch_assoc and am only selecting the fields I actually use on the page.

If I split this table into two, one with all of these fields minus "tab" and one solely with "tab" and "tab_id", would it improve my page load time?
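A rough sketch of the proposed split, for reference (the table name tab_content is hypothetical; the "tab" and "tab_id" columns are the ones mentioned above, with tab_id reusing the existing id from the tabs table):

-- hypothetical second table holding only the large text column
CREATE TABLE IF NOT EXISTS `tab_content` (
  `tab_id` int(10) unsigned NOT NULL,
  `tab` longtext NOT NULL,
  PRIMARY KEY (`tab_id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1;

-- `tabs` would keep every other column; a page that needs the tab body
-- would then join the two tables on the id:
SELECT t.song, c.tab
FROM tabs t
JOIN tab_content c ON c.tab_id = t.id
WHERE t.id = 12345;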
corbin Posted March 27, 2009

Hrmmmm.... Each song only has 1 row in that table, yes? If so, I don't think you would gain anything by breaking it into more tables, and I think that might actually cause a performance hit because of a JOIN (not sure though). What are your indexes, and what does an example SELECT query look like?
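A quick way to answer the "what are your indexes" question is MySQL's SHOW INDEX statement; on the table definition posted above it would only list the primary key on id:

SHOW INDEX FROM tabs;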
scarhand Posted March 27, 2009

I don't know anything about indexes. Here's an example of my select query for the band page (the page that lists all tabs for a band); this is the slowest page to load. $esc_band is simply the $_GET['band_slug'] variable escaped with mysql_real_escape_string.

<?php
$sql = mysql_query("select song, song_slug, band, band_slug, version, rating, views
                    from tabs
                    where band_slug='$esc_band'
                    order by song, version asc");

while ($row = mysql_fetch_assoc($sql)) {
    $song      = $row['song'];
    $song_slug = $row['song_slug'];
    $band_slug = $row['band_slug'];
    $version   = $row['version'];
    $rating    = $row['rating'];
    $views     = $row['views'];

    // additional coding using the variables here
}
?>
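One way to confirm why this particular query is slow is MySQL's EXPLAIN (the slug value below is just an example):

-- with no index on band_slug, this should report type = ALL,
-- i.e. a full scan of all ~270,000 rows on every band page load
EXPLAIN SELECT song, song_slug, band, band_slug, version, rating, views
FROM tabs
WHERE band_slug = 'some-band'
ORDER BY song, version ASC;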
scarhand Posted March 27, 2009

Should I turn band_slug into an index?
corbin Posted March 27, 2009

You don't turn a column into an index, but you create an index on it. And yes, if you plan on searching by a column, you should almost always have an index on that field. Without an index, MySQL has to go through each row looking for rows with a column equal to that value until the end of the table (well... unless there's a LIMIT clause). Do something like:

CREATE INDEX band_slug_idx ON tabs(band_slug);

And see if that improves performance a little ;p. I should note, by the way, that indexes can take a substantial amount of disk space. They are necessary for quickly finding data in a table, though.
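One caveat: because band_slug is declared as TEXT, MySQL will reject that exact statement and ask for a key prefix length, so the index needs a small tweak (the 64-character prefix below is an assumption; use something at least as long as your longest slug). Alternatively, changing band_slug to a VARCHAR column would let it be indexed without a prefix.

-- TEXT/BLOB columns can only be indexed with an explicit prefix length
CREATE INDEX band_slug_idx ON tabs (band_slug(64));

-- re-running EXPLAIN on the band page query should then show the index
-- being used (type = ref instead of type = ALL)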