phpsycho Posted October 21, 2011

Okay, so I have two different sites, both search engines. They have roughly the same number of rows in their databases. One finds results super fast, within about half a second; the other takes 4-8 seconds to search. I use the same method on both sites, so I just don't get it:

```sql
$search = "SELECT * FROM `search_links`
           WHERE MATCH(`description`,`keywords`,`title`)
                 AGAINST ('$keyword $banned' IN BOOLEAN MODE)
           ORDER BY `realdesc` DESC, `realkey` DESC
           LIMIT $skip, 15";
```

So I'm not sure what is going on, really. Both sites are on the same server, in different databases, and very close in the number of rows being searched. I am caching results on the slow site now, but the searches themselves still need to be much faster. Any idea what's going on here?
phpsycho (Author) Posted October 21, 2011

Now that I am scraping sites at the same time as searching, searches take 13-25 seconds. Is there a way I can give MySQL more memory or something? It still leaves the question of why my other site is really fast at searching while this one is very slow, though... I shouldn't have to raise MySQL's memory.
fenway Posted October 22, 2011

Show us the EXPLAIN output.
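(For readers following along: EXPLAIN is run by prefixing the query itself. A sketch using the query from the first post, with a placeholder search term standing in for the PHP `$keyword`/`$banned` variables:)

```sql
-- Prefix the SELECT with EXPLAIN to see the access plan MySQL chooses;
-- 'example' is a placeholder for the PHP-interpolated search terms.
EXPLAIN SELECT * FROM `search_links`
WHERE MATCH(`description`,`keywords`,`title`)
      AGAINST ('example' IN BOOLEAN MODE)
ORDER BY `realdesc` DESC, `realkey` DESC
LIMIT 0, 15;
```

A `type` of `ALL` with `key` NULL in the output means a full table scan; a usable FULLTEXT index shows up as `type: fulltext`.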
phpsycho Posted October 22, 2011 Author Share Posted October 22, 2011 id select_type table type possible_keys key key_len ref rows Extra 1 SIMPLE search_links ALL NULL NULL NULL NULL 208091 Using where if this isn't fixable even though it should be.. would using a sql flat file system be faster/better? I been searching around and can only find one and it doesn't offer MATCH() AGAINST(), just LIKE. Quote Link to comment Share on other sites More sharing options...
fenway Posted October 23, 2011

Then you don't have a FULLTEXT index in place. In fact, I don't even see a single key. Post the SHOW CREATE TABLE output.
phpsycho (Author) Posted October 24, 2011

Oops: I had to wipe my server and start over, and I must have used an older backup, so I hadn't re-created the FULLTEXT indexes on some columns. I fixed that, but it's still slow; it takes about 10 seconds to load a page using the query I posted before.

```sql
CREATE TABLE IF NOT EXISTS `search_links` (
  `id` bigint(255) NOT NULL AUTO_INCREMENT,
  `url` varchar(255) COLLATE utf8_unicode_ci NOT NULL,
  `datetime` datetime NOT NULL DEFAULT '0000-00-00 00:00:00',
  `title` varchar(155) COLLATE utf8_unicode_ci NOT NULL,
  `keywords` text COLLATE utf8_unicode_ci NOT NULL,
  `description` text COLLATE utf8_unicode_ci NOT NULL,
  `realdesc` int(1) NOT NULL DEFAULT '0',
  `realkey` int(1) NOT NULL DEFAULT '0',
  `status` int(255) NOT NULL DEFAULT '0',
  `pic_status` int(1) NOT NULL DEFAULT '0',
  PRIMARY KEY (`id`),
  UNIQUE KEY `url` (`url`),
  FULLTEXT KEY `keywords` (`keywords`),
  FULLTEXT KEY `title` (`title`),
  FULLTEXT KEY `description` (`description`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=211882 ;
```

I'm shooting for 1-2 seconds at most.
fenway Posted October 24, 2011

That's not how full-text search works. If you want to search across several columns, you need to create a single full-text index spanning those same columns. Those individual per-column full-text indexes aren't going to be used for the query you provided; there's no "index merge" for full-text.
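(The change fenway describes can be sketched as one ALTER TABLE; the composite index name `search_text` here is only illustrative:)

```sql
-- Drop the per-column FULLTEXT indexes and add one composite index
-- covering exactly the columns the MATCH() clause lists. MySQL can
-- only use a FULLTEXT index whose column set matches the MATCH() set.
ALTER TABLE `search_links`
    DROP INDEX `keywords`,
    DROP INDEX `title`,
    DROP INDEX `description`,
    ADD FULLTEXT KEY `search_text` (`title`, `keywords`, `description`);
```

On a MyISAM table of ~200k rows this rebuild can take a while, since the full-text index is rewritten from scratch.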
phpsycho (Author) Posted October 24, 2011

Ah, okay!

```sql
CREATE TABLE IF NOT EXISTS `search_links` (
  `id` bigint(255) NOT NULL AUTO_INCREMENT,
  `url` varchar(255) COLLATE utf8_unicode_ci NOT NULL,
  `datetime` datetime NOT NULL DEFAULT '0000-00-00 00:00:00',
  `title` varchar(155) COLLATE utf8_unicode_ci NOT NULL,
  `keywords` text COLLATE utf8_unicode_ci NOT NULL,
  `description` text COLLATE utf8_unicode_ci NOT NULL,
  `realdesc` int(1) NOT NULL DEFAULT '0',
  `realkey` int(1) NOT NULL DEFAULT '0',
  `status` int(255) NOT NULL DEFAULT '0',
  `pic_status` int(1) NOT NULL DEFAULT '0',
  PRIMARY KEY (`id`),
  UNIQUE KEY `url` (`url`),
  FULLTEXT KEY `title` (`title`,`keywords`,`description`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=211882 ;
```

Like that? I did that, and now it loads in around 1.8 seconds. That's still a little high, though; is there any way I could lower it? Will it become slower and slower the more rows I add? I'm sort of new to MySQL; I've been using it for a few years, but never did anything with search, so I'm not really sure about this stuff.

By the way, thank you very much! It was a simple fix, but it's been a problem for a while and I just never realized how to solve it.
phpsycho (Author) Posted October 24, 2011

Oh wait, now it looks like it's loading in about half a second! But I think that's because of MySQL's caching. How can I clear that? Is it a good idea to keep it on even though I'm using my own caching method, or which do you think is better? Sorry for the twenty questions; I'm just making sure this thing is perfect. Having it working is good, but what's really important is making the server work a whole lot less.
fenway Posted October 24, 2011

Yes, MySQL will cache indexes, and there are ways to "warm" the cache.
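(A sketch of the cache controls being discussed, for a MyISAM table like this one; these are standard MySQL statements, run from any client session:)

```sql
-- Bypass the query cache for a single run, to benchmark the real query time:
SELECT SQL_NO_CACHE * FROM `search_links`
WHERE MATCH(`description`,`keywords`,`title`)
      AGAINST ('example' IN BOOLEAN MODE)
LIMIT 0, 15;

-- Empty the query cache entirely (requires the RELOAD privilege):
RESET QUERY CACHE;

-- "Warm" the MyISAM key cache by preloading a table's index blocks:
LOAD INDEX INTO CACHE `search_links`;
```

Note the query cache (cached result sets) and the key cache (cached index blocks) are separate things; clearing the former doesn't cool the latter.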
The Little Guy Posted October 25, 2011

If you're trying to build a search engine, I would recommend looking into Sphinx: http://sphinxsearch.com/ Full-text searches on one server get extremely slow once the database grows large. Splitting the database across multiple servers may speed it up, though.
phpsycho (Author) Posted October 25, 2011

I checked out Sphinx before, but I'd rather stick to something simpler. So would a MySQL cluster speed things up? What about raising the memory allowed in MySQL, just for the time being? I have searched around but don't know exactly what to change in /etc/mysql/my.cnf to allow more memory. I changed things like:

```
key_buffer = 16M
max_allowed_packet = 16M
```

and raising those didn't do anything, even after I restarted MySQL. I do think I need to raise the allowed memory, because when I am scraping sites and inserting all that data into my db while I search, the search queries take up to 5-7 seconds. When I'm not scraping, they take half a second to one second.
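(For what it's worth, the setting that matters most for MyISAM index lookups is `key_buffer_size`; the 256M value below is only an example and should be sized to the machine's RAM and the total index size. A sketch:)

```sql
-- Raise the MyISAM key (index) cache at runtime; the value is illustrative.
-- To persist it, set key_buffer_size = 256M under [mysqld] in my.cnf.
SET GLOBAL key_buffer_size = 256 * 1024 * 1024;

-- Check how the key cache is being used: Key_reads vs. Key_read_requests
-- gives a rough miss rate for index lookups.
SHOW STATUS LIKE 'Key_%';
```

That said, if concurrent INSERTs from the scraper are the real problem, more memory may not help much: MyISAM uses table-level locks, so heavy writes block reads on the same table regardless of cache size.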
The Little Guy Posted October 25, 2011

It looks/sounds like your spider/scraping script is attacking your database with either too many queries too fast or too many unoptimized queries. Sphinx also has a MySQL storage engine you can add to MySQL, which lets you run Sphinx searches through ordinary MySQL queries; just FYI. I had a search engine that could search 150,000 rows in about 0.02 seconds using Sphinx. I know 150k rows isn't that much, but MySQL full-text did the same search in about 2-4 seconds, on the same hardware and MySQL configuration. If you plan on building a search engine on full-text, it will crash and burn; don't think you'll become the new Google using full-text searches.
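(As a flavour of the Sphinx route: Sphinx's searchd speaks a MySQL-compatible dialect called SphinxQL, so searches can be issued from any MySQL client. A sketch, assuming a Sphinx index named `links` has already been configured over the same title/keywords/description fields; the index name and port are assumptions:)

```sql
-- SphinxQL: connect to searchd's MySQL-protocol listener (commonly port
-- 9306), not to MySQL itself. `links` is a hypothetical index name.
SELECT id
FROM links
WHERE MATCH('php tutorial')
LIMIT 0, 15;
```

The matched ids are then used to fetch the full rows from the real `search_links` table in MySQL, so Sphinx handles the text matching and MySQL stays the system of record.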
phpsycho (Author) Posted October 25, 2011

Hmmm, I suppose... I like to stick with my own scripts and stuff, though. So I could keep using my own scrapers to insert data into MySQL, and then use some sort of PHP class or something to pull search results? I looked at their site but can't really figure out exactly what to do.

Off topic, but if you know, could you PM me about this? I have a dynamic IP, and I'm using ddclient and zoneedit.com. How do I auto-update the port-forwarding info in my router with my local IP, which keeps changing? Currently all ddclient does is auto-update my domain on ZoneEdit to point to the current local IP, so people outside my network can't access my website.