phpsycho

Members
  • Posts

    164
  • Joined

  • Last visited

    Never

About phpsycho

  • Birthday 02/22/1994

Contact Methods

  • Website URL
    http://adamlacombe.com
  • Yahoo
    webdev204

Profile Information

  • Gender
    Male
  • Location
    Randolph, Vermont

  1. Okay, so when I "like" or paste a URL linking to my blog on Facebook, it scrapes content off that page... at least it should. I have their meta tags for the image it displays, the description, etc., but for some odd reason, if I paste a link to a blog post, the little box that pops up displays content from the main page, and the URL it shows is the site's index URL. Any idea why that may be? I was thinking it could have something to do with CloudFlare (http://cloudflare.com/) maybe... what do you guys think?
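Facebook's scraper reads the Open Graph tags of the exact URL being shared, so if every page on the site serves the same tags (for example, hard-coded in a shared header include), or og:url points at the index page, the Like box will show homepage content for every post. A sketch of per-post tags, with placeholder URLs and titles; re-scraping the URL with Facebook's URL debugger after changing them is usually needed, since it caches results:

```html
<!-- Per-post Open Graph tags; all values here are placeholders.
     If og:url is the same on every page, Facebook treats every
     share as a share of that one URL. -->
<meta property="og:url"         content="http://example.com/blog/my-post" />
<meta property="og:title"       content="My Post Title" />
<meta property="og:description" content="A short summary of this post." />
<meta property="og:image"       content="http://example.com/images/my-post.jpg" />
```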
  2. Hmmm, I suppose... lol. I like to stick with my own scripts and stuff, though... so I can use my own scrapers to insert data into MySQL? And to pull search results, I use some sort of PHP class or something? I looked at their site but can't really figure out exactly what to do. Off topic, but if you know, could you PM me about this? I have a dynamic IP and I'm using ddclient and zoneedit.com. How do I auto-update the port forwarding info in my router with my local IP, which keeps changing? Currently, all ddclient is doing is auto-updating my domain on zoneedit to point to the current local IP... so people outside my network can't access my website.
  3. I checked out Sphinx before, but I'd rather stick to something more simple. So doing a MySQL db cluster would speed things up? What about changing the memory allowed in MySQL, just for the time being? I have searched around, but I don't know exactly what to change inside /etc/mysql/my.cnf to raise the allowed memory. I changed things like key_buffer = 16M and max_allowed_packet = 16M, and upping those didn't do anything, even after I restarted MySQL. I do think I need to raise the allowed memory, because when I am scraping sites for data and inserting it all into my db while I search, the search queries take up to 5-7 seconds. When I'm not scraping, it takes between half a second and 1 second.
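For reference, a minimal sketch of the my.cnf settings usually tried for MyISAM-heavy search workloads like this one. The values are illustrative only, not tuned for any particular server, and they must go in the [mysqld] section with a restart afterward; key_buffer_size (key_buffer is an older alias) is the main knob, since it caches the MyISAM index blocks that fulltext lookups read:

```ini
# /etc/mysql/my.cnf -- illustrative values only; size to available RAM
[mysqld]
# MyISAM index cache; fulltext searches read index blocks from here.
key_buffer_size    = 256M
# Per-connection buffer used by ORDER BY sorts.
sort_buffer_size   = 2M
# Largest single packet/row allowed between client and server.
max_allowed_packet = 16M
```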
  4. Oh wait... now it looks like it's loading in about half a second! But I think that's because of the MySQL caching. How can I clear that? Is it a good idea to keep it on even though I am using my own caching method? Or which do you think is better? lol, sorry for the 20 questions, just making sure this thing is perfect. Having it working is good, but what's really important is having the server work a whole lot less.
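If the sudden speedup really is MySQL's query cache, it can be flushed or switched off by hand on versions that still have one (the query cache was removed entirely in MySQL 8.0). A sketch, assuming sufficient privileges:

```sql
-- Drop all cached result sets so timings reflect real query cost.
RESET QUERY CACHE;

-- Or disable the cache at runtime, if an application-level cache
-- is already doing that job.
SET GLOBAL query_cache_size = 0;
```

Whether to keep it on alongside an application cache mostly comes down to write volume: the query cache is invalidated on every write to a table, so a table being scraped into constantly gets little benefit from it.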
  5. Ah okay!
     CREATE TABLE IF NOT EXISTS `search_links` (
       `id` bigint(255) NOT NULL AUTO_INCREMENT,
       `url` varchar(255) COLLATE utf8_unicode_ci NOT NULL,
       `datetime` datetime NOT NULL DEFAULT '0000-00-00 00:00:00',
       `title` varchar(155) COLLATE utf8_unicode_ci NOT NULL,
       `keywords` text COLLATE utf8_unicode_ci NOT NULL,
       `description` text COLLATE utf8_unicode_ci NOT NULL,
       `realdesc` int(1) NOT NULL DEFAULT '0',
       `realkey` int(1) NOT NULL DEFAULT '0',
       `status` int(255) NOT NULL DEFAULT '0',
       `pic_status` int(1) NOT NULL DEFAULT '0',
       PRIMARY KEY (`id`),
       UNIQUE KEY `url` (`url`),
       FULLTEXT KEY `title` (`title`,`keywords`,`description`)
     ) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=211882;
     ? I did that, and now it loads in around 1.8 seconds... that's still a little high, though. Any way I could lower that loading time? Will it become slower and slower the more rows I add? I'm sort of new to MySQL; I've been using it for a few years but never did anything with searches, so I'm not really sure about this stuff. BTW, thank you very much! lol, it was a simple task, but it's been a problem for a while and I just never realized how to fix it.
  6. Oops, I had to wipe my server and start over, and I must have used an older backup, so I hadn't fulltexted some columns. I fixed it, though, and it's still slow: it takes about 10 seconds to load a page using that query I posted before.
     CREATE TABLE IF NOT EXISTS `search_links` (
       `id` bigint(255) NOT NULL AUTO_INCREMENT,
       `url` varchar(255) COLLATE utf8_unicode_ci NOT NULL,
       `datetime` datetime NOT NULL DEFAULT '0000-00-00 00:00:00',
       `title` varchar(155) COLLATE utf8_unicode_ci NOT NULL,
       `keywords` text COLLATE utf8_unicode_ci NOT NULL,
       `description` text COLLATE utf8_unicode_ci NOT NULL,
       `realdesc` int(1) NOT NULL DEFAULT '0',
       `realkey` int(1) NOT NULL DEFAULT '0',
       `status` int(255) NOT NULL DEFAULT '0',
       `pic_status` int(1) NOT NULL DEFAULT '0',
       PRIMARY KEY (`id`),
       UNIQUE KEY `url` (`url`),
       FULLTEXT KEY `keywords` (`keywords`),
       FULLTEXT KEY `title` (`title`),
       FULLTEXT KEY `description` (`description`)
     ) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=211882;
     I'm shooting for 1-2 seconds at most.
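One thing worth checking with this second schema: a FULLTEXT index is only usable when its column list matches the columns in the MATCH() clause, and in BOOLEAN MODE MySQL will still run the query without a usable index by scanning every row, which is silently slow rather than an error. Since the earlier queries matched against all three columns together, the combined index from the first schema may be exactly what's missing here. A sketch:

```sql
-- Three separate single-column FULLTEXT indexes cannot serve
-- MATCH(description, keywords, title); a combined index whose
-- column list matches the MATCH() clause is required.
ALTER TABLE search_links
  ADD FULLTEXT KEY ft_title_kw_desc (title, keywords, description);
```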
  7. Well, I am using MySQL currently, but when I search anything it takes about 18 seconds for the results to load, lol, and I am not too enthused about that. So I tried this method and it's super fast, so I thought I would stick with it. But I suppose I could loop through all the lines to search... just have more than one file with data in it (each file will contain 3 million lines of data), so somehow search through the first file, and when I get to the last 'page', have it switch files, and so on? Do you think that's a good idea? And any idea of a simple way to code it?
  8. I'm working on a flat-file database that I can search. I got most of it done, but I'm wondering how I can paginate the results.
     <?php
     ini_set('display_errors', 'On');

     $count = 0;
     $time_start = microtime(true);

     $handle = fopen('/var/txtdb/links.txt', 'r');
     if ($handle) {
         while (!feof($handle) && $count < 15) {
             $line = fgets($handle, 4096); // length must be an int, not the string "4096"
             $pos = strpos($line, 'keyword');
             if ($pos !== false) { // !== false, since == true misses a match at offset 0
                 preg_match('~\[desc = ([^\]]*)\]~is', $line, $desc);
                 echo "$desc[1]<br>";
                 $count++;
             }
         }
         fclose($handle);
     }

     $time_end = microtime(true);
     $time = $time_end - $time_start;
     echo '<br>Script took '.$time.' seconds to execute';
     ?>
     So I want to display 15 results per page. I know I could just loop through all the results on each page and start from the last result + 15 more, but that would take too long considering the amount of data I have. So is there a way to start the loop at a specific line? That way I won't have to loop through them all, wasting time and slowing things down.
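One common approach to this is to record the byte offset of each matching line once, then fseek() straight to any page instead of re-reading earlier lines. A minimal, self-contained sketch of that idea, assuming one record per line; the file name, keyword, and demo data below are all made up for illustration:

```php
<?php
// Sketch: paginate flat-file search results via a byte-offset index,
// so fetching page N never re-scans pages 0..N-1.

function build_match_index(string $file, string $keyword): array
{
    $offsets = [];
    $handle = fopen($file, 'r');
    // Record where each matching line starts before reading it.
    while (($pos = ftell($handle)) !== false && ($line = fgets($handle)) !== false) {
        if (strpos($line, $keyword) !== false) {
            $offsets[] = $pos;
        }
    }
    fclose($handle);
    return $offsets;
}

function fetch_page(string $file, array $offsets, int $page, int $perPage = 15): array
{
    $results = [];
    $handle = fopen($file, 'r');
    foreach (array_slice($offsets, $page * $perPage, $perPage) as $offset) {
        fseek($handle, $offset);            // jump directly to the record
        $results[] = rtrim(fgets($handle), "\n");
    }
    fclose($handle);
    return $results;
}

// Demo with a small temporary file: 40 lines, every second one matches.
$file = tempnam(sys_get_temp_dir(), 'links');
$lines = [];
for ($i = 1; $i <= 40; $i++) {
    $lines[] = ($i % 2 === 0) ? "keyword entry $i" : "other entry $i";
}
file_put_contents($file, implode("\n", $lines) . "\n");

$index = build_match_index($file, 'keyword'); // 20 matching lines
$page0 = fetch_page($file, $index, 0);        // first 15 matches
$page1 = fetch_page($file, $index, 1);        // remaining 5 matches
```

The offset index itself could be written out as its own small file per keyword, so repeat searches skip the scan entirely; that also maps naturally onto the multi-file split described above, with one index per data file.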
  9. id | select_type | table        | type | possible_keys | key  | key_len | ref  | rows   | Extra
     1  | SIMPLE      | search_links | ALL  | NULL          | NULL | NULL    | NULL | 208091 | Using where
     If this isn't fixable, even though it should be... would using a SQL flat-file system be faster/better? I've been searching around and can only find one, and it doesn't offer MATCH() AGAINST(), just LIKE.
  10. Now that I am scraping sites at the same time that I search, it takes 13-25 seconds to search. Is there a way I can increase the memory for MySQL or something? It still leaves the question of why my other site is really fast at searching and this one is very slow, though... I shouldn't have to raise MySQL's memory...
  11. Okay, so I have two different sites, both search engines. They both have around the same number of rows in their databases. One finds results super fast, within about half a second. The other one takes 4-8 seconds to search. I use the same method on both sites, though... I just don't get it.
      $search = "SELECT * FROM `search_links` WHERE MATCH(`description`,`keywords`,`title`) AGAINST ('$keyword $banned' IN BOOLEAN MODE) ORDER BY `realdesc` DESC, `realkey` DESC LIMIT $skip, 15";
      So I'm not sure what is going on, really... but both sites are on the same server, in different databases, with very close to the same number of rows being searched. I am caching results now on the slow site, but I still need these searches to be much faster. Any idea what's going on here?
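A way to diagnose a fast-vs-slow pair like this is to compare the index definitions and query plans on both databases; the table and keyword below are just the ones from the post, with a placeholder search term. If the slow site shows type=ALL and key=NULL (as in the EXPLAIN posted earlier) while the fast one shows a fulltext key, the slow table is scanning every row. Separately, interpolating $keyword and $skip straight into the SQL string is a SQL injection risk and worth escaping regardless of speed:

```sql
-- Run on both databases and compare: does each have a FULLTEXT index
-- covering (description, keywords, title)?
SHOW INDEX FROM search_links;

-- Does the plan actually use it? type=ALL / key=NULL means a full scan.
EXPLAIN SELECT * FROM search_links
WHERE MATCH(description, keywords, title)
      AGAINST ('example keyword' IN BOOLEAN MODE)
ORDER BY realdesc DESC, realkey DESC
LIMIT 0, 15;
```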
  12. It must have been because I wasn't actually accessing the page; I was using cURL to access the page that sets the cookie. I just have to find a different way of doing it.
  13. Well, I am trying to set a cookie, but it won't set. I know how to set them; I have done it before. I read the header data and the cookie is being sent, but when I go to view the cookie, it's not there. I tried on my site, and then on the app's site. Neither works. I'll research HTTP requests more, though. Thanks.
  14. Okay, I will change things over to that hashing script. It looks to be a lot more secure. Thanks. I will go with the idea of using a cookie with the user id and hashed pass. But... I tried to set the cookie from my website, and the cookie is meant to be placed for the dev's site. So this is what it looks like:
      setcookie("awp", "$userid~|~$pass", 0, "/", ".devsite.com");
      and it won't set the cookie. Can you set a cookie for a website other than the one you're executing from?
  15. Oh, I didn't know that. Well, there is no better way to learn than to try and try again. So what would you suggest I do if I were to code it myself? One of my friends suggests that my site set a cookie on the app website containing the user id and the user's md5 pass, then check whether they match in my db before releasing the user's info.
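As an aside on the friend's suggestion: putting an md5 of the password in a cookie means anyone who reads the cookie gets a crackable password hash. A common alternative is a short-lived signed token, where both sites share a secret and the receiving site just recomputes the signature. A minimal sketch under that assumption; the secret, user id, and function names here are all made up for illustration:

```php
<?php
// Sketch: a signed, expiring handoff token instead of userid + md5(pass).
// Both sites must share $secret; the password never leaves the main site.

function make_token(int $userId, string $secret): string
{
    $payload = $userId . '|' . time();
    $sig = hash_hmac('sha256', $payload, $secret);
    return base64_encode($payload . '|' . $sig);
}

function verify_token(string $token, string $secret, int $maxAge = 300): ?int
{
    $parts = explode('|', base64_decode($token));
    if (count($parts) !== 3) {
        return null;
    }
    list($userId, $issued, $sig) = $parts;
    $expected = hash_hmac('sha256', $userId . '|' . $issued, $secret);
    // Constant-time compare, plus an expiry check.
    if (!hash_equals($expected, $sig) || time() - (int)$issued > $maxAge) {
        return null; // tampered with or expired
    }
    return (int)$userId;
}

$secret = 'shared-secret-example';
$token  = make_token(42, $secret);
$userId = verify_token($token, $secret); // 42 when the token is valid
```

The token would be passed via a redirect or form post rather than a cross-domain cookie, since browsers only accept cookies for the domain that served the response.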