mickisaddy Posted January 3, 2009

I have been suspended by my host because of high MySQL usage. I had a recent jump in traffic, and most of my pages are dynamic, pulling from a MySQL database. What would be the best way to handle this? Their support said something about supercache, but everything I could find about it was WordPress-specific. I could create static pages (the pages only change when I update the database). Is there anything else I can do?

Thanks
Sean
laPistola Posted January 3, 2009

A semi-dedicated server or higher could be a route. Even if your pages only change when you update the database, they are still pulling data from the DB on every page load, so they are still dynamic!
mickisaddy (Author) Posted January 3, 2009

Let me add a little more information. My site is a sports card website. The database holds roughly 1,000,000 records, with about 10-15% of those getting most of the queries. My traffic went from about 700-800 page views a day to 1,700-2,200 page views a day (I know it's a problem everyone wishes they had). Each page pulls 100 records from the database. Right now I am not making enough money off the website to pay $50-$100 a month for hosting; I am poor and putting my wife through college.

Thanks for any help anyone can give me.
Sean
corbin Posted January 3, 2009

Sounds like lots and lots of caching would be your best bet, unless you want to upgrade servers, as previously suggested.
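To illustrate the kind of caching being suggested, a minimal file-based result cache in PHP (mysql_* style of the era) might look like the sketch below. The helper name, cache directory, and lifetime are illustrative assumptions, not anything from the thread; an APC- or memcached-backed cache would follow the same pattern.

<?php
// Hypothetical helper: run a query, but serve the result from a flat file
// if a fresh enough copy already exists. The cache directory must exist
// and be writable; path and lifetime here are examples only.
function cached_query($sql, $db, $lifetime = 3600)
{
    $cache_file = '/home/example/tmp/cache/' . md5($sql) . '.cache';

    // Serve from disk if the cached copy is newer than $lifetime seconds.
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $lifetime) {
        return unserialize(file_get_contents($cache_file));
    }

    // Otherwise hit MySQL once and store the rows for the next visitor.
    $rows   = array();
    $result = mysql_query($sql, $db);
    while ($row = mysql_fetch_row($result)) {
        $rows[] = $row;
    }
    file_put_contents($cache_file, serialize($rows));
    return $rows;
}

// Usage: the popular 10-15% of pages would then rarely touch MySQL at all.
// $rows = cached_query("SELECT ... LIMIT 0, 100", $db);
?>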
PFMaBiSmAd Posted January 3, 2009

You need to make sure your existing queries are as efficient as possible (for all we know you are doing a query in a loop 100 times) and then use a cache to buffer the most popular results.
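For reference, the "query in a loop" pattern being warned about looks roughly like the first half of this sketch; the table and column names are assumptions for illustration, not taken from the poster's actual code.

<?php
// The pattern being warned about: one extra query per row, so 100 round
// trips to MySQL for a single page view.
$result = mysql_query("SELECT card_id, set_id FROM cards WHERE player_id = '$player_id' LIMIT 100", $db);
while ($card = mysql_fetch_assoc($result)) {
    // A second query inside the loop, executed once per card.
    $set = mysql_query("SELECT cset FROM sets WHERE set_id = '" . $card['set_id'] . "'", $db);
    // ... fetch and use $set ...
}

// Better: let MySQL do the lookup once with a join.
$result = mysql_query(
    "SELECT cards.card_id, sets.cset
     FROM cards
     INNER JOIN sets ON cards.set_id = sets.set_id
     WHERE cards.player_id = '$player_id'
     LIMIT 100", $db);
?>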
mickisaddy (Author) Posted January 3, 2009

I am running a query to get 100 rows and then looping to output the result (i.e. mysql_fetch_row). Is there a more efficient way to do this?

Sean
Nitroware Posted January 3, 2009

Some code would be nice. For what it's worth, I do website design, I have 12 domains and 65+ subdomains, and I have never come close to having a "MySQL overdose". If you are interested, I would try 1and1; they have very good pricing and even better customer support. (If you do sign up, could you use my link? You get points if you refer somebody.) Anyway, efficiency and security are among my specialties, so again, some code would be useful. (I know, I am all over the place, sorry.)
mickisaddy (Author) Posted January 3, 2009

Here is one of the queries from one of my pages. It is the one that pulls 100 records.

<?php
$per_page = '100';

$sql_text = ("SELECT year, cards.set_id, company, cset, card_pre, card_no, card_suf, card_id
              FROM cards, sets
              WHERE player_id = '$player_id' AND cards.set_id=sets.set_id
              ORDER BY year, company, cset, card_no");

if (!$page) {
    $page = 1;
}
$prev_page = $page - 1;
$next_page = $page + 1;

// First run of the query, used only to count rows for the paging math.
$query      = mysql_query($sql_text, $db);
$page_start = ($per_page * $page) - $per_page;
$num_rows   = mysql_num_rows($query);

if ($num_rows <= $per_page) {
    $num_pages = 1;
} else if (($num_rows % $per_page) == 0) {
    $num_pages = ($num_rows / $per_page);
} else {
    $num_pages = ($num_rows / $per_page) + 1;
}
$num_pages = (int) $num_pages;

if (($page > $num_pages) || ($page < 0)) {
    echo "You have specified an invalid page number";
}

// Second run of the same query, this time limited to the current page.
$sql_text = $sql_text . " LIMIT $page_start, $per_page";
$query    = mysql_query($sql_text);

while ($myrow = mysql_fetch_row($query)) {
    printf("<tr><td>%s</td><td><a href='/baseball/cards/setlist/%s/%s/%s/1.php'>%s %s</a></td><td>%s%s%s</td>",
           $myrow[0], $myrow[1], $myrow[0], $url_name, $myrow[2], $myrow[3], $myrow[4], $myrow[5], $myrow[6]);
}

I wrote the code years ago, so it would not surprise me if it is not the most efficient way to do it, but any help would be appreciated.

Thanks again
Sean
mickisaddy (Author) Posted January 3, 2009

As a little more information, here is what my host sent me:

Since Dec 26, this account has examined 62,888,469,411 rows (12,577,693,882/day), taking 39,855 CPU seconds (7,971/day). Anything above about 5 billion rows or 70,000 CPU seconds per day is considered excessive for a shared hosting environment. Please reduce mysql usage or find a new host.

Source data from which the above calculations were based:

User                CPU_time        Rows_examined
sportsc3_sportsc    39854.271678    62888466859
sportsc3_sswp01     0.000000        0
sportsc3_copp1      1.092333        2552
sportsc3            0.000000        0
sportsc3_sswp02     0.000000        0

Most referenced databases today in ~/tmp/mysql_slow_queries:
18 sportsc3_cards;

The top 10 busiest tables on this account are as follows:

61837072992    sportsc3_cards.cards
  435604667    sportsc3_cards.bkcards
  347723674    sportsc3_cards.fbcards
  162388800    sportsc3_cards.hkcards
   12693974    sportsc3_cards.sets
   10568018    sportsc3_cards.players
    3455270    sportsc3_cards.bbcards
     713707    sportsc3_cards.fbplayers
     509390    sportsc3_cards.bkplayers
     398162    sportsc3_cards.hkplayers

Thank you,
PFMaBiSmAd Posted January 3, 2009

Edit: that is because your code is executing the query twice, and each query is joining every row in each table to every row in the other table. There are two immediate problems:

1) You are joining each and every row in cards with each and every row in sets with "FROM cards, sets".
2) You are executing that query twice, the first time just to find out whether there are more than 100 rows in the result set.

To fix the first problem, you need to do the correct type of join (which depends on your table definitions; posting them would help) with an ON/USING clause instead of putting the join conditions in the WHERE clause. To fix the second problem, you need paging code that works independently of the number of rows in the table. For example, execute the query with the LIMIT clause, and if the number of rows in the result set is less than 100, take the appropriate action.
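Applied to the code posted above, that advice might look something like the following sketch. It is one possible reading, not a definitive fix: the explicit INNER JOIN, the "page came back full" paging rule, and the variable names are assumptions built from the earlier post.

<?php
// One query per page view: explicit join with an ON clause, plus LIMIT,
// so the full result set is never built just to count it.
$per_page   = 100;
$page       = max(1, (int) $page);
$page_start = ($page - 1) * $per_page;

$sql_text = "SELECT year, cards.set_id, company, cset, card_pre, card_no, card_suf, card_id
             FROM cards
             INNER JOIN sets ON cards.set_id = sets.set_id
             WHERE player_id = '$player_id'
             ORDER BY year, company, cset, card_no
             LIMIT $page_start, $per_page";

$query    = mysql_query($sql_text, $db);
$num_rows = mysql_num_rows($query);

if ($num_rows == 0 && $page > 1) {
    // Past the last page; no need to count every matching row to know that.
    echo "You have specified an invalid page number";
}

while ($myrow = mysql_fetch_row($query)) {
    // ... same output loop as before ...
}

// Show a "next" link only when this page came back full, so paging works
// without a second, unlimited run of the query.
$show_next = ($num_rows == $per_page);
?>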
mickisaddy (Author) Posted January 3, 2009

Thanks for that help, I will take a look at it. I think I just figured out the majority of the traffic issue, though. I started looking through my traffic logs, and it looks like the Yandex spider has been hitting my site hard recently, accounting for 20-25% of my page views. I have added it and one other spider that has been hitting the site quite a bit to my spiders.txt, and blocked a bunch of IP addresses for those spiders.
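For anyone in the same situation: a robots.txt entry is the usual first step for slowing a crawler down before resorting to IP blocks. A sketch only, assuming the crawler honors robots.txt; the paths and the delay value are examples, not from this thread.

# robots.txt
# Ask the Yandex spider to slow down (Crawl-delay is honored by some
# crawlers, Yandex among them at the time of this thread).
User-agent: Yandex
Crawl-delay: 30

# Or, to keep it out of the heaviest database-driven pages entirely:
# User-agent: Yandex
# Disallow: /baseball/cards/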