traffic problem

mickisaddy

I have been suspended by my host because of high MySQL usage.  I had a recent jump in traffic, and most of my pages are dynamic, pulling from a MySQL database.  What would be the best way to handle this?  Their support mentioned something about Supercache, but everything I found about it was WordPress-specific.  I could create static pages (the pages only change when I update the database).  Is there anything else that I can do?
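Since the pages only change when the database is updated, whole-page file caching is one way to get the Supercache effect without WordPress. Below is a minimal sketch; the cache path, lifetime, and page body are hypothetical placeholders:

```php
<?php
// Minimal sketch of whole-page file caching, assuming the pages only
// change when the database is updated. The cache path and lifetime
// here are hypothetical; adjust them for the real site.
$uri        = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/example';
$cache_file = sys_get_temp_dir() . '/cache_' . md5($uri) . '.html';
$cache_life = 3600; // seconds before a page is rebuilt

if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $cache_life) {
    readfile($cache_file);   // serve the static copy: no MySQL at all
    exit;
}

ob_start();                  // capture everything the page prints
// ... run the normal queries and build the page here ...
echo "<html><body>page body</body></html>";
$html = ob_get_clean();      // stop capturing and grab the output
echo $html;                  // still send it to this visitor
file_put_contents($cache_file, $html); // the next visitor gets the file
```

A cron job or an "on database update" hook can simply delete the cache files to force regeneration.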

 

Thanks

Sean


Let me add a little more information.  My site is a sports card website.  The database holds about 1,000,000 records, with roughly 10-15% of them getting most of the queries.  My traffic went from about 700-800 page views a day to 1700-2200 page views a day (I know, it's a problem everyone wishes they had ;) ).  Each page pulls 100 records from the database.

 

Right now I am not making enough money off the website to pay $50-$100 a month for hosting.  I am poor and putting my wife through college right now.

 

Thanks for any help anyone can give me.

Sean


Some code would be nice.  However, I do website design, I have 12 domains and 65+ subdomains, and I have never come close to having a "MySQL overdose" :).  If you are interested, I would try 1and1.  They have very good pricing and even better customer support.  (If you do, could you use my link when you sign up?  You get points if you refer somebody.)  Anyway, efficiency and security are among my specialties, so again, some code would be useful.  (I know, I am all over the place, sorry.)


Here is one of the queries from one of my pages.  It is the one that pulls 100 records.

 

<?php
$per_page = 100;

$sql_text = "SELECT year, cards.set_id, company, cset, card_pre, card_no, card_suf, card_id
             FROM cards, sets
             WHERE player_id = '$player_id' AND cards.set_id = sets.set_id
             ORDER BY year, company, cset, card_no";

// Current page number from the query string (defaults to 1).
$page = isset($_GET['page']) ? (int) $_GET['page'] : 1;
if ($page < 1) {
    $page = 1;
}
$prev_page = $page - 1;
$next_page = $page + 1;

// First execution: only used to count rows for the pager.
$query    = mysql_query($sql_text, $db);
$num_rows = mysql_num_rows($query);

$page_start = ($page - 1) * $per_page;
$num_pages  = (int) ceil($num_rows / $per_page);
if ($num_pages < 1) {
    $num_pages = 1;
}

if ($page > $num_pages) {
    echo "You have specified an invalid page number";
}

// Second execution: the page of rows actually displayed.
$sql_text .= " LIMIT $page_start, $per_page";
$query = mysql_query($sql_text, $db);

while ($myrow = mysql_fetch_row($query)) {
    printf("<tr><td>%s</td><td><a href='/baseball/cards/setlist/%s/%s/%s/1.php'>%s %s</a></td><td>%s%s%s</td>",
           $myrow[0], $myrow[1], $myrow[0], $url_name,
           $myrow[2], $myrow[3], $myrow[4], $myrow[5], $myrow[6]);
}

 

I wrote the code years ago, so it would not surprise me if it is not the most efficient way to do it, but any help would be appreciated.

 

Thanks again

Sean


As a little more information, here is what my host sent me:

Since Dec 26, this account has examined 62,888,469,411 rows (12,577,693,882/day), taking 39,855 CPU seconds (7,971/day). Anything above about 5 billion rows or 70,000 CPU seconds per day is considered excessive for a shared hosting environment. Please reduce mysql usage or find a new host.

Source data from which the above calculations were based:

User                 CPU_time       Rows_examined
sportsc3_sportsc     39854.271678   62888466859
sportsc3_sswp01      0.000000       0
sportsc3_copp1       1.092333       2552
sportsc3             0.000000       0
sportsc3_sswp02      0.000000       0

Most referenced databases today in ~/tmp/mysql_slow_queries: 18 sportsc3_cards

The top 10 busiest tables on this account are as follows:

61837072992   sportsc3_cards.cards
435604667     sportsc3_cards.bkcards
347723674     sportsc3_cards.fbcards
162388800     sportsc3_cards.hkcards
12693974      sportsc3_cards.sets
10568018      sportsc3_cards.players
3455270       sportsc3_cards.bbcards
713707        sportsc3_cards.fbplayers
509390        sportsc3_cards.bkplayers
398162        sportsc3_cards.hkplayers
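The `cards` table alone accounts for roughly 61.8 billion of those examined rows, which usually means the main query is scanning the table instead of using an index. A hypothetical check and fix (the index name `idx_player` is made up; the column list assumes the query posted earlier):

```sql
-- Show how MySQL plans to execute the heavy query; a type of "ALL"
-- on the cards row means a full table scan on every page view.
EXPLAIN SELECT year, cards.set_id, company, cset,
               card_pre, card_no, card_suf, card_id
FROM cards, sets
WHERE player_id = 123 AND cards.set_id = sets.set_id
ORDER BY year, company, cset, card_no;

-- If it is scanning, a composite index covering the filter and the
-- join column lets MySQL examine only the matching rows.
ALTER TABLE cards ADD INDEX idx_player (player_id, set_id);
```

Rows_examined dropping from millions to roughly the number of matching cards per player is the sign the index is being used.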

Thank you,

 

 


 

 

Edit: that is because your code is executing the query twice, and each query is joining every row in each table to every row in the other table -

 

There are two immediate problems -

 

1) You are joining each and every row in cards with each and every row in sets in the following "FROM cards, sets"

 

2) And you are executing that query twice. The first time just to find out if there are more than 100 rows in the result set.

 

To fix the first problem, you need to use the correct type of join (which depends on your table definitions, so posting them would help) with an ON/USING clause instead of putting the join condition in the WHERE clause.
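As an illustration, here is the posted query rewritten with an explicit JOIN ... ON clause, using the table and column names from the code above ($player_id is assumed to already be validated as an integer):

```php
<?php
// Sketch: the same query with an explicit INNER JOIN ... ON clause.
// Table and column names come from the posted code; the $player_id
// value here is hypothetical, for illustration only.
$player_id = 123;
$sql_text  = "SELECT year, cards.set_id, company, cset,
                     card_pre, card_no, card_suf, card_id
              FROM cards
              INNER JOIN sets ON cards.set_id = sets.set_id
              WHERE player_id = '$player_id'
              ORDER BY year, company, cset, card_no";
```

The ON clause makes the join condition explicit, so rows are matched on `set_id` rather than relying on the WHERE clause to filter a cross product.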

 

To fix the second problem, you need paging code that works independently of the total number of rows. For example, execute the query with the LIMIT clause, and if the number of rows in the result set is less than 100, you know you are on the last page.
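One common way to do this with a single query is to ask for one extra row: if it comes back, there is a next page; if not, you are on the last one. A sketch, where `page_window()` and `trim_sentinel()` are hypothetical helpers:

```php
<?php
// Sketch of single-query paging: fetch one extra row so no separate
// counting query is needed. Helper names are made up for illustration.

// Compute the LIMIT offset and fetch size for a given page.
function page_window($page, $per_page) {
    $page = max(1, (int) $page);
    return array(($page - 1) * $per_page, $per_page + 1); // offset, fetch size
}

// Drop the sentinel row and report whether a next page exists.
function trim_sentinel(array $rows, $per_page) {
    $has_next = count($rows) > $per_page;
    if ($has_next) {
        array_pop($rows); // remove the extra row; it belongs to the next page
    }
    return array($rows, $has_next);
}

// Usage with the posted query (mysql_* matches the era of the code):
// list($offset, $fetch) = page_window($page, 100);
// $query = mysql_query($sql_text . " LIMIT $offset, $fetch", $db);
```

This replaces the first, count-only execution entirely, halving the rows MySQL has to examine per page view.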

 


Thanks for that help, I will take a look at it.

 

I think I just figured out the majority of the traffic issue, though.  Looking through my traffic logs, the Yandex spider has been hitting my site hard recently, accounting for 20-25% of my page views.  I have added it and one other spider that has been hitting quite a bit to my spiders.txt, and I blocked a bunch of IP addresses for those spiders.
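As an alternative to blocking IP addresses outright, Yandex's crawler honors the Crawl-delay directive in robots.txt, which slows it down without removing the site from its index. A minimal fragment:

```
# robots.txt - Yandex supports Crawl-delay (seconds between fetches)
User-agent: Yandex
Crawl-delay: 10
```

Slowing the crawl spreads the same number of fetches over a much longer window, which keeps the per-day MySQL usage down.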

