Hello

 

I have 2 tables: one contains a list of existing targets, the second contains a list of combats that are active.

When a new combat is created, it needs to select a target by location that is not already fighting.

 

What I have:

Select t.*
From tbl_Target t
Left Join tbl_Combat c On t.ID = c.TargetID
Where t.Delflag = 0
  And t.PlayerID = ""
  And t.Location = "Area1"
  And c.Token Is Null
Order By rand()
Limit 1;

 

This works but takes too long to execute (Showing rows 0 - 0 (1 total, Query took 1.4562 sec)).

Explain =>

id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra

1 | SIMPLE | t | ALL | NULL | NULL | NULL | NULL | 102198 | Using where; Using temporary; Using filesort

1 | SIMPLE | c | ALL | NULL | NULL | NULL | NULL | 1 | Using where; Not exists

 

I figured out that the rand() is the part of the query that makes it take so long, and I want to replace it.

So basically, how?

If I count the available records first and set a simple offset in this statement, it takes the same time.

If I just select a random target and then check with a second statement whether it's in combat (Select * From tbl_Combat Where TargetID = $id), it takes longer.

 

There are indexes on the primary key of each table, and an index on the foreign key in tbl_Combat.
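The EXPLAIN above shows type ALL and no possible_keys on t, so none of those indexes help the WHERE clause. Since the query always filters on Delflag, PlayerID and Location, a composite index over exactly those columns could let MySQL narrow tbl_Target before the sort. A sketch (the index name is made up):

```sql
-- Hypothetical index name; columns taken from the WHERE clause above.
-- Order matters: equality filters on Delflag, PlayerID, then Location
-- let the optimizer use the full leftmost prefix.
Alter Table tbl_Target
  Add Index idx_target_free (Delflag, PlayerID, Location);
```

With this in place the EXPLAIN row for t should switch from ALL to ref, which is the biggest single win before touching rand() itself.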

 

If more info is needed, do ask.

Thanks for ANY help.

 

https://forums.phpfreaks.com/topic/214813-statementprocedure-optimalisation/

The first alternative that was suggested (Inner Join on FLOOR(max(ID) * rand())) works OK; it reduced the time from 2.5-2.9 sec to 1.5-1.8 sec, so about 50% faster.

 

I also made sure that other procedures calling the statement got buffered where possible, reducing the time to 1.3 sec. I added some indexes that I saw fit (PlayerID and Location), and in the end the ajax call averages around 1.2 sec.

 

I would still like to improve this, perhaps by editing the mysql config. Any suggestions for this?

The ajax call that uses this query is the most-used one of the entire application, so it's rather important to fine-tune it.

The statement selects targets by PlayerID = 0 and Location = AreaA, so I added an index on those 2 fields, which improved the query speed by roughly 200-300 ms.

 

What would be a better index then?

Perhaps I could add an index to tbl_Combat, but I'm not sure which fields to put it on.
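The EXPLAIN below shows table c with type ALL and empty possible_keys, so the join on TargetID is not using an index at all. A reasonable rule of thumb is to index exactly what the ON clause and the anti-join check touch. A sketch for tbl_Combat (index name hypothetical):

```sql
-- TargetID is what the ON clause joins on; including Token lets the
-- c.Token Is Null ("Not exists") check be answered from the index alone,
-- without touching the combat rows themselves.
Alter Table tbl_Combat
  Add Index idx_combat_target_token (TargetID, Token);
```

With only 4 rows in tbl_Combat the gain is tiny today, but the full scan on c would grow linearly with concurrent combats, so indexing it now is cheap insurance.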

Statement:

Explain Select t.*
From tbl_Target t
Left Join tbl_Combat c On t.ID = c.TargetID
Inner Join (Select FLOOR(max(ID) * rand()) As maxid From tbl_Target) tt
        On t.ID >= tt.maxid
Where t.Delflag = 0
  And t.PlayerID = ""
  And t.Location = "AreaA"
  And c.Token Is Null
Limit 1

 

id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra

1 | PRIMARY | <derived2> | system | NULL | NULL | NULL | NULL | 1 |

1 | PRIMARY | t | ref | PRIMARY,PlayerLocation | PlayerLocation | 27 | const,const | 51002 | Using where

1 | PRIMARY | c | ALL | NULL | NULL | NULL | NULL | 4 | Using where; Not exists

2 | DERIVED | tbl_Target | index | NULL | PRIMARY | 8 | NULL | 102198 | Using index
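One caveat with the t.ID >= FLOOR(max(ID) * rand()) trick worth noting: when the random threshold lands above the highest matching ID for that location, the query returns no row even though free targets exist, and rows that sit just after ID gaps get picked more often than others. If that matters, a sketch of a second query to run only when the first one comes back empty (same tables and filters, scanning downward instead):

```sql
-- Fallback for when the random threshold overshoots every matching row:
-- flip the comparison and walk down the primary key instead.
-- Run this only if the ">=" query above returned no row.
Select t.*
From tbl_Target t
Left Join tbl_Combat c On t.ID = c.TargetID
Inner Join (Select FLOOR(max(ID) * rand()) As maxid From tbl_Target) tt
        On t.ID < tt.maxid
Where t.Delflag = 0
  And t.PlayerID = ""
  And t.Location = "AreaA"
  And c.Token Is Null
Order By t.ID Desc
Limit 1;
```

The Order By here sorts on the primary key, so it stays an index walk rather than the filesort the original Order By rand() caused.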

Well... in the meanwhile I have put an index on that field, and another, because the structure of the combat table changed.

It got a new Type field, which in combination with a TargetID specifies what kind of combat it is (against what type of target).

And I probably did something wrong with the first index... I might have set it to unique or something...

 

Furthermore, I've been looking up some information about stored procedures.

Would it be advisable to make a stored procedure of that query, with only a variable for the location?

Or are procedures in mysql not a good idea?
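For reference, wrapping the query up with a location parameter would look roughly like this (procedure and parameter names are made up; a MySQL stored routine mostly saves per-call parse and network overhead rather than execution time, which matches the command-line test mentioned later in the thread):

```sql
DELIMITER //

-- Sketch only: sp_GetFreeTarget and p_Location are hypothetical names.
Create Procedure sp_GetFreeTarget(IN p_Location VARCHAR(50))
Begin
  Select t.*
  From tbl_Target t
  Left Join tbl_Combat c On t.ID = c.TargetID
  Inner Join (Select FLOOR(max(ID) * rand()) As maxid From tbl_Target) tt
          On t.ID >= tt.maxid
  Where t.Delflag = 0
    And t.PlayerID = ""
    And t.Location = p_Location
    And c.Token Is Null
  Limit 1;
End //

DELIMITER ;
```

It would then be invoked from PHP as `CALL sp_GetFreeTarget('AreaA');` instead of sending the full statement each time.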

Well, let me explain what uses the statement.

 

There is a web interface where you walk around with your character, and at specific times the javascript will say "you encounter something" and call the procedure, which in turn gives back a "free" target from that area to combat against.

 

I might be the only "website" connecting to the database, but what if 50 people at once are walking around, with an encounter rate of 30% - will it make a difference then?

Or when 100, or 500 people explore at the same time? (I'm being optimistic about this.)

 

My job is developing websites, and with this home project I'm trying to do as much as I can to optimise a complex procedure.

I have tested an SP through the command line, but I didn't notice much difference between executing the SP and executing the full sql.
