
I have a 3-production-line monitoring system where each line writes either a "1" or a "0" to a database every 5 minutes ("1" = machine running, "0" = machine not running). This database is used to graphically represent uptime and downtime by means of JpGraph (a PHP graphing class library): 7 days' worth of daily data in a bar graph.

I have a main page which calls the database query/data collection/graphing script as an image 3 times, once for each line, i.e. <img src='graph_script.php?line=line1' border='0' alt='' class='graph'/>

 

My trouble is the time taken to return results: sometimes up to 3-5 minutes. I have tested at various times when no other queries are running; this makes no difference. The table is not very well structured, but this is outwith my control. There is an index on the table (DateAndTime). CPU usage is 20% throughout.

 

The funny thing is, when I only call the image script once from the main page, it takes only 3-4 secs. With two or more calls to the script from the main page, this delay rears its ugly head!

 

I would have thought that if the script takes only 4 secs on its own, then calling it 3 times from another script would take approx 12-30 secs perhaps? That would be acceptable, as there is quite a lot of data to search for and gather (as in most data-logging environments).

 

Please help!

Perhaps it's my SQL? Perhaps it's my PHP?

Are there known issues performing multiple queries this way on the same table?

 

[attachment deleted by admin]

https://forums.phpfreaks.com/topic/204890-mysql-slowdown-long-query-time/

Forgot to mention the stats:

 

IBM x3650 series server

Dual Intel Xeon 1.86GHz quad core processors

4GB RAM

Dual Gigabit ethernet network cards

576GB Hard disk storage - RAID level 5 with hot spare

Operating System - FreeBSD 6.3

Other - Apache2 webserver, PHP5 and MySQL5

 

Lack of resources ruled out?

Sorry guys,

The more info the better, eh! (Having read "the rules", I guess they're there for a reason!)

 

PHP Version 5.2.5

MySQL 5.0.45

 

CREATE TABLE `data`.`downtime` (
  `DateAndTime` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00' ON UPDATE CURRENT_TIMESTAMP,
  `Millitm` smallint(6) NOT NULL,
  `TagName` varchar(255) COLLATE utf8_unicode_ci NOT NULL,
  `Val` double(10,3) NOT NULL,
  `Status` varchar(1) COLLATE utf8_unicode_ci NOT NULL,
  `Marker` varchar(1) COLLATE utf8_unicode_ci NOT NULL,
  KEY `DateAndTime` (`DateAndTime`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;

Noobs, eh!! Please don't get annoyed at my noobness!! I kinda thought I put quite a lot of info in!!

 

    * your MySQL server version -- absolutely required!

as detailed previously - 5.0.45

    * the raw MySQL statement in question [in a CODE block, and without any PHP variables]

SELECT Val, COUNT(Val) FROM Downtime WHERE Val LIKE '1%' AND DateAndTime LIKE '$qs%' AND TagName LIKE '%$line%' GROUP BY Val

    * any errors that MySQL returns to the client [from mysql_error()]

NONE

    * the table structure & column indexes of the relevant tables [via SHOW CREATE TABLE is preferred]

As detailed above!

    * the EXPLAIN output for your query, if applicable

I'm not really sure about this function! In the process of finding out! Is this OK? >

id  select_type  table     type  possible_keys  key   key_len  ref   rows    Extra
1   SIMPLE       Downtime  ALL   DateAndTime    NULL  NULL     NULL  322580  Using where; Using temporary; Using filesort
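For what it's worth, type = ALL in that EXPLAIN means a full scan of all 322,580 rows: the LIKE patterns compare Val (a double) and DateAndTime (a timestamp) as strings, and the leading wildcard in TagName LIKE '%$line%' cannot use an index either. A sketch of an index-friendly rewrite, assuming the tag name is known exactly and $qs is a date like '2010-06-10' ($line and $qs are the script's own variables):

```sql
-- sketch only: assumes Val stores exactly 1.000 for "machine running"
SELECT COUNT(*)
FROM Downtime
WHERE Val = 1
  AND TagName = '$line'
  AND DateAndTime >= '$qs 00:00:00'
  AND DateAndTime <  '$qs 00:00:00' + INTERVAL 1 DAY;
```

The range condition on DateAndTime can use the existing KEY `DateAndTime`, unlike DateAndTime LIKE '$qs%', which forces a string comparison on every row.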

    * a clear and concise description of what you want this statement to achieve

The query is basically designed to count the occurrences of a "1" in the "Val" field for a particular "TagName" on a particular day ("DateAndTime"). Results are returned as a 7-element array of integers, one element for the count of occurrences of "1" on each of the last seven days.

    * a description of what it's currently doing that's not to your liking

The query is rather quick if called just once. Very slow if concurrent queries are executed.

    * a brief listing of the types of things you've attempted so far

I have tried various fine-tunings of the PHP query to no avail; too many to mention! I have tried running the 3 concurrent queries on 3 different "identical" newly created (duplicated) tables; does this rule out "locking issues"?
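On the locking question: MyISAM uses table-level locks, but plain SELECTs take shared read locks and do not normally block one another, so splitting the data into duplicate tables is not conclusive either way. One way to check whether the concurrent queries really are waiting on each other is to watch the server while the three graphs render (a sketch; run this from the mysql client in another session):

```sql
-- lists every running query with its state and elapsed time;
-- queries sitting in state 'Locked' are waiting on a table lock
SHOW FULL PROCESSLIST;
```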

I have tried converting the table to InnoDB; this seems to make no difference.

Some advice on troubleshooting MySQL queries would be cool! I cannot seem to find my my.cnf file anywhere on the system to check some other things suggested elsewhere or to enable slow query logging. I don't think this config file is used; I did not set the system up myself, and the guy who did has left the building!!
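In case it helps later readers: the server runs on built-in defaults if no config file exists (on FreeBSD the MySQL port typically looks under /usr/local/etc/), but the relevant settings can still be inspected from the running server. A sketch, using variables that exist in MySQL 5.0:

```sql
-- in 5.0, slow-query logging is controlled by log_slow_queries / long_query_time
SHOW VARIABLES LIKE 'log%';
SHOW VARIABLES LIKE 'long_query_time';
```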

 

Colsieb

Thanks Fenway,

I did have an index on the DateAndTime column.

I added an index for TagName; a little improvement, with a cardinality of 4 (4 different variations of TagName across all 500,000+ rows).

I added an index for Val; a little improvement, with a cardinality of 2 (4 different values across all 500,000+ rows).

This brought the time down to around 40 secs.

Since all row entries are very similar, I think that indexes are not a huge advantage.
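Low-cardinality columns on their own are indeed weak indexes, but a composite index covering the columns the query filters and groups on can still help, because MySQL can use its leftmost columns together. A sketch (the index name is made up, and it only pays off if TagName is matched exactly, since a leading-wildcard LIKE '%$line%' cannot use an index):

```sql
-- one combined index instead of three single-column ones
ALTER TABLE Downtime ADD INDEX idx_tag_date_val (TagName, DateAndTime, Val);
```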

 

I have figured out that since I only ever need to search the last 5000 or so rows each time, I have added an auto-increment "ID" field and narrowed my query down to search xxxxx WHERE ID > (num rows - 5000) xxxxxxx.

This has reduced the query time down to around 4 secs.
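For anyone landing here later, the narrowed query might look something like this (a sketch; $maxId would be fetched first with SELECT MAX(ID) FROM Downtime, and 5000 is the window chosen above):

```sql
SELECT Val, COUNT(Val)
FROM Downtime
WHERE ID > $maxId - 5000
  AND Val LIKE '1%'
  AND DateAndTime LIKE '$qs%'
  AND TagName LIKE '%$line%'
GROUP BY Val;
```

Capping the scan at the last ~5000 rows bounds the work per query regardless of how large the table grows, which is why the time dropped back to around 4 secs.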

 

Solved
