Multiple MySQL Query Help Needed


andrewpike2000


I am running a website for a concert photographer.  The site has a back-end MySQL (v5.0.45) database, driven from PHP, that tracks how many times each image has been viewed along with the IP address and timestamp of the last view.

 

As the site has become more and more complex, I've been finding myself with a need to bounce one query off of another to get the desired result.  That is where I'm having problems - this one transaction has been driving me bonkers.  Any help or suggestions would be appreciated.

 

My table is laid out like this:

 

photourl              | bandid | showid | venueid | dateid   | picid | count | lastip      | timestamp
/v3/a.php?s=1&p=1.jpg | 1      | 1      | 3       | 20080101 | 1.jpg | 7     | 192.168.1.1 | 2008-01-06 01:02:58
/v3/a.php?s=2&p=2.jpg | 1      | 2      | 4       | 20080104 | 2.jpg | 5     | 192.168.1.2 | 2008-01-05 01:17:53

 

In this table, my KEY is `photourl`, which is obviously unique to each pic.

 

  • I want to use an INSERT query to log an image that hasn't been viewed before (no pre-existing record).
  • I also want to UPDATE a pre-existing record for an image that has been viewed before, incrementing its `count` and setting the new `lastip` and `timestamp`.
  • But I don't want to increment the `count` or touch `lastip` and `timestamp` if the `lastip` already stored for that record matches the current visitor's IP.

 

The two queries I am trying to run to perform this action are:

 

// QUERY #1
$q1 = "INSERT INTO photo_view_counter (photourl,bandid,showid,venueid,dateid,picid,count,lastip) VALUES ('$_SERVER[REQUEST_URI]','$bandid','$showid','$venueid','$dateid','$picid','1','$_SERVER[REMOTE_ADDR]')";
if(!mysql_db_query("dbname",$q1,$link)) die("Query #1 Failed: ". mysql_error());

 

and...

 

// QUERY #2
$q2 = "UPDATE photo_view_counter SET count=count+1,lastip='$_SERVER[REMOTE_ADDR]' WHERE photourl='$_SERVER[REQUEST_URI]' AND lastip!='$_SERVER[REMOTE_ADDR]'";
if(!mysql_db_query("dbname",$q2,$link)) die("Query #2 Failed: ". mysql_error());

 

I've tried to combine the two queries into a single query using the following:

 

// QUERY #3
$q3 = "INSERT INTO `photo_view_counter` (`photourl`,`bandid`,`showid`,`venueid`,`dateid`,`picid`,`count`,`lastip`) VALUES ('$_SERVER[REQUEST_URI]','$bandid','$showid','$venueid','$dateid','$picid','1','$_SERVER[REMOTE_ADDR]') ON DUPLICATE KEY UPDATE `count`=`count`+1,`venueid`=$venueid,`lastip`='$_SERVER[REMOTE_ADDR]'";
if(!mysql_db_query("dbname",$q3,$link)) die("Query #3 Failed: ". mysql_error());

 

The problem is that I can't figure out how to tell query #3 to UPDATE the record's `count`, `lastip`, and `timestamp` ONLY IF the visitor's IP is different from the `lastip` currently stored.  In other words, if the record already exists and the last person to view a particular image is viewing it a second time, I want to leave the `count`, `lastip`, and `timestamp` as they are.  If someone with a different IP views that image, then I want the `count`, `lastip`, and `timestamp` updated.
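
The closest I can come to describing what I'm after is the sketch below, written as plain SQL with example values from the table above in place of my PHP variables.  I don't know whether MySQL actually accepts IF() inside ON DUPLICATE KEY UPDATE like this, so treat it as pseudocode:

-- Untested sketch: only bump the row when the incoming IP differs
-- from the stored lastip; otherwise assign the columns to themselves.
INSERT INTO photo_view_counter
    (photourl, bandid, showid, venueid, dateid, picid, `count`, lastip)
VALUES
    ('/v3/a.php?s=1&p=1.jpg', 1, 1, 3, 20080101, '1.jpg', 1, '192.168.1.1')
ON DUPLICATE KEY UPDATE
    `count`     = IF(lastip <> VALUES(lastip), `count` + 1, `count`),
    `timestamp` = IF(lastip <> VALUES(lastip), NOW(), `timestamp`),
    -- lastip goes last so the comparisons above (hopefully) still see the old value
    lastip      = IF(lastip <> VALUES(lastip), VALUES(lastip), lastip);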

 

I hope this makes sense.  This is my first post to this forum, so I hope I've included everything needed.  Thanks in advance.  By the way, the website referenced in this post is http://www.waynedennon.com

 

 

I was doing that originally and the table of hits data was getting huge...

 

I was thinking this new method would be more efficient than logging each hit individually.  There are nearly 20,000 photographs on the site already, and more are always being added.  I felt that the larger the DB became, the slower performance would get.

 

It just seems like keeping a single running hit count per photo would limit the number of rows in the DB and keep the site's overall performance in better shape.  Is this not true?

 

Do you still feel that I should log each hit individually, or is there a way to do what I was originally asking, which is to increment the count only if the IP is different?

 

 


Well, you can INSERT ... ON DUPLICATE KEY UPDATE if you define a unique index that includes the IP.
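
Just as a rough sketch of what I mean (photo_view_log and its column names are made up here, not your existing table):

-- One row per (photo, visitor IP); the unique key is what makes
-- ON DUPLICATE KEY UPDATE kick in for repeat views from the same IP.
CREATE TABLE photo_view_log (
    photourl  VARCHAR(255) NOT NULL,
    ip        VARCHAR(15)  NOT NULL,
    hits      INT UNSIGNED NOT NULL DEFAULT 1,
    last_seen TIMESTAMP    NOT NULL DEFAULT CURRENT_TIMESTAMP
                           ON UPDATE CURRENT_TIMESTAMP,
    UNIQUE KEY uniq_photo_ip (photourl, ip)
);

-- First view from an IP inserts a row; later views just bump its counter.
INSERT INTO photo_view_log (photourl, ip)
VALUES ('/v3/a.php?s=1&p=1.jpg', '192.168.1.1')
ON DUPLICATE KEY UPDATE hits = hits + 1;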

 

Yes, you will have many rows.  In principle, you could create a trigger to update a summary table; in terms of performance, the extra rows shouldn't really matter as long as you have an index.
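
For example, building on the photo_view_log sketch above and assuming a summary row for the photo already exists in photo_view_counter, the trigger could look roughly like this (untested):

-- Fires once per new (photo, IP) pair. Repeat views from the same IP
-- take the UPDATE branch of the upsert, so they never reach this trigger.
CREATE TRIGGER photo_view_log_ai
AFTER INSERT ON photo_view_log
FOR EACH ROW
    UPDATE photo_view_counter
       SET `count` = `count` + 1,
           lastip = NEW.ip,
           `timestamp` = NOW()
     WHERE photourl = NEW.photourl;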
