
fenway

Staff Alumni
  • Posts: 16,168
  • Days Won: 4

Everything posted by fenway

  1. That's a tricky query... group by / order by / limit don't really work well when you're trying to get the max/top N amongst a group within a table. The only workaround that I can think of at the moment is to "ORDER BY liGPSID ASC, dtTime DESC", and then -- using user variables -- count to 3, and make a new expression column in the select list with a 0 or 1, depending on if you've hit 3 or not. Then, use the HAVING clause to "filter" the results. Not pretty.
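A sketch of that user-variable approach, assuming a table `readings(liGPSID, dtTime)` (table and column names are assumptions; here a derived table plays the role of the HAVING filter):

```sql
-- Keep the 3 most recent rows per liGPSID using MySQL user variables.
-- @grp tracks the current group, @rn counts rows within it.
SELECT liGPSID, dtTime
FROM (
    SELECT liGPSID, dtTime,
           @rn  := IF(@grp = liGPSID, @rn + 1, 1) AS rn,
           @grp := liGPSID
    FROM readings
    CROSS JOIN (SELECT @rn := 0, @grp := NULL) AS init
    ORDER BY liGPSID ASC, dtTime DESC
) AS ranked
WHERE rn <= 3;
```

Note the evaluation order of the variables in the select list matters, which is part of why this workaround is "not pretty".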
  2. Yes, that's what you'd need in the on clause... as for the same # of players, you'd need to check if p1 and/or p2 (possibly just p2 if your logic dictates that) is empty/null and add this expression to your on clause as well.
  3. Well, you can INSERT ... ON DUPLICATE KEY UPDATE... if you define a unique index on IP. Yes, you will have many rows... in principle, you could create a trigger to update a summary table; in terms of performance, additional rows shouldn't really matter as long as you have an index.
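A minimal sketch of that pattern, assuming a `hits` table with a unique index on `ip` (all names here are assumptions):

```sql
-- The unique index on ip is what makes ON DUPLICATE KEY UPDATE fire.
CREATE TABLE hits (
    ip        VARCHAR(45) NOT NULL,
    hit_count INT NOT NULL DEFAULT 1,
    UNIQUE KEY uq_ip (ip)
);

-- First insert creates the row; subsequent ones bump the counter.
INSERT INTO hits (ip, hit_count)
VALUES ('203.0.113.7', 1)
ON DUPLICATE KEY UPDATE hit_count = hit_count + 1;
```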
  4. Well, if you simply log each hit, it's easy to group by the user/ip/whatever and get a real-time count...
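For example, assuming a raw `hit_log(ip, hit_time)` table (names are assumptions), the real-time count is just a grouped query:

```sql
-- One row per IP with its running total of logged hits.
SELECT ip, COUNT(*) AS hit_count
FROM hit_log
GROUP BY ip;
```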
  5. Why update the count in real-time? Why not just query it?
  6. I'm just reading this now... could you restate your question?
  7. I still prefer the ON clause to USING... there is a difference for SELECT * expansion.
  8. "overwrite"... Why?
  9. Not a link per se, although I'm sure there are some... just that you can make your own column full of relevant keywords, and search that...
  10. How was this solved?
  11. You can also check that the old table's p1 / p2 does not match the new one's.
  12. You're stuck, unless you use a third-party solution or make your own index... you could always check the keyword length and decide how to query as a workaround.
  13. Accuracy? What are you talking about? The php equivalent is slower and worse... definitely not more accurate.
  14. I wish I could be of more help... but if the same commands work from another script / phpmyadmin / CLI, then it's not the db per se.
  15. I suppose you could run a query using a bunch of user variables to find out how many records are present for each user, then "mark" the fourth, then restrict with a HAVING clause, and then join *that* table in...
  16. You could use INSERT INTO... SELECT WHERE... to "duplicate"... just copy whatever values you want in the select, and add string values for anything else, e.g.: INSERT INTO yourTable ( col1, col2, col3, col4 ) SELECT 'newcol1value', col2, col3, 'newcol4value' FROM yourTable WHERE uid = <the-source-record's-uid>
  17. Maybe there's a way in v5 with information_schema tables, but I don't know for sure.
  18. I still say it's running twice... you can turn on logging and check.
  19. Did this always work? I don't know that I've ever seen it.
  20. nobody?
  21. I guess you could have a SUM with an IF () condition for each one.
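A sketch of that conditional-aggregation idea, assuming an `orders(status)` table (table and column names are assumptions):

```sql
-- SUM over an IF() turns each matching row into a 1, so each SUM
-- counts only the rows satisfying its condition.
SELECT
    SUM(IF(status = 'open',   1, 0)) AS open_orders,
    SUM(IF(status = 'closed', 1, 0)) AS closed_orders
FROM orders;
```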
  22. Because...?
  23. You really should be using INTERVAL 7 DAY for this.
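For instance, assuming an `events(created_at)` table (names are assumptions), the interval arithmetic looks like this:

```sql
-- Rows from the last 7 days; INTERVAL keeps the date math in the
-- database instead of hard-coding seconds.
SELECT *
FROM events
WHERE created_at >= NOW() - INTERVAL 7 DAY;
```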
  24. You can't get a duplicate key error on the PK unless two rows share the same id.
  25. Then add this to the join condition.