fenway

Staff Alumni
  • Posts: 16,168
  • Joined
  • Last visited
  • Days Won: 4

Everything posted by fenway

  1. You need to consult some basic tutorials on mysql_query() usage.
  2. First, don't use all caps. Second, try actually checking the error. Third, consult the refman for proper SELECT syntax -- you can't put the WHERE clause after GROUP BY (see the clause-order sketch after this list).
  3. utf8_bin = binary, not text. That's all.
  4. You're comparing a field value to itself?!?!
  5. The number of records has nothing to do with which records are considered -- so I don't understand your question.
  6. Stop -- don't SELECT * and then throw away all of the data just to get a count. That's what COUNT(*) is for (see the sketch after this list).
  7. What's not working?
  8. If you want a great read, see here; but that's a lot to digest. Ignoring efficiency for a moment, first start by writing a simple query -- with a GROUP BY -- to get the most recent uid for each topic, by date (see the sketch after this list).
  9. I mean those queries are fairly rudimentary, and I don't see how they couldn't work. So I suspect there's a php issue lurking.
  10. So many things. First, you need a LEFT JOIN if you want to find non-matching rows. Second, a COUNT(*) without a GROUP BY will always return just one row. Third, DISTINCT is not a function. (See the sketches after this list.)
  11. Try sphinx -- horribly annoying to set up and configure, and the documentation is very dry, but it's super-fast.
  12. Well, the issue is that there are still 40K rows to examine. Don't you necessarily have the siteID in the users table? If so, skip the join to the site table at the beginning.
  13. You'll have to check both conditions.
  14. TLDR... sounds like you have logic issues, not just mysql problems.
  15. Then you haven't looked enough. DELETE u.*, c.*, c2.*, t.* FROM users u LEFT JOIN chats c ON c.userId = $row['id'] LEFT JOIN chats c2 ON c2.randomUserId = $row['id'] LEFT JOIN typing t ON t.id = $row['id'] WHERE u.userId = $row['id']
  16. Not sure about the details, but I'm sure they've published extensively. But I'll tell you what I tell everyone else -- you're not going to get to craigslist's size using information you obtain on a free forum. So I wouldn't worry about that limitation until you get there. As for mysql crashing & performance, everything's interconnected -- but table size is unlikely to be the root cause.
  17. Millions? That's nothing -- see here.
  18. I'm confused, reading over this thread -- don't you start with a bookmark? Why the left join?
  19. Forget about getting the rows back to php -- that's a different issue entirely. We need to focus on getting back the distinct list. And I have a hard time believing that it takes the same time to send the rows.
  20. Easier to handle this in php -- but meaningless without an ORDER BY clause.
  21. It just means you have to scan O(N^2) rows.
  22. Of course, that's horrible for index usage.
  23. If you have a php problem, try a php solution -- dump $row.
  24. Don't even dream of denormalizing the table -- that's a bad crutch. How fast is it without the GROUP BY?
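
A minimal sketch of the clause order behind #2, using a hypothetical orders table (table and column names are illustrative assumptions, not from the thread). In MySQL the WHERE clause filters rows before grouping, so it has to appear before GROUP BY:

    -- hypothetical schema, for illustration only
    SELECT customer_id, COUNT(*) AS order_count
    FROM orders
    WHERE status = 'shipped'      -- row filter: must come before GROUP BY
    GROUP BY customer_id          -- grouping happens after the WHERE filter
    HAVING COUNT(*) > 5           -- group filter: comes after GROUP BY
    ORDER BY order_count DESC;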
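
For #6, a sketch of letting the server do the counting instead of selecting everything and discarding it (again with a hypothetical users table):

    -- wasteful: SELECT * FROM users WHERE active = 1; fetches every column of
    -- every row just to count them in application code. Ask for the count instead:
    SELECT COUNT(*) AS active_users
    FROM users
    WHERE active = 1;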
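
For #8, one way to sketch the "most recent post per topic" step, assuming a hypothetical posts table with topic_id, uid, and post_date columns (the real schema in that thread may differ):

    -- step 1: the simple GROUP BY -- latest post date per topic
    SELECT topic_id, MAX(post_date) AS last_post
    FROM posts
    GROUP BY topic_id;

    -- step 2: join back to the same table to pick up the uid of that latest post
    SELECT p.topic_id, p.uid, p.post_date
    FROM posts p
    JOIN (
        SELECT topic_id, MAX(post_date) AS last_post
        FROM posts
        GROUP BY topic_id
    ) latest ON latest.topic_id = p.topic_id
            AND latest.last_post = p.post_date;

If two posts in a topic share the same post_date, step 2 returns both of them.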
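
For #10, sketches of the three points, again with hypothetical users/orders tables:

    -- LEFT JOIN to find non-matching rows: users with no orders at all
    SELECT u.id, u.name
    FROM users u
    LEFT JOIN orders o ON o.user_id = u.id
    WHERE o.user_id IS NULL;

    -- COUNT(*) with a GROUP BY: one count per user instead of a single row
    -- (COUNT(o.id) ignores NULLs, so users with no orders show a count of 0)
    SELECT u.id, COUNT(o.id) AS order_count
    FROM users u
    LEFT JOIN orders o ON o.user_id = u.id
    GROUP BY u.id;

    -- DISTINCT is not a function; it applies to the whole select list
    SELECT DISTINCT user_id FROM orders;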