jasonc310771 Posted June 7, 2023

The following query is not sorting the second SELECT by timestamp. I want the first ten rows from the table ordered by lastVisitDateTime, followed by the rest of the table. I used UNION because it removes duplicates, but I only want duplicates removed from the second SELECT, leaving the first ten untouched.

SELECT * FROM `links` AS l
LEFT JOIN `accounts` AS a ON a.id = l.author
WHERE `l`.`live` = '1'
ORDER BY `l`.`lastVisitDateTime` DESC
LIMIT 10

UNION

SELECT * FROM `links` AS l2
LEFT JOIN `accounts` AS a2 ON a2.id = l2.author
WHERE `l2`.`live` = '1' AND `l2`.`linklocation` = 'main'
ORDER BY `l2`.`timestamp` DESC;
Barand Posted June 7, 2023

The ORDER BY at the end of a set of UNIONs applies to the whole result set. You may need a two-query approach (no data, so untested):

-- QUERY 1
CREATE TABLE copylinks
SELECT * FROM `links` AS l
LEFT JOIN `accounts` AS a ON a.id = l.author
WHERE `l`.`live` = '1'
ORDER BY `l`.`lastVisitDateTime` DESC
LIMIT 10;

-- QUERY 2
INSERT INTO copylinks
SELECT * FROM `links` AS l
LEFT JOIN `accounts` AS a ON a.id = l.author
LEFT JOIN copylinks l2 ON l.id = l2.id
WHERE `l`.`live` = '1'
  AND `l`.`linklocation` = 'main'
  AND l2.id IS NULL
ORDER BY `l`.`timestamp` DESC;

-- CHECK RESULT
SELECT * FROM copylinks;
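The two-query idea can be sketched end to end in Python, with SQLite standing in for MySQL. This is a minimal sketch with a simplified single-table schema (the accounts join is omitted); column names follow the thread.

```python
import sqlite3

# Sketch of the two-query approach with SQLite standing in for MySQL.
# Simplified single-table schema; column names follow the thread.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE links (
    id INTEGER PRIMARY KEY,
    live INTEGER,
    linklocation TEXT,
    timestamp INTEGER,
    lastVisitDateTime TEXT)""")
con.executemany(
    "INSERT INTO links VALUES (?,?,?,?,?)",
    [(i, 1, "main", i, f"2023-06-07 10:00:{i:02d}") for i in range(1, 16)])

# QUERY 1: the ten most recently visited rows go into the copy table.
con.execute("""CREATE TEMP TABLE copylinks AS
    SELECT * FROM links
    WHERE live = 1
    ORDER BY lastVisitDateTime DESC
    LIMIT 10""")

# QUERY 2: append the remaining rows, newest first, skipping ids that
# are already present (the anti-join via LEFT JOIN ... IS NULL).
con.execute("""INSERT INTO copylinks
    SELECT l.* FROM links l
    LEFT JOIN copylinks c ON l.id = c.id
    WHERE l.live = 1 AND l.linklocation = 'main' AND c.id IS NULL
    ORDER BY l.timestamp DESC""")

result = [r[0] for r in con.execute("SELECT id FROM copylinks ORDER BY rowid")]
print(result)  # ids 15..6 (by last visit), then 5..1 (by timestamp)
```

Each query keeps its own ORDER BY because the rows are materialised into the copy table in two separate statements, which is exactly what the single UNION could not do.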
jasonc310771 Posted June 8, 2023 (edited)

Hi, thanks. I used the query as-is and get the error:

#1060 - Duplicate column name 'id'

Edited June 8, 2023 by jasonc310771
jasonc310771 Posted June 8, 2023

I just tried it again to see if I could figure it out myself. Now I get an error that the copylinks table already exists. I see that it creates a new table in the database. Can this be done without creating a new table?
jasonc310771 Posted June 8, 2023

I have created test tables below...

-- phpMyAdmin SQL Dump
-- version 5.2.1
-- https://www.phpmyadmin.net/
--
-- Host: localhost:3306
-- Generation Time: Jun 08, 2023 at 01:35 AM
-- Server version: 5.7.42
-- PHP Version: 8.1.16

SET SQL_MODE = "NO_AUTO_VALUE_ON_ZERO";
START TRANSACTION;
SET time_zone = "+00:00";

CREATE TABLE `accounts-test` (
  `id` bigint(20) NOT NULL COMMENT '2013',
  `username` text NOT NULL COMMENT '2013',
  `first_name` varchar(255) NOT NULL COMMENT '2013'
) ENGINE=MyISAM DEFAULT CHARSET=latin1 ROW_FORMAT=DYNAMIC;

INSERT INTO `accounts-test` (`id`, `username`, `first_name`) VALUES
(1, 'user1', 'user1'),
(2, 'user2', 'user2');

CREATE TABLE `links-test` (
  `id` int(11) NOT NULL,
  `link` text,
  `live` int(1) NOT NULL,
  `author` text,
  `timestamp` int(11) NOT NULL,
  `lastVisitDateTime` datetime NOT NULL DEFAULT '0000-00-00 00:00:00'
) ENGINE=MyISAM DEFAULT CHARSET=latin1;

INSERT INTO `links-test` (`id`, `link`, `live`, `author`, `timestamp`, `lastVisitDateTime`) VALUES
(1, 'https://www.example.com/1', 1, '1', 1, '2023-06-07 10:00:00'),
(3, 'https://www.example.com/4', 1, '1', 3, '2023-06-07 10:00:00'),
(4, 'https://www.example.com/1', 1, '2', 4, '2023-06-07 10:00:02'),
(2, 'https://www.example.com/2', 1, '2', 2, '2023-06-07 10:00:01'),
(5, 'https://www.example.com/3', 1, '1', 5, '2023-06-07 10:00:00'),
(6, 'https://www.example.com/4', 1, '2', 6, '2023-06-07 10:00:03'),
(7, 'https://www.example.com/2', 1, '1', 7, '2023-06-07 10:00:00'),
(8, 'https://www.example.com/3', 1, '2', 8, '2023-06-07 10:00:04'),
(9, 'https://www.example.com/1', 1, '1', 9, '2023-06-07 10:00:00'),
(10, 'https://www.example.com/4', 1, '2', 10, '2023-06-07 10:00:05'),
(11, 'https://www.example.com/1', 1, '1', 11, '2023-06-07 10:00:00'),
(12, 'https://www.example.com/2', 1, '2', 12, '2023-06-07 10:00:06'),
(13, 'https://www.example.com/3', 1, '1', 13, '2023-06-07 10:00:00'),
(14, 'https://www.example.com/4', 1, '2', 14, '2023-06-07 10:00:07'),
(15, 'https://www.example.com/2', 1, '1', 15, '2023-06-07 10:00:00'),
(16, 'https://www.example.com/3', 1, '2', 16, '2023-06-07 10:00:08'),
(17, 'https://www.example.com/2', 1, '2', 17, '2023-06-07 10:00:00'),
(18, 'https://www.example.com/3', 1, '1', 18, '2023-06-07 10:00:09'),
(19, 'https://www.example.com/4', 1, '2', 19, '2023-06-07 10:00:00'),
(20, 'https://www.example.com/2', 1, '1', 20, '2023-06-07 10:00:10'),
(21, 'https://www.example.com/3', 1, '2', 21, '2023-06-07 10:00:00');

ALTER TABLE `accounts-test` ADD PRIMARY KEY (`id`);
ALTER TABLE `links-test` ADD PRIMARY KEY (`id`);
ALTER TABLE `accounts-test` MODIFY `id` bigint(20) NOT NULL AUTO_INCREMENT COMMENT '2013', AUTO_INCREMENT=3;
ALTER TABLE `links-test` MODIFY `id` int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=22;
COMMIT;

-- QUERY 1
CREATE TABLE copylinks
SELECT * FROM `links-test` AS l
LEFT JOIN `accounts-test` AS a ON a.id = l.author
WHERE `l`.`live` = '1'
ORDER BY `l`.`lastVisitDateTime` DESC
LIMIT 10;

-- QUERY 2
INSERT INTO copylinks
SELECT * FROM `links-test` AS l
LEFT JOIN `accounts-test` AS a ON a.id = l.author
LEFT JOIN copylinks l2 ON l.id = l2.id
WHERE `l`.`live` = '1'
  AND `l`.`linklocation` = 'main'
  AND l2.id IS NULL
ORDER BY `l`.`timestamp` DESC;

-- CHECK RESULT
SELECT * FROM copylinks;
Barand Posted June 8, 2023

3 hours ago, jasonc310771 said: "Can this not be done without creating a new table?"

You could either DROP the copylinks table when you have finished with it, or create it as a temporary table. But then you'd need to do it twice (copylinksa and copylinksb), as you can't open a temp table twice in a query.

BTW, your links-test table doesn't have a linklocation column.
jasonc310771 Posted June 8, 2023

Using a temp table is not an option; it's on a live site, and that would conflict with each visit. Yes, I realised that when I tested it locally; I had removed that column in error.
Barand Posted June 8, 2023

The duration of a temp table is that of the connection. If you create it in a PHP script, it disappears when the script finishes and the connection closes.
jasonc310771 Posted June 8, 2023

If I have more than one visitor at the same time, will it still work using the temp table?
Barand Posted June 8, 2023

Each connection will have its own temp table: https://dev.mysql.com/doc/refman/8.0/en/create-temporary-table.html
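The per-connection scoping can be demonstrated directly. This is a small sketch using Python's sqlite3, which scopes TEMP tables to a connection the same way MySQL does; it is an illustration, not the MySQL client itself.

```python
import sqlite3, tempfile, os

# Demonstration that a TEMP table is private to its connection.
# Two connections to the same database file stand in for two visitors.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn_a = sqlite3.connect(path)   # "visitor A"
conn_b = sqlite3.connect(path)   # "visitor B", same database

# Connection A creates and fills a temp table.
conn_a.execute("CREATE TEMP TABLE copylinks (id INTEGER)")
conn_a.execute("INSERT INTO copylinks VALUES (1)")
count_a = conn_a.execute("SELECT COUNT(*) FROM copylinks").fetchone()[0]

# Connection B cannot see it at all: no name clash, no conflict.
try:
    conn_b.execute("SELECT COUNT(*) FROM copylinks")
    error_b = None
except sqlite3.OperationalError as exc:
    error_b = str(exc)

print(count_a)   # 1
print(error_b)   # B gets "no such table"
```

Because each connection sees only its own copylinks, concurrent visitors never collide, which is the point being made here.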
Barand Posted June 8, 2023

... which makes them the safer option.
jasonc310771 Posted June 8, 2023

2 minutes ago, Barand said: "Each connection will have its own temp table. https://dev.mysql.com/doc/refman/8.0/en/create-temporary-table.html"

OK, I understand now. I thought the table would remain and cause errors for other visitors viewing at the same time.

Using the test query I gave above, I get the error:

CREATE TABLE copylinks
SELECT * FROM `links-test` AS l
LEFT JOIN `accounts-test` AS a ON a.id = l.author
WHERE `l`.`live` = '1'
ORDER BY `l`.`lastVisitDateTime` DESC
LIMIT 10;

MySQL said: #1060 - Duplicate column name 'id'
Barand Posted June 8, 2023

3 hours ago, jasonc310771 said: "CREATE TABLE copylinks SELECT *"

That should be SELECT l.* : the join returns an id column from both tables, which is what causes the duplicate-column error when creating the table.
jasonc310771 Posted June 8, 2023

No, using the temp table is still not going to work. If a visitor views the page with the top ten and the rest underneath on their first visit, stays on the site to view other pages, and then comes back to the home page, they will likely get an error that the table already exists.
Barand Posted June 8, 2023

Very unlikely. The tables only exist for the few microseconds it takes to run the script, and they are gone before the user views the results on the client.
jasonc310771 Posted June 8, 2023

OK, I'll try this for a while and see if my error log shows any issues. Thank you.
jasonc310771 Posted June 8, 2023

The table still remains, and I get an error that the table already exists.
jasonc310771 Posted June 8, 2023

I think I'll opt to do it in PHP with two queries and filter out the first ten from the second result set.
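The application-side approach settled on here (two queries, drop the already-shown ids from the second result) can be sketched as follows. Python with sqlite3 stands in for PHP with MySQL; the simplified schema and data are illustrative.

```python
import sqlite3

# Sketch of the two-query, filter-in-application approach:
# Python/sqlite3 stand in for PHP/MySQL; schema is simplified.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE links (id INTEGER PRIMARY KEY, live INTEGER, "
            "timestamp INTEGER, lastVisitDateTime TEXT)")
con.executemany("INSERT INTO links VALUES (?,?,?,?)",
                [(i, 1, i, f"2023-06-07 10:00:{i:02d}") for i in range(1, 16)])

# Query 1: the ten most recently visited links.
top_ten = con.execute(
    "SELECT * FROM links WHERE live = 1 "
    "ORDER BY lastVisitDateTime DESC LIMIT 10").fetchall()
seen = {row[0] for row in top_ten}  # ids already shown

# Query 2: everything, newest first; skip rows already in the top ten.
rest = [row for row in con.execute(
            "SELECT * FROM links WHERE live = 1 ORDER BY timestamp DESC")
        if row[0] not in seen]

page = top_ten + rest
print([row[0] for row in page])
```

This keeps the database side trivial and avoids any shared table, at the cost of pulling the full list into the application and filtering there.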
gizmola Posted June 22, 2023

This type of use case is exactly why people use caching servers like Redis or memcached, although you can also meet the requirement, in some cases, with edge-side includes or client-side includes (using JavaScript). With something like Redis, the basic idea is this: code checks Redis for the existence of the top-ten result; if it is not found, the query runs and writes a cached version with a new TTL. You can also use a cron job to regenerate the cache at whatever interval you want. Pretty much every site with a massive userbase implements caching.

For a single/monolithic server, PHP has long had options such as the shared-memory routines or APC that can be used as well. It's also possible to generate a snippet of HTML in a file that you read off the file system, as a simple "roll your own" method of caching. All of these techniques reduce the load on your database, which is often the first and most difficult bottleneck to overcome when lack of scalability starts to limit performance.
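The "roll your own" file cache described above can be sketched in a few lines. This is an illustration in Python rather than PHP; the file name, TTL, and query_top_links stand-in are all hypothetical.

```python
import json, time, os, tempfile

# A minimal "roll your own" file-based cache: serve the cached top-ten
# result while it is fresh, otherwise rebuild it with a new timestamp.
# CACHE_FILE and TTL_SECONDS are illustrative choices.
CACHE_FILE = os.path.join(tempfile.gettempdir(), "top_links_cache.json")
TTL_SECONDS = 300  # regenerate at most every five minutes

def query_top_links():
    # Stand-in for the real database query.
    return [{"id": i, "link": f"https://www.example.com/{i}"} for i in range(1, 11)]

def get_top_links():
    # Serve from the cache while it is fresh...
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as fh:
            cached = json.load(fh)
        if time.time() - cached["written_at"] < TTL_SECONDS:
            return cached["rows"]
    # ...otherwise hit the database and rewrite the cache.
    rows = query_top_links()
    with open(CACHE_FILE, "w") as fh:
        json.dump({"written_at": time.time(), "rows": rows}, fh)
    return rows

print(len(get_top_links()))
```

Within the TTL window, every page view reads the file instead of running the query, so the database sees at most one top-ten query per interval regardless of traffic.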