Deleting Duplicate error


manhattes

I am trying to delete all the rows that have a duplicate URL in my table. I tried this code, but it gives me an error:

 

You can't specify target table 'TableName' for update in FROM clause

DELETE FROM TableName
WHERE  ID NOT IN (SELECT MAX(ID)
                  FROM   TableName
                  GROUP  BY Link
                  HAVING MAX(ID) IS NOT NULL) 

In MySQL you can't delete from a table and pull data from it in the same statement.

 

Quick solution: use a temporary table.

CREATE TEMPORARY TABLE fooTableName LIKE TableName;
INSERT INTO fooTableName SELECT * FROM TableName;

DELETE FROM TableName WHERE ID NOT IN (SELECT MAX(ID) FROM fooTableName GROUP BY Link HAVING MAX(ID) IS NOT NULL);

DROP TEMPORARY TABLE fooTableName;
Not quite as nice if TableName has a lot of rows. Definitely not as nice if bad data is still being added into TableName (and if so then you need to fix that before doing this).
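If copying the whole table is a concern: MySQL will also usually accept the self-reference when the subquery is wrapped in a derived table, because the derived table gets materialized before the DELETE runs. A sketch of that variant, assuming the same TableName/ID/Link columns (keepers and keep_id are just placeholder names):

-- keepers / keep_id are placeholder names; the extra derived table
-- is what sidesteps the "can't specify target table" error
DELETE FROM TableName
WHERE ID NOT IN (SELECT keep_id
                 FROM (SELECT MAX(ID) AS keep_id
                       FROM TableName
                       GROUP BY Link) AS keepers);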

 

Side comment: HAVING MAX(ID) IS NOT NULL only matters if your ID can be null. Surely something called "ID" cannot be null, right?


  • Solution

Thanks, it is slow but it worked. My script pulls in an RSS feed and sometimes pulls in data that was already pulled in.

$citable= "INSERT INTO TableName (Company, Form, Date, Link) 
VALUES ('" . $FnumI ."',' " . $Form ."','" . $msqldate ."','" . $Flink ."')
ON DUPLICATE KEY UPDATE
Company='" . $FnumI ."', Form='" . $Form ."', Date='" . $msqldate ."', Link='" . $Flink ."'";	

Is there a way I can only record the data if Link does not exist?

To answer the side comment: yes, there is always an ID.

 


It seemed like a one-time thing; if not then there are larger problems that need to be addressed.

As for only recording the data if Link does not exist: if Link is NULL then the GROUP BY won't work. That's what you were using to de-duplicate. So what do you mean by "does not exist"?


The script pulls in the last 1000 results of a feed, and sometimes when the feed updates within those 1000 results there are duplicates. The Link column is the best way to identify a row as a duplicate, so if I already have it then I don't need to save it.
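In that case, one way to only record new links, sketched under the assumptions that the table has a UNIQUE index on Link (as above) and that $conn is an existing mysqli connection ($conn is a placeholder name): INSERT IGNORE skips any row whose Link is already present, and a prepared statement avoids the quoting problems of concatenating values into the SQL string.

// Sketch only: assumes a UNIQUE index on Link and a mysqli connection in $conn
// INSERT IGNORE silently skips rows that would violate the unique index
$sql = "INSERT IGNORE INTO TableName (Company, Form, Date, Link) VALUES (?, ?, ?, ?)";
$stmt = $conn->prepare($sql);
$stmt->bind_param("ssss", $FnumI, $Form, $msqldate, $Flink);
$stmt->execute();
$stmt->close();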

