drbigfresh Posted April 15, 2007

I need to insert 20 rows every 30 seconds, and I need to make sure that what I am inserting is unique. Luckily, each piece of data has a GUID I can check against. I know I can check for duplicates by doing a SELECT before each INSERT, but as the data grows, I imagine that will prove inefficient. I was thinking I could make the GUID field a unique key or primary key, so the database would throw an error when I did the insert if there was a duplicate; I would then just need to find a way to trap the error so the rest of the inserts can continue. Anyone have an opinion on any of this or some guidance? Thanks! Scott.
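A minimal sketch of the unique-key approach described above, assuming a hypothetical table named items with a guid column (the table and column names are illustrative, not from the original post):

-- The UNIQUE KEY on guid makes MySQL reject any row whose GUID already exists
CREATE TABLE items (
    id      INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    guid    CHAR(36) NOT NULL,
    content TEXT,
    UNIQUE KEY uniq_guid (guid)
);

-- Inserting a row with a GUID that is already present now fails
-- with MySQL error 1062 ("duplicate entry"), which the application can trap
INSERT INTO items (guid, content) VALUES ('example-guid-0001', 'first copy');

With the constraint in place, the database does the duplicate check itself, so no separate SELECT is needed per insert.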
bubblegum.anarchy Posted April 16, 2007

This is the way I usually check for duplicates:

// Error 1062 is MySQL's "duplicate entry" error for a unique or primary key;
// the or-chain only raises trigger_error() for failures other than a duplicate.
$result = mysql_query("INSERT INTO temp SET content = 'variance'")
    or mysql_errno() == 1062
    or trigger_error(mysql_error(), E_USER_ERROR);

if (mysql_errno() == 1062) {
    // handle duplicate ...
} else {
    // ...
}
fenway Posted April 20, 2007

How about INSERT IGNORE?
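For reference, a minimal sketch of the INSERT IGNORE suggestion, reusing the hypothetical items table with a unique key on guid from earlier: with IGNORE, MySQL downgrades the duplicate-key error to a warning and silently skips the offending row, so the rest of the batch is still inserted and no error trapping is needed in PHP.

-- Rows whose guid already exists are skipped instead of aborting the statement
INSERT IGNORE INTO items (guid, content) VALUES
    ('example-guid-0001', 'row one'),
    ('example-guid-0002', 'row two');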