Everything posted by bubblegum.anarchy

  1. Then set SALE to null and use: SELECT ifnull(SALE, RETAIL) The above will return the SALE value if SALE IS NOT NULL; otherwise the RETAIL value will be returned.
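The `ifnull()` fallback can be tried out with Python's built-in sqlite3 module, which shares that function with MySQL (the `products` table and its rows here are invented for illustration):

```python
import sqlite3

# Hypothetical products table: SALE is NULL when no sale price is set.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE products (name TEXT, RETAIL REAL, SALE REAL)")
con.executemany("INSERT INTO products VALUES (?, ?, ?)",
                [("widget", 10.0, 8.0), ("gadget", 20.0, None)])

# ifnull(SALE, RETAIL): the SALE value when it is NOT NULL, otherwise RETAIL.
prices = con.execute("SELECT name, ifnull(SALE, RETAIL) FROM products").fetchall()
print(prices)  # [('widget', 8.0), ('gadget', 20.0)]
```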
  2. I am a little confused... is the SALE price ever going to be greater than the RETAIL price?
  3. The query is designed for MySQL and not Oracle - the error described above is an Oracle error. The `as occurs` should follow `count(*)` as an alias for whatever the count value is to be used in the HAVING clause. There are four function calls in play with the following (starting from the innermost and using the programme_name example `Alvin And The Chipmunks Meet The Wolfman`): GROUP BY SOUNDEX(replace(replace(lower(programme_name), ' the', ''), ' and', '')) lower('Alvin And The Chipmunks Meet The Wolfman') converts the string to lower case, resulting in 'alvin and the chipmunks meet the wolfman'. replace('alvin and the chipmunks meet the wolfman', ' the', '') replaces all occurrences of ' the' with an empty string and results in 'alvin and chipmunks meet wolfman'. replace('alvin and chipmunks meet wolfman', ' and', '') replaces all occurrences of ' and' with an empty string and results in 'alvin chipmunks meet wolfman'. soundex('alvin chipmunks meet wolfman') converts the string into a soundex (sounds-like) character string that ignores non-alphabetic characters - words like pair and pare return the same string, and mayor and mare also return the same string.
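The inner three calls of that GROUP BY expression can be mirrored step by step in plain Python string operations (soundex itself is left out, as it has no standard-library equivalent):

```python
# Mirror of lower() -> replace(' the') -> replace(' and') from the GROUP BY.
title = "Alvin And The Chipmunks Meet The Wolfman"

step1 = title.lower()              # 'alvin and the chipmunks meet the wolfman'
step2 = step1.replace(" the", "")  # 'alvin and chipmunks meet wolfman'
step3 = step2.replace(" and", "")  # 'alvin chipmunks meet wolfman'
print(step3)
```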
  4. dstation, how does the following query do on all 8000 records? SELECT group_concat(programme_name SEPARATOR '\n') as title, count(*) AS occurs FROM movies GROUP BY SOUNDEX(replace(replace(lower(programme_name), ' the', ''), ' and', '')) HAVING occurs > 1
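A rough sketch of the grouping-plus-group_concat idea, runnable via sqlite3: SQLite has no `soundex()` by default and spells the separator as a second argument rather than `SEPARATOR`, so this version groups on `lower(programme_name)` only, and the `movies` rows are made up:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE movies (programme_name TEXT)")
con.executemany("INSERT INTO movies VALUES (?)",
                [("Alvin And The Chipmunks",),
                 ("alvin and the chipmunks",),
                 ("The Wolfman",)])

# group_concat(expr, sep) in SQLite; char(10) is the '\n' separator.
# Grouping on lower() stands in for MySQL's SOUNDEX(...) expression.
rows = con.execute("""
    SELECT group_concat(programme_name, char(10)) AS title, count(*) AS occurs
    FROM movies
    GROUP BY lower(programme_name)
    HAVING count(*) > 1
""").fetchall()
print(rows)
```

Only the "Alvin" pair survives the `HAVING` filter; the lone "The Wolfman" row is dropped.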
  5. Yeah, I know... I've also tried the query on the list provided... but I am wondering how many false positives (if any) are returned from the entire 8000. The soundex on the Addams family string is still very close.
  6. The field in MSSQL may have been set to fill with #'s - just update all those values to zero if the value is supposed to be zero.
  7. Does this return more title duplicates than just similar titles? SELECT group_concat(programme_name SEPARATOR '\n') as title, count(*) AS occurs FROM movies GROUP BY SOUNDEX(programme_name) HAVING occurs > 1
  8. Here is an option with two separate queries (MAKE A BACKUP FIRST). The first creates a string that contains a set of all the unique ids and the second removes all the record ids that do not have a match in the set - this way one copy of each duplicate is still kept. (IMPORTANT: all the records will be removed if @ids is empty when the DELETE query is executed) SELECT group_concat(id SEPARATOR ',') AS ids FROM ( SELECT id FROM creature GROUP BY concat_ws(',', position_x, position_y, position_z) ) AS derived INTO @ids; DELETE FROM creature WHERE NOT find_in_set(id, @ids);
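The `@ids` variable and `find_in_set()` are MySQL-specific, but the same keep-one-per-group delete can be sketched in sqlite3 with a `NOT IN` subquery instead (the `creature` rows are invented; this is an equivalent technique, not the post's exact query):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE creature
               (id INTEGER PRIMARY KEY, position_x REAL, position_y REAL, position_z REAL)""")
con.executemany("INSERT INTO creature VALUES (?, ?, ?, ?)", [
    (1, 0.0, 0.0, 0.0),
    (2, 0.0, 0.0, 0.0),   # duplicate position of id 1
    (3, 5.0, 5.0, 5.0),
])

# Keep the lowest id per distinct position; delete every other row.
con.execute("""
    DELETE FROM creature WHERE id NOT IN (
        SELECT min(id) FROM creature
        GROUP BY position_x, position_y, position_z
    )""")
remaining = [row[0] for row in con.execute("SELECT id FROM creature ORDER BY id")]
print(remaining)  # [1, 3]
```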
  9. Looks more like query browser (Excel) truncation than any encoded value... widen the columns! I do not know of any MySQL coding that results in #'s.
  10. Just in case those field names are not just examples, try this instead: SELECT * FROM main WHERE (`field`=1 || `field`=2) && `column`=3
  11. The above form of conditional syntax should work fine in MySQL though most use `OR` and `AND` - what results are you getting?
  12. The INSERT SELECT statement, something like: mysql_query("INSERT INTO new_table SELECT * FROM old_table");
  13. 1. open vim 2. open the text file 3. type : (typing the colon should drop vim's focus to the command line) 4. type %s/^\([0-9]*\)\s\(.*\)/('\1','\2'),/g 5. hit the enter key 6. swap the last comma in the text file with a semicolon 7. add the INSERT statement to the front of the entire string .... and you have yourself a SQL query.
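The same line-to-VALUES transformation the vim substitution performs can be reproduced with `re.sub` (the two sample rows and the `tbl_name (id, description)` target are illustrative, matching the example discussed later in the thread):

```python
import re

# Tab-separated "id<TAB>description" lines, as exported from the table.
raw = "1\tfirst item\n2\tsecond item"

# Same pattern as vim's %s/^\([0-9]*\)\s\(.*\)/('\1','\2'),/g, per line.
rows = re.sub(r"^([0-9]*)\s(.*)$", r"('\1','\2'),", raw, flags=re.M)

# Steps 6 and 7: trailing comma becomes a semicolon, INSERT goes in front.
sql = "INSERT INTO tbl_name (id, description) VALUES\n" + rows.rstrip(",") + ";"
print(sql)
```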
  14. If you are ever required to add new products, an INSERT INTO `CommerceInsight_dbo`.`Item` SET `rowguid` = uuid() will probably be appropriate.
  15. Try something like this: SELECT concat_ws(' ', lname, fname) AS speaker_name FROM Speakers INNER JOIN Speakpres ON Speakpres.pres_id = $pres_id AND Speakers.speak_id = Speakpres.speak_id
  16. SELECT concat(position_x, position_y, position_z) , count(concat(position_x, position_y, position_z)) as duplicate FROM creature GROUP BY concat(position_x, position_y, position_z) HAVING count(concat(position_x, position_y, position_z)) > 1
  17. The INT signed range is -2147483648 to 2147483647, try a BIGINT instead with a signed range of -9223372036854775808 to 9223372036854775807
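Those limits are just the two's-complement bounds for 32-bit and 64-bit signed integers, which is quick to verify:

```python
# Signed n-bit range is -2**(n-1) .. 2**(n-1) - 1.
int_min, int_max = -2**31, 2**31 - 1          # MySQL INT
bigint_min, bigint_max = -2**63, 2**63 - 1    # MySQL BIGINT

print(int_min, int_max)        # -2147483648 2147483647
print(bigint_min, bigint_max)  # -9223372036854775808 9223372036854775807
```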
  18. I did not know that a function could be used as a default value. btw, large table!
  19. I suppose so.. but I am not highly experienced with csv import/exporting. Besides, opening the file in vim and executing the regexp: %s/^\([0-9]*\)\s\(.*\)/('\1','\2'),/g then replacing the last comma with a semicolon, appending the entire string with INSERT INTO tbl_name (id, description) VALUES would result in a SQL query that you could execute.
  20. Using joins and aliases something like this: SELECT Message_Text , Author.Username , Receiver.Username FROM Table_2 INNER JOIN Table_3 ON Table_2.Message_ID = Table_3.Message_ID INNER JOIN Table_1 AS Author ON Table_3.Author_ID = Author.User_ID INNER JOIN Table_1 AS Receiver ON Table_3.Receiver_ID = Receiver.User_ID
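The double self-join (Table_1 joined twice under the Author and Receiver aliases) works the same way in sqlite3; the schemas and sample rows here are guessed from the column names in the post:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Table_1 (User_ID INTEGER, Username TEXT);
    CREATE TABLE Table_2 (Message_ID INTEGER, Message_Text TEXT);
    CREATE TABLE Table_3 (Message_ID INTEGER, Author_ID INTEGER, Receiver_ID INTEGER);
    INSERT INTO Table_1 VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO Table_2 VALUES (10, 'hello');
    INSERT INTO Table_3 VALUES (10, 1, 2);
""")

# Table_1 is joined twice, once per alias, to resolve both user ids to names.
rows = con.execute("""
    SELECT Message_Text, Author.Username, Receiver.Username
    FROM Table_2
    INNER JOIN Table_3 ON Table_2.Message_ID = Table_3.Message_ID
    INNER JOIN Table_1 AS Author   ON Table_3.Author_ID   = Author.User_ID
    INNER JOIN Table_1 AS Receiver ON Table_3.Receiver_ID = Receiver.User_ID
""").fetchall()
print(rows)  # [('hello', 'alice', 'bob')]
```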
  21. I am not willing to write a quick tutorial on phpMyAdmin... there is probably information already available somewhere, or perhaps someone else here that actually uses phpMyAdmin may be of more use.
  22. Here, this query returns all the duplicates, triplicates... etc SELECT concat(position_x, position_y, position_z) , count(*) AS duplicate FROM creature GROUP BY concat(position_x, position_y, position_z) HAVING count(*) > 1
  23. Use a regular expression, something like: SELECT * FROM user WHERE usr_name REGEXP '^[^a-zA-Z]' ... which matches any usr_name whose first character is not a letter.
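The `'^[^a-zA-Z]'` pattern behaves the same in Python's `re` module, so the filter can be previewed outside the database (the sample names are invented):

```python
import re

# Same pattern as the SQL REGEXP: first character is not a letter.
names = ["alice", "9lives", "_tmp", "Bob"]
non_alpha = [n for n in names if re.match(r"^[^a-zA-Z]", n)]
print(non_alpha)  # ['9lives', '_tmp']
```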
  24. Hmm, maybe a select concat(a, c, e) and group by concat(a, c, e) in a subquery returning the id to use in a delete.