0o0o0

Everything posted by 0o0o0

  1. I am using that, but also with DISTINCT and MIN or MAX, and I'm not sure I completely understand DISTINCT. Here's an example: I want to group by manufacturer and have the most current date shown, distinct by date. The dates come in from a CSV file as text and are converted as shown below; the conversion works, and the column is entrydate, which is set to DATE in the table.

         $sql = "INSERT INTO TMP
                 SELECT DISTINCT Bike_manufacture,
                        MAX(STR_TO_DATE(entrydate, '%m/%d/%Y')) AS entrydate,
                        blah, blah_blah, bleh_bleh_bleh
                 FROM STREETB
                 GROUP BY (Bike_manufacture)";
         mysql_query($sql);

     It converts a date coming in as 08/08/2008 to 2008-08-08 fine. But MIN or MAX only grabs the lower of two rows that share the same Bike_manufacture and differ only in the newer entry's values: given yamaha 2008-08-08 and yamaha 2008-10-10, it grabs 2008-08-08 every time. Is that because SQL isn't converting the date properly going into the table, so it always grabs the older one? And can MAX(STR_TO_DATE(entrydate, '%m/%d/%Y')) be used on the fly while transferring into the TMP table (see the sketch below)? To anyone who can help: it's now been a three-week trial-and-error process, so thanks to anyone who can suggest anything to get me unstuck.
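     A minimal sketch of one way out, assuming STREETB.entrydate still holds the raw mm/dd/yyyy text, TMP.entrydate is a DATE, and an open mysql_* connection; the placeholder columns (blah, blah_blah, bleh_bleh_bleh) are kept from the post. MAX(STR_TO_DATE(...)) does work on the fly, but the non-aggregated columns in a GROUP BY come from an arbitrary row of each group, not from the row holding the MAX date, so joining the per-manufacturer maximum back to the table is what keeps the whole newest row together:

         // Per-manufacturer max date, joined back to pull the full matching row.
         $sql = "INSERT INTO TMP
                 SELECT s.Bike_manufacture,
                        STR_TO_DATE(s.entrydate, '%m/%d/%Y') AS entrydate,
                        s.blah, s.blah_blah, s.bleh_bleh_bleh
                 FROM STREETB s
                 JOIN (SELECT Bike_manufacture,
                              MAX(STR_TO_DATE(entrydate, '%m/%d/%Y')) AS maxdate
                       FROM STREETB
                       GROUP BY Bike_manufacture) m
                   ON m.Bike_manufacture = s.Bike_manufacture
                  AND STR_TO_DATE(s.entrydate, '%m/%d/%Y') = m.maxdate";
         mysql_query($sql) or die('Query failed: ' . mysql_error());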
  2. I need to take 12/07/2008 in the date column, while importing to a table, and make it 12072008. I'm trying to figure out the substring usage for this, something like:

         SELECT DISTINCT MIN(substring(stripslashes(date))) AS date

     That was an attempt, lol, not much of a good one; I've tried several all night, and it's now 7am and I can't figure it out. Plain SELECT DISTINCT MIN(date) AS date won't work, because the date isn't a timestamp on the table, it's text like 12/07/2008. What I'm doing is going from one table to a temp table and trying to strip the /'s off the dates in the date column on the fly, before the data goes into the temp table. Is it possible (see the sketch below)? Thanks.
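     A minimal sketch, assuming a text `date` column holding mm/dd/yyyy and an open mysql_* connection; SOURCE_TABLE and TMP_TABLE are made-up names. stripslashes() is a PHP function and can't run inside a query, but SQL's own REPLACE() strips the slashes on the fly:

         // REPLACE() turns '12/07/2008' into '12072008' as the rows are copied.
         $sql = "INSERT INTO TMP_TABLE (`date`)
                 SELECT REPLACE(`date`, '/', '') FROM SOURCE_TABLE";
         mysql_query($sql) or die('Query failed: ' . mysql_error());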
  3. I'm assuming a tmp table is needed, so I made one. OK, so (NFL as the example again): I made it possible to import the CSV file into the NFL table, then delete all rows for other players except Brett Farve, Dan Marino, and, say, Payton. So now the NFL table has just three rows: Farve, Marino, and Payton.

     Let's assume Marino still plays but will retire tomorrow, and we want his LAST csv data to stick. Tomorrow cron runs and brings in the new CSV, and Dan Marino isn't in the data file anymore, so today's info for Marino stays in the db as his last. The process runs, and the NEW Payton and Farve data comes in. Now here's the part: I have today's Payton, Farve, and Marino rows plus tomorrow's in the db, so five rows, and those five rows have timestamps. I want to delete duplicates by player, dropping the older of the two timestamps: so with Payton 10:00pm tomorrow and Payton 10:00pm today, today's drops out of the db and tomorrow's stays, all the while keeping Marino's row, because it is not a duplicate and will remain in the db forever as his last. Get me?

     Basically: how do I delete duplicate rows by PLAYER_NAME, keeping only the newest of the timestamps (see the sketch below)? Sorry for the long example.
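     A minimal sketch, assuming an NFL table with PLAYER_NAME and a ts timestamp column (the timestamp column name is a guess) and an open mysql_* connection. The self-join deletes any row for which a newer row exists for the same player, so Marino's lone row is never matched and survives:

         // For each player, drop every row that has a newer twin.
         $sql = "DELETE a FROM NFL a
                 JOIN NFL b
                   ON b.PLAYER_NAME = a.PLAYER_NAME
                  AND b.ts > a.ts";
         mysql_query($sql) or die('Query failed: ' . mysql_error());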
  4. OK, I've got to ask one more question and I'm done for the weekend. How is it that when loading a CSV file into the db, it overwrites the info already in the db? I'd like it to just add to the end of what's already there.

         $sql = 'LOAD DATA LOCAL INFILE \'/home/sites/www.mywebsite.com/data/data.csv\'
                 INTO TABLE `AUTO`
                 FIELDS TERMINATED BY \',\' ENCLOSED BY \'"\' ESCAPED BY \'\\\\\'
                 LINES TERMINATED BY \'\\r\\n\'';
         $result = mysql_query($sql) or die('Query failed: ' . mysql_error());

     What am I missing, or what do I need to take out, so the import gets appended to the end of the db (see the sketch below)? Thanks.
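     A hedged note: LOAD DATA INFILE appends by default, so the statement above shouldn't be overwriting anything by itself; if rows are vanishing, the usual suspects are a DELETE or TRUNCATE earlier in the same script, or the REPLACE keyword, which swaps out existing rows that collide on a unique key. A minimal sketch of an explicitly append-only import (same file and table as the post, assuming an open mysql_* connection):

         // IGNORE skips incoming rows that collide with an existing unique
         // key instead of replacing them; with no unique key it appends all.
         $sql = "LOAD DATA LOCAL INFILE '/home/sites/www.mywebsite.com/data/data.csv'
                 IGNORE INTO TABLE `AUTO`
                 FIELDS TERMINATED BY ',' ENCLOSED BY '\"' ESCAPED BY '\\\\'
                 LINES TERMINATED BY '\\r\\n'";
         mysql_query($sql) or die('Query failed: ' . mysql_error());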
  5. omg, you're right, ha ha! Me and you both... my eyes are so bloodshot; I've been searching for how to do this for 24+ hours straight. I can sleep now. Thanks a lot.
  6. I'm lost... it doesn't seem to want to work, but it works in the SQL tester console.
  7. OK, now I'm getting peeved. I went into the SQL console tester thing in phpMyAdmin and entered DELETE FROM MYTABLE WHERE player NOT IN('Farve'). It ran and successfully did it, so I asked it to show me the exact PHP code, and it gave me this:

         $sql = 'DELETE FROM MYTABLE WHERE player NOT IN(\'Farve\')';

     So I threw that into my PHP code and ran it from my PHP page, refilled the db with data again, and tried it, and it will not trim it down to the Farve rows (again, NFL is just an example). Any idea? Here's the page:

         <?php
         include("../lalalalala.php");
         $connection = mysql_connect($host,$user,$password)
             or die ("couldn't connect to server");
         $db = mysql_select_db($database,$connection)
             or die ("Couldn't select database");
         $sql = 'DELETE FROM MYTABLE WHERE player NOT IN(\'Farve\')';
         mysql_close($connection);
         echo 'Test';
         ?>
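     A guess at the culprit, judging from the snippet above: the query string is built but never actually sent to the server, which would explain why the console version works and the page version doesn't. A minimal sketch of the same page with the missing call added:

         <?php
         include("../lalalalala.php");
         $connection = mysql_connect($host, $user, $password)
             or die("couldn't connect to server");
         $db = mysql_select_db($database, $connection)
             or die("Couldn't select database");

         $sql = 'DELETE FROM MYTABLE WHERE player NOT IN(\'Farve\')';
         // The missing step: run the query before closing the connection.
         mysql_query($sql) or die('Query failed: ' . mysql_error());

         mysql_close($connection);
         echo 'Test';
         ?>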
  8. lol, this forum rocks; you definitely live up to the name. I post a question and the answers are there in minutes! Freaks. How about this: can you stop duplicates as they come into the db, or only after the fact by making a new temp table (see the sketch below)? Thanks again.
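     A minimal sketch of stopping duplicates at the door, assuming a player_name column defines what counts as a duplicate (that column name is a guess) and an open mysql_* connection. A UNIQUE index makes the server reject the duplicate, and INSERT IGNORE turns that rejection into a silent skip instead of an error:

         // One-time setup: declare which column(s) must be unique.
         $sql = "ALTER TABLE NFL ADD UNIQUE KEY uniq_player (player_name)";
         mysql_query($sql) or die('Query failed: ' . mysql_error());

         // From then on, duplicate rows are skipped as they arrive.
         $sql = "INSERT IGNORE INTO NFL (player_name) VALUES ('Farve')";
         mysql_query($sql) or die('Query failed: ' . mysql_error());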
  9. lol, you guys are funny... thanks. Will these delete all the other players' rows and leave me with only the Marino and Farve rows in the db? Correct, right? It'll be in a cron script, so it's doing this every day; if it were just once I would do it manually. (It's not even a football db, I'm just using that to explain better what I want to do.)

     WHOA, now an even better question: importing from a CSV file, is there a possible way to rip only these two players out of the CSV before inserting into the db (see the sketch below)? I doubt it, but thought I'd ask.
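     A minimal sketch of that filter-before-insert idea, assuming an open mysql_* connection; NFL_STAGE and the player column are illustrative, and the CSV path is a placeholder. LOAD DATA has no WHERE clause, so a common workaround is loading into a scratch table first and copying across only the wanted rows:

         // 1) Scratch table with the same structure as the real one.
         $sql = "CREATE TEMPORARY TABLE NFL_STAGE LIKE NFL";
         mysql_query($sql) or die('Query failed: ' . mysql_error());

         // 2) Load the whole CSV into the scratch table.
         $sql = "LOAD DATA LOCAL INFILE '/path/to/players.csv'
                 INTO TABLE NFL_STAGE
                 FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\r\\n'";
         mysql_query($sql) or die('Query failed: ' . mysql_error());

         // 3) Keep only the two players when copying to the real table.
         $sql = "INSERT INTO NFL
                 SELECT * FROM NFL_STAGE WHERE player IN ('Farve', 'Marino')";
         mysql_query($sql) or die('Query failed: ' . mysql_error());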
  10. Deleting all the info in a database but leaving selected rows in (repeated every day). E.g., using football, say:

          DELETE * FROM NFL except where player = Brett Farve and player = Dan Marino

      Is there a command like "except where" (see the sketch below)? Anyone who can help would be awesome; I've applied to too many forums this week, lol. Thanks.
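      A minimal sketch: there is no "except where", but WHERE ... NOT IN expresses the same thing (the player column name comes from the post's pseudo-query; assumes an open mysql_* connection):

          // Delete everything except the two listed players.
          $sql = "DELETE FROM NFL
                  WHERE player NOT IN ('Brett Farve', 'Dan Marino')";
          mysql_query($sql) or die('Query failed: ' . mysql_error());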