Everything posted by MasterACE14

  1. Thanks Barand for the reply. I'm not sure what I'm doing wrong; when I enter the query you provided as-is, I get an error. Also, yes, it would need to be a LEFT JOIN, as not every record will have a matching record (although the majority will). I just want to confirm that this is the LEFT JOIN edit you were referring to?

     SELECT DATE_FORMAT(STR_TO_DATE(msl.date, '%m/%d/%Y %k:%i'), '%e/%m/%Y %k:%i') as time
          , hr.Value as heartrate
          , mi.Intensity
          , msl.Value as minuteSleep
          , msl.LogId
          , mst.steps
     FROM minuteSleep msl
          LEFT JOIN heartrate hr ON msl.date = hr.Time
          LEFT JOIN minuteintensities mi ON mi.ActivityMinute = msl.date
          LEFT JOIN minuteSteps mst ON mst.ActivityMinute = msl.date

     EDIT: Never mind, the above query with the LEFT JOINs worked. Some random character had appeared at the very end of the query. It is returning 27,341 records, which sounds about right. Thank you kindly
  2. Hi All, I have 4 tables, each with only 2 columns (except for one that has 3), and they all have a date/time field in common. I am trying to export all 4 tables together as one CSV file, with the records matched together where they have the same date/time. My tables are structured like so (they originally came from 4 individual CSV files): So as an exported CSV, I would want the matching records (based on the date/time) to be part of the same record, like so:

     Time, heartrateValue, Intensity, minutesleepValue, minutesleepLogId, steps
     3/01/2018 0:01, 45, 0, 1, 17396451215, 0
     3/01/2018 0:02, 45, 0, 1, 17396451215, 0
     etc.

     A secondary issue is changing the date format to D/MM/YYYY from M/DD/YYYY. However, if I can get this CSV export working, I can probably just loop through it in PHP and fix up the date format. Thank you kindly for your assistance
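Once the joined rows are in hand, the CSV-writing step plus the date-format fix could be sketched in PHP like this. This is a minimal sketch, not the poster's actual code: the row shape and column names mirror the sample above, and the in-memory stream stands in for a real output file.

```php
<?php
// Hypothetical joined rows, shaped like the sample output above.
$rows = [
    ['3/01/2018 0:01', 45, 0, 1, '17396451215', 0],
    ['3/01/2018 0:02', 45, 0, 1, '17396451215', 0],
];
$header = ['Time', 'heartrateValue', 'Intensity', 'minutesleepValue', 'minutesleepLogId', 'steps'];

$out = fopen('php://memory', 'w+');
fputcsv($out, $header, ',', '"', '');
foreach ($rows as $row) {
    // Convert M/DD/YYYY H:i to D/MM/YYYY H:i.
    $dt = DateTime::createFromFormat('n/j/Y G:i', $row[0]);
    $row[0] = $dt->format('j/m/Y G:i');
    fputcsv($out, $row, ',', '"', '');
}
rewind($out);
$csv = stream_get_contents($out);
fclose($out);
echo $csv;
```

For a real export, `php://memory` would be replaced with the target file path.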
  3. That's a fair point, I might just import it into a MySQL database and generate it that way. Thanks!
  4. I personally would have opted for a database however my organisation is after a master CSV file that can be generated from the few CSV files they already have (and I'm trying to make it generic enough that more CSVs can be added if needed and a new master CSV file generated). And yes 00:00:00 values would still be recorded.
  5. I am open to ideas on reducing the redundancies. My "test" data may not have been the best example. The data my organisation has is mostly a bunch of CSV files with hundreds of thousands of records, mostly in just 2 columns per file. So each file essentially has a timestamp column and a column with some kind of data, which might be step counts, duration, etc. I'm looking to merge them all together on the timestamps.
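The merge described here can be sketched by keying every record on its timestamp, so matching rows from different files line up automatically. The file names, column names, and inlined contents below are made up for illustration; real code would read the files from disk.

```php
<?php
// Each source CSV has a timestamp column plus one value column.
// Inline strings stand in for the real files.
$csvFiles = [
    'steps'     => "Time,Steps\n3/01/2018 0:01,12\n3/01/2018 0:02,0\n",
    'heartrate' => "Time,Value\n3/01/2018 0:01,45\n3/01/2018 0:03,47\n",
];

$merged = [];
foreach ($csvFiles as $name => $contents) {
    $lines = array_map(
        fn($line) => str_getcsv($line, ',', '"', ''),
        explode("\n", trim($contents))
    );
    array_shift($lines); // drop the header row
    foreach ($lines as [$time, $value]) {
        // Key every record by its timestamp so matching rows line up.
        $merged[$time][$name] = $value;
    }
}

// Rows with no match in some file get a blank in that column.
foreach ($merged as $time => $row) {
    $merged[$time] = array_merge(array_fill_keys(array_keys($csvFiles), ''), $row);
}
```

Adding another CSV is then just another entry in `$csvFiles`, which keeps the approach generic.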
  6. array(9) {
       [1]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "20:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [2]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "18:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [3]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "00:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [4]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "45:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [5]=> array(4) { ["First Name"]=> string(6) "Rachel" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "7-Dec-82" ["Sanity %Age"]=> string(4) "100%" ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "03:00" ["userID"]=> string(1) "3" [""]=> string(0) "" }
       [6]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [7]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "27:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [8]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "50:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
     }

     One of the userID keys would be removed, as it wouldn't make sense to have it there twice.
  7. Hi All, I have written some code that reads multiple CSV files into a multidimensional array, and I am now trying to produce a new array where the values for a particular key match across the arrays. Then I would output a new CSV file built from the new array. Currently I have the following code:

     <?php
     $path = 'data/';

     # CSV file names passed via the URL, colon-separated
     $files = explode(':', $_GET['filename']);
     $CSV_Files_Count = count($files);

     $MasterArray = array();

     # Loop through the multiple CSV files
     for ($i = 0; $i < $CSV_Files_Count; $i++) {
         if (file_exists($path . $files[$i])) {
             # read the CSV file into an associative array,
             # using the first row as the keys
             $csv = array_map('str_getcsv', file($path . $files[$i]));
             array_walk($csv, function (&$a) use ($csv) {
                 $a = array_combine($csv[0], $a);
             });
             array_shift($csv); # remove the column header row
             $MasterArray[$i] = $csv;
         }
     }
     ?>

     If I var_dump($csv) I get the following output, which is reading my CSV test files perfectly:

     array(3) {
       [0]=> array(7) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["userID"]=> string(1) "1" ["Sanity %Age"]=> string(3) "32%" }
       [1]=> array(7) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["userID"]=> string(1) "2" ["Sanity %Age"]=> string(3) "95%" }
       [2]=> array(7) { ["First Name"]=> string(6) "Rachel" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "7-Dec-82" ["userID"]=> string(1) "3" ["Sanity %Age"]=> string(4) "100%" }
     }
     array(9) {
       [0]=> array(4) { ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [1]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "20:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [2]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "18:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [3]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "00:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [4]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "45:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [5]=> array(4) { ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "03:00" ["userID"]=> string(1) "3" [""]=> string(0) "" }
       [6]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [7]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "27:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [8]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "50:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
     }

     So I want to create a master array, which I would then output as a new CSV file, where elements from the second array are appended onto the first array where the userIDs match, and elements without any matching userID still remain in the array. Thank you kindly in advance
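One way to build that master array is to index the first array by userID and then merge row by row. The sketch below uses small made-up stand-ins for the two parsed arrays (only a couple of columns each) rather than the full data shown above.

```php
<?php
// Minimal stand-ins for the two parsed CSV arrays.
$people = [
    ['First Name' => 'Mark',   'userID' => '1'],
    ['First Name' => 'Toni',   'userID' => '2'],
    ['First Name' => 'Rachel', 'userID' => '3'],
];
$visits = [
    ['Location' => 'bathroom', 'duration' => '34:00', 'userID' => '1'],
    ['Location' => 'kitchen',  'duration' => '20:00', 'userID' => '2'],
    ['Location' => 'attic',    'duration' => '05:00', 'userID' => '9'], // no match
];

// Index the first array by userID for constant-time lookups.
$byId = array_column($people, null, 'userID');

$master = [];
foreach ($visits as $visit) {
    $person = $byId[$visit['userID']] ?? [];
    // array_merge keeps a single userID key (the later value wins),
    // and a visit with no matching person still produces a row.
    $master[] = array_merge($person, $visit);
}
```

Each `$master` row can then be written out with fputcsv to produce the new CSV file.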
  8. Living in Australia I've been with Ausweb for my domains since 2006 and never had an issue. http://domains.ausweb.com.au/ If you're after something more local, I have no idea lol
  9. $query = "SELECT * FROM A_record WHERE A_dte <= '{$dte90}'";
  10. Yes, but only 1 set, not 2. If you look at your source, you have 2 sets of double quotes: <img src=""http://auntievics.com/assets/WayahsWoofburgers.jpg"" /> Copy and paste that line from PFMaBiSmAd and it should work.
  11. Use that; you've added extra double quotes around $row['product_pic'] that you don't need.
  12. ah I've got it working with your INNER JOIN suggestion. Made a really stupid mistake, the last 2 records in my sample result set above had the wrong poster_id. My apologies. Thanks for your help! Cheers, Ace
  13. You've put parentheses around product_id instead of curly braces. It should be:

      echo "<tr><td width=\"400px\"><img src=\"{$row['product_id']}\" />{$row['product_title']}.<br />{$row['product_Description']}.<br />{$row['product_price']}</td></tr>";
  14. You need to add 'FROM table_name' to the query; you currently have incorrect syntax (see SELECT). Or this method is much better if that is all you're trying to achieve.
  15. SELECT DATE_ADD(NOW(), INTERVAL 90 DAY) AS variablename
  16. If they were correct, your queries would be working. Have you tried running your queries in phpMyAdmin?
  17. Okay, your last loop should be:

      echo "<a href=\"/{$i}\">{$i}</a> ";

      Your line 29 should be:

      $LimitValue = $page * $LIMIT - ($LIMIT);

      You have a character there that isn't what it should be. I'm not even sure how you got that character, lol, but it's acting like a hyphen (a string character) rather than a minus sign. And this has an extra curly brace in it:

      if ($page <= 0) { $page = 1; } } else { $page = 1; }

      It should be:

      if ($page <= 0) { $page = 1; } else { $page = 1; }
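For what it's worth, the offset arithmetic on that line 29 is equivalent to ($page - 1) * $LIMIT. A minimal sketch (the $LIMIT value of 10 is just an example):

```php
<?php
// LIMIT-based pagination offsets: page 1 starts at row 0,
// page 2 at row $LIMIT, and so on.
$LIMIT = 10;

function limitValue(int $page, int $limit): int
{
    return $page * $limit - $limit; // same as ($page - 1) * $limit
}

$offsets = [];
foreach ([1, 2, 3] as $page) {
    $offsets[$page] = limitValue($page, $LIMIT);
}
```

The computed offset is what goes into the query's LIMIT offset, count clause.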
  18. I think one (maybe more) of the values in your WHERE clause isn't what it should be. Echo them out.
  19. echo"<tr><td width=\"400px\"><img src="{$row['product_pic']}" />.
  20. I can't see anything else wrong with it. Is that the entire file? And can you copy/paste the full error message?
  21. Is a MySQL error coming up?
  22. I won't do your homework for you, but I can point you to the relevant information.
      - You will obviously need to know how to create an HTML form which has a 'dropdown' box, 'hidden' inputs and a 'submit' button.
      - You will also need to know how to use Sessions in PHP to store guesses and attempts.
      - Here's the Rand() function if you're curious.
      - You may want to use a GET variable for your 'start over' link.
      - You will also need to know how If/Else conditional statements or Switch statements work, to check for correct/incorrect guesses.
      - Lastly, you could probably use a While loop to loop through the guesses.
      Good Luck!
  23. I'm not sure what's happened here, but this line isn't right (it contains a stray character that looks correct but isn't): if ($page == ceil($NumOfPages) && $page != 1) { Retype it by hand and it should work.
  24. This:

      $db_password = $row['password'] + $row['key'];

      should be this:

      $db_password = $row['password'] . $row['key'];

      combining the strings, rather than trying to 'add' them.
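A quick illustration of the difference, since PHP's '+' coerces numeric-looking strings to numbers while '.' always concatenates (the variable values here are made up):

```php
<?php
// '.' joins strings; '+' converts numeric strings to numbers and adds them.
$password = '123';
$key      = '456';

$concatenated = $password . $key; // string '123456'
$added        = $password + $key; // int 579
```

Note that if either operand isn't a numeric string, '+' throws a TypeError in PHP 8, whereas '.' still concatenates.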