
MasterACE14

Members
  • Content Count

    2,687
  • Joined

  • Last visited

Community Reputation

0 Neutral

About MasterACE14

  • Rank
    Prolific Member
  • Birthday 07/21/1992

Profile Information

  • Gender
    Male
  • Location
    Sydney, Australia
  1. Thanks Barand for the reply. I'm not sure what I'm doing wrong. When I enter the query you provided as-is I get: Also, yes, it would need to be a left join, as not every record will have a matching record (although the majority will). I just want to confirm that this is the left join edit you were referring to?
     SELECT DATE_FORMAT(STR_TO_DATE(msl.date, '%m/%d/%Y %k:%i'), '%e/%m/%Y %k:%i') as time
          , hr.Value as heartrate
          , mi.Intensity
          , msl.Value as minuteSleep
          , msl.LogId
          , mst.steps
     FROM minuteSleep msl
     LEFT JOIN heartrate hr ON msl.date = hr.Time
     LEFT JOIN minuteintensities mi ON mi.ActivityMinute = msl.date
     LEFT JOIN minuteSteps mst ON mst.ActivityMinute = msl.date
     EDIT: Never mind, the above query with the LEFT JOINs worked. Some random character had appeared at the very end of the query. It is returning 27,341 records, which sounds about right. Thank you kindly.
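     For reference, a minimal sketch of turning that joined result into the master CSV file from PHP, assuming a PDO connection; the DSN, credentials and the master.csv file name are placeholders, not details from the thread:
     <?php
     // Minimal sketch: run the joined query above and write each row out with fputcsv().
     // The DSN, credentials and 'master.csv' name are assumptions for illustration.
     $pdo = new PDO('mysql:host=localhost;dbname=fitbit_data;charset=utf8mb4', 'user', 'pass');

     $sql = "SELECT DATE_FORMAT(STR_TO_DATE(msl.date, '%m/%d/%Y %k:%i'), '%e/%m/%Y %k:%i') AS time,
                    hr.Value AS heartrate, mi.Intensity, msl.Value AS minuteSleep, msl.LogId, mst.steps
             FROM minuteSleep msl
             LEFT JOIN heartrate hr ON msl.date = hr.Time
             LEFT JOIN minuteintensities mi ON mi.ActivityMinute = msl.date
             LEFT JOIN minuteSteps mst ON mst.ActivityMinute = msl.date";

     $out = fopen('master.csv', 'w');
     fputcsv($out, ['Time', 'heartrateValue', 'Intensity', 'minutesleepValue', 'minutesleepLogId', 'steps']);
     foreach ($pdo->query($sql, PDO::FETCH_NUM) as $row) {
         fputcsv($out, $row); // one CSV line per joined record
     }
     fclose($out);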
  2. Hi All, I have 4 tables, each with only 2 columns (except for one that has 3), and they all have a date/time field in common. I am trying to export all 4 tables together as one CSV file, with the records matched together where they have the same date/time. My tables are structured like so (they originally came from 4 individual CSV files): So as an exported CSV I would want the matching records (based on the date/time) to be part of the same record, like so:
     Time, heartrateValue, Intensity, minutesleepValue, minutesleepLogId, steps
     3/01/2018 0:01, 45, 0, 1, 17396451215, 0
     3/01/2018 0:02, 45, 0, 1, 17396451215, 0
     etc.
     A secondary issue is changing the date format to D/MM/YYYY from M/DD/YYYY. However, if I can get this CSV export working I can probably just loop through it in PHP and fix up the date format. Thank you kindly for your assistance.
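     On the secondary date-format point, a small sketch of the "fix it up in PHP" approach mentioned above; the helper name is purely illustrative:
     <?php
     // Illustrative helper (not from the thread): convert an M/DD/YYYY H:MM value
     // such as "1/03/2018 0:01" into D/MM/YYYY H:MM ("3/01/2018 0:01").
     function reformatStamp(string $value): string
     {
         $dt = DateTime::createFromFormat('n/d/Y G:i', $value);
         return $dt === false ? $value : $dt->format('j/m/Y G:i');
     }

     echo reformatStamp('1/03/2018 0:01'); // prints 3/01/2018 0:01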
  3. That's a fair point, I might just import it into a MySQL database and generate it that way. Thanks!
  4. I personally would have opted for a database; however, my organisation is after a master CSV file that can be generated from the few CSV files they already have (and I'm trying to make it generic enough that more CSVs can be added if needed and a new master CSV file generated). And yes, 00:00:00 values would still be recorded.
  5. I am open to ideas on reducing the redundancies. My "test" data may not have been the best example, but the data my organisation has is mostly a bunch of CSV files with hundreds of thousands of records, mostly in just 2 columns per file. So each file essentially has a timestamp column and a column with some kind of data, which might be step counts, duration, etc. I'm looking to merge them all together on the timestamps.
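     A rough sketch of that timestamp merge, kept generic so extra CSVs can be dropped in later; the file names are placeholders, each input is assumed to have its timestamp in the first column, and the remaining column names are assumed unique across files:
     <?php
     // Rough sketch under the assumptions stated above.
     $inputs = ['heartrate.csv', 'minuteIntensities.csv', 'minuteSleep.csv', 'minuteSteps.csv'];

     $headers = ['Time'];
     $merged  = [];                                    // timestamp => [column => value]

     foreach ($inputs as $file) {
         $handle   = fopen($file, 'r');
         $cols     = fgetcsv($handle);                 // header row
         $dataCols = array_slice($cols, 1);            // everything after the timestamp
         $headers  = array_merge($headers, $dataCols);

         while (($row = fgetcsv($handle)) !== false) {
             foreach ($dataCols as $i => $name) {
                 $merged[$row[0]][$name] = $row[$i + 1] ?? '';
             }
         }
         fclose($handle);
     }

     // Write the master CSV, leaving blanks where a timestamp had no match in a file.
     $out = fopen('master.csv', 'w');
     fputcsv($out, $headers);
     foreach ($merged as $time => $values) {
         $line = [$time];
         foreach (array_slice($headers, 1) as $name) {
             $line[] = $values[$name] ?? '';
         }
         fputcsv($out, $line);
     }
     fclose($out);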
  6. array(9) {
       [1]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "20:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [2]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "18:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [3]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "00:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [4]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "45:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [5]=> array(4) { ["First Name"]=> string(6) "Rachel" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "7-Dec-82" ["Sanity %Age"]=> string(4) "100%" ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "03:00" ["userID"]=> string(1) "3" [""]=> string(0) "" }
       [6]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [7]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "27:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [8]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "50:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
     }
     One of the userID keys would be removed, as it wouldn't make sense to have it there twice.
  7. Hi All, I have written some code that currently reads multiple CSV files into a multidimensional array, and I am now trying to produce a new array where the values for a particular key match across the arrays. Then I would output a new CSV file built from the new array. Currently I have the following code:
     <?php
     $file = '';
     $path = 'data/';
     # CSV file from URL
     /* if(isset($_GET['filename'])) { */
     $files = explode(':', $_GET['filename']);
     $CSV_Files_Count = count($files);
     echo $CSV_Files_Count;
     //$file = 'data/' . strip_tags($_GET['filename'] . '.csv');
     echo var_dump($files);
     $NewArray = array();
     # Loop through multiple CSV files
     for($i = 0; $i < $CSV_Files_Count; $i++) {
         if(file_exists($path . $files[$i])) {
             # read CSV file into an associative array
             $csv = array_map('str_getcsv', file($path . $files[$i]));
             array_walk($csv, function(&$a) use ($csv) { $a = array_combine($csv[0], $a); });
             array_shift($csv); # remove column header
             $MasterArray[$i] = $csv;
             /* echo '<pre>'; var_dump($csv); echo '</pre>'; echo '<pre>'; echo $path . $files[$i]; echo '</pre>'; */
             /* echo '<pre>'; var_dump($MasterArray[$i]); echo '</pre>'; */
         }
     }
     ?>
     If I var_dump($csv) I get the following output, which is reading my CSV test files perfectly:
     array(3) {
       [0]=> array(7) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["userID"]=> string(1) "1" ["Sanity %Age"]=> string(3) "32%" }
       [1]=> array(7) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["userID"]=> string(1) "2" ["Sanity %Age"]=> string(3) "95%" }
       [2]=> array(7) { ["First Name"]=> string(6) "Rachel" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "7-Dec-82" ["userID"]=> string(1) "3" ["Sanity %Age"]=> string(4) "100%" }
     }
     array(9) {
       [0]=> array(4) { ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [1]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "20:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [2]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "18:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [3]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "00:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [4]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "45:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [5]=> array(4) { ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "03:00" ["userID"]=> string(1) "3" [""]=> string(0) "" }
       [6]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [7]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "27:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [8]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "50:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
     }
     So I want to create a master array which I would then output as a new CSV file, where elements from the second array are appended onto the first array when the userIDs match, and elements without any matching userID still remain in the array. Thank you kindly in advance.
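     One possible continuation of the code above (a sketch, not the thread's accepted answer): index the first file's rows by userID, then append those fields onto each row of the second file, keeping rows with no match as they are. It assumes $MasterArray holds the two parsed files in the order shown in the var_dump output.
     <?php
     // Continues from the $MasterArray built above; assumptions noted in the lead-in.
     $peopleByUser = [];
     foreach ($MasterArray[0] as $row) {
         $peopleByUser[$row['userID']] = $row;         // index people by userID
     }

     $NewArray = [];
     foreach ($MasterArray[1] as $row) {
         $match = $peopleByUser[$row['userID']] ?? []; // empty if no matching userID
         // person columns first, then Location/duration; the duplicate userID key
         // collapses automatically because array keys are unique
         $NewArray[] = array_merge($match, $row);
     }
     Rows from the first file whose userID never appears in the second file would need one extra pass to be appended as well before writing $NewArray out with fputcsv().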
  8. Living in Australia I've been with Ausweb for my domains since 2006 and never had an issue. http://domains.ausweb.com.au/ If you're after something more local, I have no idea lol
  9. I've heard nothing but horror stories about them. They were great, until they booted me off for 'running PHP text games'... apparently that's in their terms of service.
  10. www.x10hosting.com are fantastic.