Everything posted by MasterACE14

  1. Thanks Barand for the reply. I'm not sure what I'm doing wrong. When I enter the query you provided as-is I get: Also, yes, it would need to be a left join, as not every record will have a matching record (although the majority will). I just want to confirm that this is the left join edit you were referring to?
     SELECT DATE_FORMAT(STR_TO_DATE(msl.date, '%m/%d/%Y %k:%i'), '%e/%m/%Y %k:%i') as time
          , hr.Value as heartrate
          , mi.Intensity
          , msl.Value as minuteSleep
          , msl.LogId
          , mst.steps
     FROM minuteSleep msl
          LEFT JOIN heartrate hr ON msl.date = hr.Time
          LEFT JOIN minuteintensities mi ON mi.ActivityMinute = msl.date
          LEFT JOIN minuteSteps mst ON mst.ActivityMinute = msl.date
     EDIT: Never mind, the above query with the LEFT JOINs worked. Some random character had appeared at the very end of the query. It is returning 27,341 records, which sounds about right. Thank you kindly
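     For the CSV export itself, something along these lines should work now that the join returns the right rows. This is only a minimal, untested sketch: the DSN, credentials and output filename ('fitbit', 'user', 'pass', 'merged.csv') are placeholders, not anything from the thread.
     <?php
     // Minimal sketch: run the joined query and stream every row into a CSV file.
     // Connection details and the output filename below are assumptions.
     $pdo = new PDO('mysql:host=localhost;dbname=fitbit;charset=utf8mb4', 'user', 'pass', [
         PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
         PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,
     ]);

     $sql = "SELECT DATE_FORMAT(STR_TO_DATE(msl.date, '%m/%d/%Y %k:%i'), '%e/%m/%Y %k:%i') AS time,
                    hr.Value AS heartrate, mi.Intensity, msl.Value AS minuteSleep, msl.LogId, mst.steps
             FROM minuteSleep msl
             LEFT JOIN heartrate hr         ON hr.Time = msl.date
             LEFT JOIN minuteintensities mi ON mi.ActivityMinute = msl.date
             LEFT JOIN minuteSteps mst      ON mst.ActivityMinute = msl.date";

     $out   = fopen('merged.csv', 'w');
     $first = true;
     foreach ($pdo->query($sql) as $row) {
         if ($first) {
             fputcsv($out, array_keys($row)); // header row taken from the column names
             $first = false;
         }
         fputcsv($out, $row);                 // one CSV line per joined record
     }
     fclose($out);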
  2. Hi All, I have 4 tables, each with only 2 columns (except for one that has 3), and they all have a date/time field in common. I am trying to export all 4 tables together as one CSV file and to have the records matched together where they have the same date/time. My tables are structured like so (which originally came from 4 individual CSV files): So as an exported CSV I would want the matching records (based on the date/time) to be part of the same record, like so:
     Time, heartrateValue, Intensity, minutesleepValue, minutesleepLogId, steps
     3/01/2018 0:01, 45, 0, 1, 17396451215, 0
     3/01/2018 0:02, 45, 0, 1, 17396451215, 0
     etc.
     A secondary issue is changing the date format to D/MM/YYYY from M/DD/YYYY. However, if I can get this CSV file export I can probably just loop through it in PHP and fix up the date format. Thank you kindly for your assistance
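     On the secondary date-format issue, if the export does end up being post-processed in PHP, DateTime::createFromFormat can do the conversion. Just a sketch, assuming the date sits in the first column; the filenames 'merged.csv' and 'merged_fixed.csv' are made up:
     <?php
     // Sketch only: rewrite the first CSV column from M/DD/YYYY H:MM to D/MM/YYYY H:MM.
     $in  = fopen('merged.csv', 'r');
     $out = fopen('merged_fixed.csv', 'w');

     while (($row = fgetcsv($in)) !== false) {
         $dt = DateTime::createFromFormat('n/d/Y G:i', $row[0]);   // e.g. "3/01/2018 0:01"
         if ($dt !== false) {                                      // leave the header / unparsable rows alone
             $row[0] = $dt->format('j/m/Y G:i');                   // e.g. "1/03/2018 0:01"
         }
         fputcsv($out, $row);
     }
     fclose($in);
     fclose($out);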
  3. That's a fair point; I might just import it into a MySQL database and generate it that way. Thanks!
  4. I personally would have opted for a database; however, my organisation is after a master CSV file that can be generated from the few CSV files they already have (and I'm trying to make it generic enough that more CSVs can be added if needed and a new master CSV file generated). And yes, 00:00:00 values would still be recorded.
  5. I am open to ideas on reducing the redundancies. My "test" data may not have been the best example, but the data my organisation has is mostly a bunch of CSV files with hundreds of thousands of records, mostly in just 2 columns per file. So each file essentially has a timestamp column and a column with some kind of data, which might be step counts, duration, etc. I'm looking to merge them all together on the timestamps.
  6. array(9) {
       [1]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "20:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [2]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "18:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [3]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "00:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [4]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "45:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [5]=> array(4) { ["First Name"]=> string(6) "Rachel" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "7-Dec-82" ["Sanity %Age"]=> string(4) "100%" ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "03:00" ["userID"]=> string(1) "3" [""]=> string(0) "" }
       [6]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [7]=> array(4) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["Sanity %Age"]=> string(3) "95%" ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "27:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [8]=> array(4) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["Sanity %Age"]=> string(3) "32%" ["Location"]=> string(5) "study" ["duration"]=> string(5) "50:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
     }
     One of the userID keys would be removed as it wouldn't make sense to have it there twice.
  7. Hi All, I have written some code that currently reads multiple CSV files into a multidimensional array, and I am now trying to produce a new array where the values for a particular key match across the arrays. Then I would output a new CSV file built from the new array. Currently I have the following code:
     <?php
     $file = '';
     $path = 'data/'; # CSV file(s) named in the URL

     $files = explode(':', $_GET['filename']);
     $CSV_Files_Count = count($files);
     echo $CSV_Files_Count;
     echo var_dump($files);

     $MasterArray = array(); # was $NewArray, but the loop below fills $MasterArray

     # Loop through multiple CSV files
     for ($i = 0; $i < $CSV_Files_Count; $i++) {
         if (file_exists($path . $files[$i])) {
             # read CSV file into an associative array keyed by the header row
             $csv = array_map('str_getcsv', file($path . $files[$i]));
             array_walk($csv, function (&$a) use ($csv) {
                 $a = array_combine($csv[0], $a);
             });
             array_shift($csv); # remove column header

             $MasterArray[$i] = $csv;
         }
     }
     ?>
     If I var_dump($csv) I get the following output, which is reading my CSV test files perfectly:
     array(3) {
       [0]=> array(7) { ["First Name"]=> string(4) "Mark" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "M" ["Date of Birth"]=> string(9) "19-Dec-60" ["userID"]=> string(1) "1" ["Sanity %Age"]=> string(3) "32%" }
       [1]=> array(7) { ["First Name"]=> string(4) "Toni" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "2-May-78" ["userID"]=> string(1) "2" ["Sanity %Age"]=> string(3) "95%" }
       [2]=> array(7) { ["First Name"]=> string(6) "Rachel" ["Last Name"]=> string(5) "Baker" ["Nationality"]=> string(7) "British" ["Gender"]=> string(1) "F" ["Date of Birth"]=> string(8) "7-Dec-82" ["userID"]=> string(1) "3" ["Sanity %Age"]=> string(4) "100%" }
     }
     array(9) {
       [0]=> array(4) { ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [1]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "20:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [2]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "18:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [3]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "00:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [4]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "45:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [5]=> array(4) { ["Location"]=> string(8) "bathroom" ["duration"]=> string(5) "03:00" ["userID"]=> string(1) "3" [""]=> string(0) "" }
       [6]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "34:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
       [7]=> array(4) { ["Location"]=> string(7) "kitchen" ["duration"]=> string(5) "27:00" ["userID"]=> string(1) "2" [""]=> string(0) "" }
       [8]=> array(4) { ["Location"]=> string(5) "study" ["duration"]=> string(5) "50:00" ["userID"]=> string(1) "1" [""]=> string(0) "" }
     }
     So I want to create a master array, which I would then output as a new CSV file, where elements from the second array are appended onto the first array where the userIDs match, and where elements without any matching userID still remain in the array. Thank you kindly in advance
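     One way to build that master array, assuming the CSVs have already been read into $MasterArray[0] and $MasterArray[1] as above. This is only a rough, untested sketch; mergeOnKey and master.csv are made-up names used for illustration:
     <?php
     // Sketch: append the fields from the second data set onto rows of the first wherever
     // the given key matches, and keep rows from either side that have no match at all.
     function mergeOnKey(array $left, array $right, $key)
     {
         $merged = [];

         foreach ($right as $rightRow) {
             $matched = false;
             foreach ($left as $leftRow) {
                 if (isset($leftRow[$key], $rightRow[$key]) && $leftRow[$key] === $rightRow[$key]) {
                     $merged[] = array_merge($leftRow, $rightRow); // duplicate userID key collapses to one
                     $matched  = true;
                 }
             }
             if (!$matched) {
                 $merged[] = $rightRow; // no matching userID: keep the row anyway
             }
         }

         // also keep rows from the first set whose key never appears in the second set
         $rightKeys = array_column($right, $key);
         foreach ($left as $leftRow) {
             if (isset($leftRow[$key]) && !in_array($leftRow[$key], $rightKeys, true)) {
                 $merged[] = $leftRow;
             }
         }

         return $merged;
     }

     $master = mergeOnKey($MasterArray[0], $MasterArray[1], 'userID');

     // write the merged array out as a new CSV
     $out = fopen('master.csv', 'w');
     fputcsv($out, array_keys($master[0])); // header row taken from the first merged record
     foreach ($master as $row) {
         fputcsv($out, $row);
     }
     fclose($out);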
  8. Good Day Everyone, I was recently looking for a decent framework to build a new website on, and at a glance Yii Framework (http://www.yiiframework.com/) seemed to suit my needs. But after starting to work with it, I noticed it is perhaps a little too user friendly. I am now tossing up between CodeIgniter (http://ellislab.com/codeigniter) and PHPDevShell (http://www.phpdevshell.org/). If anyone has any prior experience with either framework, I would love to hear your thoughts. While Yii generates a lot of code for you, providing a base to work off, I have always preferred building functionality myself, in conjunction with a framework that doesn't provide a base for you. Any thoughts or ideas are welcome. Kind Regards, Ace
  9. I'm trying to validate a string so it can only contain alphanumeric characters (0-9, a-z, A-Z) and the following characters: " ", "-", "!", "(", ")", "/", ".", ",". Looking at the PHP manual and a few different tutorial sites, I'm not having much luck; I can get the alphanumeric part working, but not those other characters. I currently have...
     $string = "FN#*F#)@)"; // should be invalid
     if(!preg_match('/^[a-zA-Z0-9]+$/i', $string))
         echo "Invalid Input";
     Not sure where to go from here. Any help is very much appreciated. Kind Regards, Ace
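     For what it's worth, one way to extend that pattern is to add the extra characters to the character class (the hyphen goes last so it isn't read as a range, and the forward slash is escaped because / is also the pattern delimiter; the i modifier isn't needed once both letter cases are listed). A quick sketch:
     <?php
     // Sketch: allow letters, digits, space, and - ! ( ) / . ,
     $pattern = '/^[a-zA-Z0-9 !()\/.,-]+$/';

     var_dump(preg_match($pattern, 'Hello, world (test) - 1/2.')); // int(1) -- valid
     var_dump(preg_match($pattern, 'FN#*F#)@)'));                  // int(0) -- invalid, # * @ not allowed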
  10. If you need to wear glasses at the SAME time as you have contact lenses in, then you don't have the right contact lens strength.
  11. There is, of course, plenty of free material online, like the one on this very site... http://www.phpfreaks.com/tutorial/oo-php-part-1-oop-in-full-effect That way you save some coin.
  12. You could, of course, make sure you have everything nice and secure so it is unlikely (but not impossible) that you will be hacked in the first place. As for DB admins viewing stuff they shouldn't... if they can't be trusted, why would they be a DB admin in the first place? My two cents. -Ace
  13. You should add some error checking, rather than assuming that the query will ALWAYS return one or more results.
      if(mysql_num_rows($result) > 0) {
          while ($row = mysql_fetch_assoc($result)) {
              echo '<option value="'.$row["username"].'">'.$row["username"].'</option>';
          }
      } else {
          echo '<option>No Usernames</option>';
      }
      EDIT: It's probably not displaying any usernames because either the query isn't working as expected, or you have no records in that table.
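      On modern PHP the old mysql_* extension is gone, so the same pattern would be written with mysqli instead. A sketch only; the connection details, the users table and the query are assumptions, not taken from the thread:
      <?php
      // Sketch of the same check with mysqli; connection and table names are placeholders.
      $db = new mysqli('localhost', 'user', 'pass', 'mydb');

      $result = $db->query('SELECT username FROM users ORDER BY username');

      if ($result && $result->num_rows > 0) {
          while ($row = $result->fetch_assoc()) {
              echo '<option value="' . htmlspecialchars($row['username']) . '">'
                 . htmlspecialchars($row['username']) . '</option>';
          }
      } else {
          echo '<option>No Usernames</option>';
      }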
  14. I wouldn't really say there's a 'proper' way; it's more a matter of preference. What you have there is fine.
  15. This would be a good place to start: http://au2.php.net/manual/en/function.imagecreate.php
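      To give a rough idea of what the GD functions behind that link look like, a minimal sketch (the size, colours and text are arbitrary):
      <?php
      // Minimal sketch: create a small palette-based PNG with some text and send it to the browser.
      $img = imagecreate(200, 80);                    // 200x80 pixel image
      $bg  = imagecolorallocate($img, 255, 255, 255); // first colour allocated becomes the background
      $fg  = imagecolorallocate($img, 0, 0, 0);
      imagestring($img, 5, 10, 30, 'Hello GD', $fg);  // built-in font #5 at (10, 30)

      header('Content-Type: image/png');
      imagepng($img);
      imagedestroy($img);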
  16. Quick question: how do you block w3schools from coming up in search results? A browser extension? Google search settings?
  17. Note that it's not only variables that can be passed as arguments; you can also pass arrays, objects, etc. And obviously you don't HAVE to pass a variable. Take the following for example...
      function some_function($text) {
          return $text;
      }

      $message = some_function("my random text");
      echo $message; // Output: my random text
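      And a quick sketch of the arrays/objects point; the function and property names here are made up purely for illustration:
      <?php
      // Sketch: literals, arrays and objects can all be passed directly as arguments.
      function describe(array $colours, stdClass $user) {
          return $user->name . ' likes ' . implode(', ', $colours);
      }

      $user = new stdClass();
      $user->name = 'Ace';

      echo describe(['red', 'green'], $user); // Output: Ace likes red, green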
  18. $config = new JConfig; echo $config->editor;