
Merge 2 arrays where the values of a particular key match


MasterACE14

Recommended Posts

Hi All,

I have written some code that currently reads multiple CSV files into a multidimensional array, and I am now trying to produce a new array where the values for a particular key match across the arrays. I would then output a new CSV file built from that new array.

Currently I have the following code:

    <?php
        $path = 'data/';

        # colon-separated list of CSV file names from the URL,
        # e.g. page.php?filename=users.csv:locations.csv
        $files = isset($_GET['filename']) ? explode(':', $_GET['filename']) : array();

        $CSV_Files_Count = count($files);

        $MasterArray = array();

        # Loop through multiple CSV files
        for($i = 0; $i < $CSV_Files_Count; $i++)
        {
            if(file_exists($path . $files[$i]))
            {
                # read CSV file into an array of rows
                $csv = array_map('str_getcsv', file($path . $files[$i]));

                # re-key every row with the header row's column names
                array_walk($csv, function(&$a) use ($csv) {
                    $a = array_combine($csv[0], $a);
                });
                array_shift($csv); # remove the column-header row

                $MasterArray[$i] = $csv;
            }
        }
    ?>

If I var_dump($csv), I get the following output, which shows my CSV test files being read perfectly:

array(3) {
  [0]=>
  array(7) {
    ["First Name"]=>
    string(4) "Mark"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "M"
    ["Date of Birth"]=>
    string(9) "19-Dec-60"
    ["userID"]=>
    string(1) "1"
    ["Sanity %Age"]=>
    string(3) "32%"
  }
  [1]=>
  array(7) {
    ["First Name"]=>
    string(4) "Toni"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "F"
    ["Date of Birth"]=>
    string(8) "2-May-78"
    ["userID"]=>
    string(1) "2"
    ["Sanity %Age"]=>
    string(3) "95%"
  }
  [2]=>
  array(7) {
    ["First Name"]=>
    string(6) "Rachel"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "F"
    ["Date of Birth"]=>
    string(8) "7-Dec-82"
    ["userID"]=>
    string(1) "3"
    ["Sanity %Age"]=>
    string(4) "100%"
  }
}
array(9) {
  [0]=>
  array(4) {
    ["Location"]=>
    string(8) "bathroom"
    ["duration"]=>
    string(5) "34:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
  [1]=>
  array(4) {
    ["Location"]=>
    string(7) "kitchen"
    ["duration"]=>
    string(5) "20:00"
    ["userID"]=>
    string(1) "2"
    [""]=>
    string(0) ""
  }
  [2]=>
  array(4) {
    ["Location"]=>
    string(5) "study"
    ["duration"]=>
    string(5) "18:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
  [3]=>
  array(4) {
    ["Location"]=>
    string(5) "study"
    ["duration"]=>
    string(5) "00:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
  [4]=>
  array(4) {
    ["Location"]=>
    string(7) "kitchen"
    ["duration"]=>
    string(5) "45:00"
    ["userID"]=>
    string(1) "2"
    [""]=>
    string(0) ""
  }
  [5]=>
  array(4) {
    ["Location"]=>
    string(8) "bathroom"
    ["duration"]=>
    string(5) "03:00"
    ["userID"]=>
    string(1) "3"
    [""]=>
    string(0) ""
  }
  [6]=>
  array(4) {
    ["Location"]=>
    string(7) "kitchen"
    ["duration"]=>
    string(5) "34:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
  [7]=>
  array(4) {
    ["Location"]=>
    string(7) "kitchen"
    ["duration"]=>
    string(5) "27:00"
    ["userID"]=>
    string(1) "2"
    [""]=>
    string(0) ""
  }
  [8]=>
  array(4) {
    ["Location"]=>
    string(5) "study"
    ["duration"]=>
    string(5) "50:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
}

So I want to create a master array, which I would then output as a new CSV file: elements from the second array are appended onto the first array wherever the userID values match, and elements without any matching userID still remain in the array.
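The merge described above could be sketched roughly like this. The `$users` and `$locations` arrays below are small stand-ins for the arrays read from the two CSVs, not the real data; the helper names (`$usersById`, `$merged`) are my own:

```php
<?php
// Stand-ins for the two arrays produced by the CSV-reading loop.
$users = [
    ['First Name' => 'Mark', 'userID' => '1'],
    ['First Name' => 'Toni', 'userID' => '2'],
];
$locations = [
    ['Location' => 'bathroom', 'duration' => '34:00', 'userID' => '1'],
    ['Location' => 'kitchen',  'duration' => '20:00', 'userID' => '9'], // no matching user
];

// Index the user records by userID for O(1) lookup.
$usersById = array_column($users, null, 'userID');

$merged = [];
foreach ($locations as $row) {
    // Rows without a matching userID are kept as-is.
    $match = $usersById[$row['userID']] ?? [];
    // array_merge collapses the duplicate userID key into one.
    $merged[] = array_merge($match, $row);
}
```

`array_column($users, null, 'userID')` re-indexes the whole user array by userID in one call, which avoids a nested loop over both arrays.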

Thank you kindly in advance

 

 

This is the output I'm after:

array(9) {
  [1]=>
  array(4) {
    ["First Name"]=>
    string(4) "Toni"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "F"
    ["Date of Birth"]=>
    string(8) "2-May-78"
    ["Sanity %Age"]=>
    string(3) "95%"
    ["Location"]=>
    string(7) "kitchen"
    ["duration"]=>
    string(5) "20:00"
    ["userID"]=>
    string(1) "2"
    [""]=>
    string(0) ""
  }
  [2]=>
  array(4) {
    ["First Name"]=>
    string(4) "Mark"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "M"
    ["Date of Birth"]=>
    string(9) "19-Dec-60"
    ["Sanity %Age"]=>
    string(3) "32%"
    ["Location"]=>
    string(5) "study"
    ["duration"]=>
    string(5) "18:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
  [3]=>
  array(4) {
    ["First Name"]=>
    string(4) "Mark"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "M"
    ["Date of Birth"]=>
    string(9) "19-Dec-60"
    ["Sanity %Age"]=>
    string(3) "32%"
    ["Location"]=>
    string(5) "study"
    ["duration"]=>
    string(5) "00:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
  [4]=>
  array(4) {
    ["First Name"]=>
    string(4) "Toni"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "F"
    ["Date of Birth"]=>
    string(8) "2-May-78"
    ["Sanity %Age"]=>
    string(3) "95%"
    ["Location"]=>
    string(7) "kitchen"
    ["duration"]=>
    string(5) "45:00"
    ["userID"]=>
    string(1) "2"
    [""]=>
    string(0) ""
  }
  [5]=>
  array(4) {
    ["First Name"]=>
    string(6) "Rachel"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "F"
    ["Date of Birth"]=>
    string(8) "7-Dec-82"
    ["Sanity %Age"]=>
    string(4) "100%"
    ["Location"]=>
    string(8) "bathroom"
    ["duration"]=>
    string(5) "03:00"
    ["userID"]=>
    string(1) "3"
    [""]=>
    string(0) ""
  }
  [6]=>
  array(4) {
    ["First Name"]=>
    string(4) "Mark"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "M"
    ["Date of Birth"]=>
    string(9) "19-Dec-60"
    ["Sanity %Age"]=>
    string(3) "32%"
    ["Location"]=>
    string(7) "kitchen"
    ["duration"]=>
    string(5) "34:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
  [7]=>
  array(4) {
    ["First Name"]=>
    string(4) "Toni"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "F"
    ["Date of Birth"]=>
    string(8) "2-May-78"
    ["Sanity %Age"]=>
    string(3) "95%"
    ["Location"]=>
    string(7) "kitchen"
    ["duration"]=>
    string(5) "27:00"
    ["userID"]=>
    string(1) "2"
    [""]=>
    string(0) ""
  }
  [8]=>
  array(4) {
    ["First Name"]=>
    string(4) "Mark"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "M"
    ["Date of Birth"]=>
    string(9) "19-Dec-60"
    ["Sanity %Age"]=>
    string(3) "32%"
    ["Location"]=>
    string(5) "study"
    ["duration"]=>
    string(5) "50:00"
    ["userID"]=>
    string(1) "1"
    [""]=>
    string(0) ""
  }
}

One of the userID keys would be removed, as it wouldn't make sense to have it there twice.
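The final step, writing the master array back out as CSV, might look like the sketch below. The inlined `$merged` rows are stand-ins, and the in-memory stream is just for illustration (a real path such as `data/master.csv` would replace it):

```php
<?php
// Stand-in for the merged master array; assumes every row has the same keys.
$merged = [
    ['First Name' => 'Mark', 'Location' => 'study',   'userID' => '1'],
    ['First Name' => 'Toni', 'Location' => 'kitchen', 'userID' => '2'],
];

$out = fopen('php://memory', 'w+');     // swap in a real file path here
fputcsv($out, array_keys($merged[0]));  // header row from the first row's keys
foreach ($merged as $row) {
    fputcsv($out, array_values($row));
}
rewind($out);
$csvText = stream_get_contents($out);
fclose($out);
```

If the merged rows can have differing keys (unmatched records missing the user columns), the header would need to be the union of all keys, with missing values padded.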


Now I am even more confused.

I thought your aim was to get a master array containing one record for each id. Each record would comprise the data from the various CSVs.

However, your required output that you just posted contains multiple, near identical, records for each id.

I have regrouped them for comparison - the durations are different, so which is correct? If they are all correct and required, why would you want to duplicate all the matching data?

array(9) {
  [1]=>                           [4]=>                          [7]=>
  array(4) {                      array(4) {                     array(4) {
    ["First Name"]=>                ["First Name"]=>               ["First Name"]=>
    string(4) "Toni"                string(4) "Toni"               string(4) "Toni"
    ["Last Name"]=>                 ["Last Name"]=>                ["Last Name"]=>
    string(5) "Baker"               string(5) "Baker"              string(5) "Baker"
    ["Nationality"]=>               ["Nationality"]=>              ["Nationality"]=>
    string(7) "British"             string(7) "British"            string(7) "British"
    ["Gender"]=>                    ["Gender"]=>                   ["Gender"]=>
    string(1) "F"                   string(1) "F"                  string(1) "F"
    ["Date of Birth"]=>             ["Date of Birth"]=>            ["Date of Birth"]=>
    string(8) "2-May-78"            string(8) "2-May-78"           string(8) "2-May-78"
    ["Sanity %Age"]=>               ["Sanity %Age"]=>              ["Sanity %Age"]=>
    string(3) "95%"                 string(3) "95%"                string(3) "95%"
    ["Location"]=>                  ["Location"]=>                 ["Location"]=>
    string(7) "kitchen"             string(7) "kitchen"            string(7) "kitchen"
    ["duration"]=>                  ["duration"]=>                 ["duration"]=>
    string(5) "20:00"               string(5) "45:00"              string(5) "27:00"
    ["userID"]=>                    ["userID"]=>                   ["userID"]=>
    string(1) "2"                   string(1) "2"                  string(1) "2"
    [""]=>                          [""]=>                         [""]=>
    string(0) ""                    string(0) ""                   string(0) ""
  }                               }                              }

  [2]=>                           [3]=>                         [8]=>                           [6]=>
  array(4) {                      array(4) {                    array(4) {                      array(4) {
    ["First Name"]=>                ["First Name"]=>              ["First Name"]=>                ["First Name"]=>
    string(4) "Mark"                string(4) "Mark"              string(4) "Mark"                string(4) "Mark"
    ["Last Name"]=>                 ["Last Name"]=>               ["Last Name"]=>                 ["Last Name"]=>
    string(5) "Baker"               string(5) "Baker"             string(5) "Baker"               string(5) "Baker"
    ["Nationality"]=>               ["Nationality"]=>             ["Nationality"]=>               ["Nationality"]=>
    string(7) "British"             string(7) "British"           string(7) "British"             string(7) "British"
    ["Gender"]=>                    ["Gender"]=>                  ["Gender"]=>                    ["Gender"]=>
    string(1) "M"                   string(1) "M"                 string(1) "M"                   string(1) "M"
    ["Date of Birth"]=>             ["Date of Birth"]=>           ["Date of Birth"]=>             ["Date of Birth"]=>
    string(9) "19-Dec-60"           string(9) "19-Dec-60"         string(9) "19-Dec-60"           string(9) "19-Dec-60"
    ["Sanity %Age"]=>               ["Sanity %Age"]=>             ["Sanity %Age"]=>               ["Sanity %Age"]=>
    string(3) "32%"                 string(3) "32%"               string(3) "32%"                 string(3) "32%"
    ["Location"]=>                  ["Location"]=>                ["Location"]=>                  ["Location"]=>
    string(5) "study"               string(5) "study"             string(5) "study"               string(7) "kitchen"
    ["duration"]=>                  ["duration"]=>                ["duration"]=>                  ["duration"]=>
    string(5) "18:00"               string(5) "00:00"             string(5) "50:00"               string(5) "34:00"
    ["userID"]=>                    ["userID"]=>                  ["userID"]=>                    ["userID"]=>
    string(1) "1"                   string(1) "1"                 string(1) "1"                   string(1) "1"
    [""]=>                          [""]=>                        [""]=>                          [""]=>
    string(0) ""                    string(0) ""                  string(0) ""                    string(0) ""
  }                               }                             }                               }

  [5]=>
  array(4) {
    ["First Name"]=>
    string(6) "Rachel"
    ["Last Name"]=>
    string(5) "Baker"
    ["Nationality"]=>
    string(7) "British"
    ["Gender"]=>
    string(1) "F"
    ["Date of Birth"]=>
    string(8) "7-Dec-82"
    ["Sanity %Age"]=>
    string(4) "100%"
    ["Location"]=>
    string(8) "bathroom"
    ["duration"]=>
    string(5) "03:00"
    ["userID"]=>
    string(1) "3"
    [""]=>
    string(0) ""
  }
}

 


I am open to ideas on reducing the redundancies. My "test" data may not have been the best example, but the data my organisation has is mostly a bunch of CSV files with hundreds of thousands of records, mostly in just two columns per file. So each file essentially has a timestamp column and a column with some kind of data, which might be step counts, duration, etc. I'm looking to merge them all together on the timestamps.
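For that real shape of data (many two-column files, timestamp + one measurement each), a master array keyed on the timestamp could be built roughly like this. The measurement names and inlined rows are assumptions for illustration; in practice each inner array would come from `file()` on one of the CSVs:

```php
<?php
// Maps a (hypothetical) measurement name to that file's lines, inlined here
// instead of file('steps.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES).
$csvs = [
    'steps'    => ['09:00,120', '09:05,80'],
    'duration' => ['09:00,34:00', '09:10,18:00'],
];

$master = [];
foreach ($csvs as $column => $lines) {
    foreach ($lines as $line) {
        [$ts, $value] = str_getcsv($line);
        // Rows sharing a timestamp merge into one master row;
        // measurements absent at that timestamp simply stay unset.
        $master[$ts][$column] = $value;
    }
}
ksort($master); // order the master rows by timestamp
```

Because the master array only grows one key per timestamp, this stays a single pass over each file, which matters with hundreds of thousands of records.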


Have you considered a database?

If you normalize the data, then all duplication (except for key data) is removed:

user                                                                                                location
+-----------+------------+-----------+-------------+--------+------------+-------------+            +-------------+-------------+
| user_id   | first_name | last_name | nationality | gender | dob        | sanity_agepc|            | location_id | loc_name    |
+-----------+------------+-----------+-------------+--------+------------+-------------+            +-------------+-------------+
|    1      | Mark       | Baker     | British     |   M    | 1960-12-19 |   32        |            |     1       | bathroom    |
|    2      | Toni       | Baker     | British     |   F    | 1978-05-02 |   95        |            |     2       | kitchen     |
|    3      | Rachel     | Baker     | British     |   F    | 1982-12-07 |  100        |            |     3       | study       |
+-----------+------------+-----------+-------------+--------+------------+-------------+            +-------------+-------------+
     |                                                                                                    |
     |                                                                                                    |
     +----------------------------------------------------------------------+            +----------------+
                                                                            |            |
                                                                            |            |
                                                        user_location       |            |
                                                        +-------------+------------+-------------+-----------+
                                                        | user_loc_id |  user_id   | location    | duration  |
                                                        +-------------+------------+-------------+-----------+
                                                        |      1      |      1     |     1       |  00:34:00 |
                                                        |      2      |      2     |     2       |  00:20:00 |
                                                        |      3      |      1     |     3       |  00:18:00 |
                                                        |      4      |      1     |     3       |  00:00:00 |<-- would you record this?
                                                        |      5      |      2     |     2       |  00:45:00 |
                                                        |      6      |      3     |     1       |  00:03:00 |
                                                        |      7      |      1     |     2       |  00:34:00 |
                                                        |      8      |      2     |     2       |  00:27:00 |
                                                        |      9      |      1     |     3       |  00:50:00 |
                                                        +-------------+------------+-------------+-----------+
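With the data in normalized tables like those above, one JOIN would regenerate the flat "master" rows on demand. A minimal sketch, using an in-memory SQLite database as a stand-in (table and column names follow the diagram; the DSN and sample rows are assumptions):

```php
<?php
// Build a throwaway copy of the normalized schema.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE user (user_id INT, first_name TEXT)');
$pdo->exec('CREATE TABLE location (location_id INT, loc_name TEXT)');
$pdo->exec('CREATE TABLE user_location (user_loc_id INT, user_id INT, location INT, duration TEXT)');
$pdo->exec("INSERT INTO user VALUES (1,'Mark'), (2,'Toni')");
$pdo->exec("INSERT INTO location VALUES (1,'bathroom'), (2,'kitchen')");
$pdo->exec("INSERT INTO user_location VALUES (1,1,1,'00:34:00'), (2,2,2,'00:20:00')");

// One join reconstructs the flat rows the master CSV would contain.
$rows = $pdo->query(
    'SELECT u.first_name, l.loc_name, ul.duration
       FROM user_location ul
       JOIN user u     ON u.user_id = ul.user_id
       JOIN location l ON l.location_id = ul.location
      ORDER BY ul.user_loc_id'
)->fetchAll(PDO::FETCH_ASSOC);
```

Exporting `$rows` with `fputcsv` would then give the requested master CSV without ever storing the duplicated user columns.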

 


I personally would have opted for a database; however, my organisation is after a master CSV file that can be generated from the few CSV files they already have (and I'm trying to make it generic enough that more CSVs can be added if needed and a new master CSV file generated).

And yes 00:00:00 values would still be recorded.

