[SOLVED] Sorting Nightmare! - SORTED :D but need some questions answering [please]


Mardoxx

Recommended Posts

I had a go at this, but can't get it to work AT ALL!! :(

 

Basically I've got a huge list of CSV data, one record per line [I've put it here so the main code doesn't get too long]:

$input = "67,197,73,203,January,0µs
99,197,105,203,January,1.204119983µs
131,196,137,202,January,4.294091292µs
163,194,169,200,January,9.632959861µs
195,192,201,198,January,17.47425011µs
228,189,234,195,January,28.01344501µs
260,185,266,191,January,41.40980396µs
292,181,298,187,January,57.79775917µs
324,175,330,181,January,77.29364326µs
356,169,362,175,January,100µs
388,161,394,167,January,126.0085149µs
420,153,426,159,January,155.4020994µs
452,144,458,150,January,188.2564265µs
484,133,490,139,January,224.641095µs
516,122,522,128,January,264.6205333µs
549,110,555,116,January,308.2547156µs
581,96,587,102,January,355.5997383µs
613,82,619,88,January,406.7082917µs
645,66,651,72,January,461.6300499µs
677,50,683,56,January,520.4119983µs
67,197,73,203,February,1µs
99,196,105,202,February,4µs
131,194,137,200,February,9µs
163,192,169,198,February,16µs
195,190,201,196,February,25µs
228,187,234,193,February,36µs
260,183,266,189,February,49µs
292,179,298,185,February,64µs
324,174,330,180,February,81µs
356,169,362,175,February,100µs
388,163,394,169,February,121µs
420,156,426,162,February,144µs
452,149,458,155,February,169µs
484,141,490,147,February,196µs
516,133,522,139,February,225µs
549,124,555,130,February,256µs
581,115,587,121,February,289µs
613,105,619,111,February,324µs
645,95,651,101,February,361µs
677,84,683,90,February,400µs
67,197,73,203,March,0.5µs
99,196,105,202,March,2µs
131,196,137,202,March,4.5µs
163,195,169,201,March,8µs
195,193,201,199,March,12.5µs
228,192,234,198,March,18µs
260,190,266,196,March,24.5µs
292,188,298,194,March,32µs
324,186,330,192,March,40.5µs
356,183,362,189,March,50µs
388,180,394,186,March,60.5µs
420,177,426,183,March,72µs
452,173,458,179,March,84.5µs
484,169,490,175,March,98µs
516,165,522,171,March,112.5µs
549,161,555,167,March,128µs
581,156,587,162,March,144.5µs
613,151,619,157,March,162µs
645,146,651,152,March,180.5µs
677,140,683,146,March,200µs";

 

 

I then split the data into an array:

<?php
// Split each line of data into a new element in $array.
// Note: split() is deprecated (and removed in PHP 7); preg_split()/explode() replace it.
$array = preg_split("/\r\n|\n/", $input);

// This next bit splits it into a multidimensional array.
$data_array = array();
foreach ($array as $data) {
    $data_array[] = explode(",", $data); // one sub-array of CSV fields per line
}

// Comparison callback: sorts rows by their first field, using "natural" number ordering.
function mysort($a, $b) {
    return strnatcmp($a[0], $b[0]);
}

usort($data_array, 'mysort'); // sorts $data_array by the first element
?>
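As an aside, sorting on the first numeric field can't by itself produce a Jan, Feb, March ordering. Here's a minimal sketch (not the code from the post, and assuming the month names are known in advance) of sorting rows chronologically by the month name in column 4:

```php
<?php
// Hypothetical month list: only the months that appear in the data.
$month_order = array('January', 'February', 'March');

// Three sample rows with identical coordinate fields, months shuffled.
$rows = array(
    array('67', '197', '73', '203', 'March',    '0.5µs'),
    array('67', '197', '73', '203', 'January',  '0µs'),
    array('67', '197', '73', '203', 'February', '1µs'),
);

// Compare rows by the position of their month name in $month_order.
usort($rows, function ($a, $b) use ($month_order) {
    return array_search($a[4], $month_order) - array_search($b[4], $month_order);
});

echo $rows[0][4]; // January
```

This needs PHP 5.3+ for the anonymous function; on older PHP you'd use a named callback the same way `mysort` is used above.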

 

This sorts the data by the first CSV field in ascending order, which SHOULD order it Jan, Feb, March, Jan, Feb, March, etc., but it doesn't. (For each month the first and third CSV fields are the same; this COULD be where the problem starts, but thinking it through it shouldn't make a difference.)

 

 

What I wanted it to do is group the data by the SECOND CSV field (which is different for nearly every data set), AND only if it's within a limit. I had a go at this using a modified version of Crayon Violent's method here: http://www.phpfreaks.com/forums/index.php/topic,262873.html [thanks for that :D]

 

<?php
$limit = 0; // with a limit of 0 this should group all rows where the SECOND CSV field is the same
$p = 0;     // index of the current group
$grouped_array = array();

foreach ($data_array as $index => $point_array) {
    // Difference between this row's first field and the previous row's
    // (guarded with isset() so the first row, which has no predecessor, gives 0).
    $thediff = isset($data_array[$index - 1]) ? ($point_array[0] - $data_array[$index - 1][0]) : 0;

    if ($thediff == 0) { // the first CSV values are the same
        $current  = $point_array[1];                                          // the CURRENT second CSV value
        $previous = isset($data_array[$index - 1]) ? $data_array[$index - 1][1] : 0; // the PREVIOUS second CSV value

        $diff = ($previous) ? ($current - $previous) : 0;

        if ($diff > $limit) $p++; // the difference is MORE than the limit, so start a NEW group
    } else {
        $p++; // the first CSV values are NOT the same, so start a new group
    }
    $grouped_array[$p][] = $point_array;
}
?>
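For the exact-match case (limit of 0), there's also a shorter alternative to the sort-then-compare pass: group rows directly under a composite key built from the fields you want to match on. This is a sketch of that idea, not the approach from the post, and it only handles exact matches (a nonzero tolerance still needs the diff-based loop above):

```php
<?php
// Sample rows: first two fields are the grouping key.
$rows = array(
    array('67', '197', 'January'),
    array('67', '197', 'February'),
    array('99', '196', 'March'),
);

$grouped = array();
foreach ($rows as $row) {
    $key = $row[0] . ',' . $row[1]; // e.g. "67,197"
    $grouped[$key][] = $row;        // rows with identical keys land in the same group
}

echo count($grouped['67,197']); // 2
```

No counter variable or previous-row bookkeeping is needed, and the input doesn't have to be sorted first.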

 

So I try this but it doesn't work... because if the limit is 0, the LAST group should only contain ONE row.

If I set the limit to 9999999999 it works fine, though: three rows per group.

 

 

 

 

NOTE: For debugging (viewing the array) I used this - [it's in a function to make it easier to comment out]

<?php
echo_groups($grouped_array);

function echo_groups($grouped_array) {
    foreach ($grouped_array as $index => $group) {
        echo "<b>Group#: $index</b><br />";
        // Preserve print_r()'s indentation in HTML output.
        echo nl2br(str_replace(' ', '&nbsp;', print_r($group, true)));
        echo "<br /><br />";
    }
}
?>

I SORTED IT, just before I posted it...

 

The problem WAS that the data was not sorting Jan, Feb, March... and thus gave a NEGATIVE $diff

 

so changing

if ($diff > $limit) $p++;

to

if (abs($diff) > $limit) $p++;

Sorted it! :D

 

It was kind of stupid of me not to take the absolute value of $diff in the first place, because seeing as this is going to be working on data that can be above or below the previous value... it makes sense :P

 

I thought I may as well post it seeing as it (probably) contains some useful stuff.

 

 

If anyone wants to have a look at it and comes up with a way to make it shorter or more efficient, I'd be very grateful!

 

Also, can someone explain to me what this means, or what it's called, so I can find it in the PHP manual?:

$diff = ($previous) ? ($current - $previous) : 0;

Does it mean that if there's an error, set it to zero?

ahhh right thanks :D

 

$diff = ($previous) ? ($current - $previous) : 0;

 

So if $previous is truthy, set $diff to ($current - $previous),

and if $previous is falsy [in this case because it's trying to read an array element at a NEGATIVE index :P, which doesn't exist] then set $diff to 0.

 

am I correct?
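Pretty much — it's the ternary (conditional) operator, `condition ? value_if_true : value_if_false`, covered under "Comparison Operators" in the PHP manual. One caveat worth knowing, shown in this small sketch: `($previous)` tests truthiness, so a legitimate value of 0 (or "0") would also fall through to the `: 0` branch, not just a missing one.

```php
<?php
$current = 5;

// Falsy $previous (e.g. no previous row): the : branch is taken.
$previous = null;
$diff = ($previous) ? ($current - $previous) : 0;
echo $diff; // 0

// Truthy $previous: the ? branch is taken.
$previous = 2;
$diff = ($previous) ? ($current - $previous) : 0;
echo $diff; // 3

// Caveat: a real value of 0 is also falsy, so the subtraction is skipped.
$previous = 0;
$diff = ($previous) ? ($current - $previous) : 0;
echo $diff; // 0, even though $current - $previous would be 5
```

If that 0-is-falsy edge case matters, testing with `isset()` (as in the grouping loop above) is the safer guard.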

Archived

This topic is now archived and is closed to further replies.
