
asingualrity
I didn't see where to edit my post. I forgot to include the output of $testArray. It is the format that I am looking for, so that I can print it to a CSV file:

Array
(
    [Shared-net-name] => Array
        (
            [Pool Size] => 13
            [Max] => 9
            [Min] => 9
            [Average] => 9
        )

    [Shared-net-name] => Array
        (
            [Pool Size] => 1777
            [Max] => 1429
            [Min] => 102
            [Average] => 1023
        )

)
Done.
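Since the goal is printing that structure to CSV, here is a minimal sketch of flattening it with fputcsv. The network names are hypothetical (both entries above share the key [Shared-net-name], which would collide in a real PHP array, so distinct keys are assumed):

```php
<?php
// Hypothetical sample in the same shape as the $testArray output above.
$testArray = [
    'shared-net-a' => ['Pool Size' => 13,   'Max' => 9,    'Min' => 9,   'Average' => 9],
    'shared-net-b' => ['Pool Size' => 1777, 'Max' => 1429, 'Min' => 102, 'Average' => 1023],
];

$fh = fopen('php://temp', 'r+');

// Header row: a column for the network name, then the per-network field names.
fputcsv($fh, array_merge(['Shared Network'], array_keys(reset($testArray))));

// One row per shared network. fputcsv wants one flat array per row, so the
// network name (the outer key) is merged in as the first column.
foreach ($testArray as $net => $stats) {
    fputcsv($fh, array_merge([$net], array_values($stats)));
}

rewind($fh);
$csv = stream_get_contents($fh);
fclose($fh);
echo $csv;
```

Swapping the php://temp handle for a plain fopen('ArrayCSVtest.csv', 'w') writes the same rows to disk instead.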
-
I think I started off a little bit bigger than I should have... uh oh. Anyway, I am querying an OpenTSDB instance for DHCP pool usage on shared networks. That part is easy. I am running one query that pulls the datapoints over 365 days, and I need to get the min/max/average for each of 30, 180, and 365 days. What I would like to do is run one query for 365 days and then derive the min/max for each time period from that same array, if that makes sense. The array contains the unix timestamp and the number of addresses used at that point in time. I have tried pulling the data into a separate array using array_walk, but I'm stumped. For some reason I'm having a really hard time with this, and I'm hoping someone can walk me through it, or at the very least point me at some really good examples, because my Google skills aren't doing all that well and I have failed to adapt anything that I have found. I know this is ugly. Very ugly. I promise I will clean it up for production.

My current code:

<?php
#error_reporting(E_ERROR | E_PARSE);

$query_url  = "http://xxx.xxx.xxx.xx:xxxx/api/query";
$dhcp_hosts = array("xxxxxx.xxxxxx.xxx.xxx"); //"xxxxxx.xxxxxx.xxx.xxx");
$SharedNets = array();
$pool_usage = array();
$time_start = microtime(true);

// Stream context for the HTTP queries; ignore_errors keeps the body on 4xx/5xx.
$opts = array('http' => array(
    'ignore_errors' => true,
    //'timeout'     => 20,
    'header'        => "Connection: close\r\n"
));
$context = stream_context_create($opts);

// Build the list of shared-network names from the CSV export.
if (($handle = fopen("http://xxxx.xxxx.xxxx.xxx/grafana_data/dhcp/SharedNetworks.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle)) !== FALSE) {
        $SharedNets[$data[1]][] = substr($data[0], 0, stripos($data[0], '/'));
    }
    fclose($handle);
}
unset($SharedNets['Share Name']); // drop the header row
$SharedNets = array_keys($SharedNets);
sort($SharedNets);

foreach ($SharedNets as $net) {
    foreach ($dhcp_hosts as $host) {
        try {
            $retries = 1;
            while ($retries > 0) {
                // Current usage and pool size over the last 5 minutes.
                $query = $query_url . "?start=5m-ago&"
                       . "m=sum:bluecat.dhcp.shared.pool{metric=poolUsed,index=" . $net . ",host=" . $host . "}&"
                       . "m=sum:bluecat.dhcp.shared.pool{metric=poolSize,index=" . $net . ",host=" . $host . "}";
                $response = json_decode(file_get_contents($query, false, $context), true);
                if (isset($response) && !array_key_exists("error", $response)) {
                    $used = array_pop($response[0]['dps']);
                    $size = array_pop($response[1]['dps']);
                }
                $query_time = microtime(true);

                // Daily max of pool usage over the look-back window.
                foreach ($dhcp_hosts as $host) {
                    try {
                        $retries = 1;
                        while ($retries > 0) {
                            $query = $query_url . "?start=5d-ago&"
                                   //. "m=min:1d-min:bluecat.dhcp.shared.pool{metric=poolUsed,index=" . $net . ",host=" . $host . "}&"
                                   . "m=sum:1d-max:bluecat.dhcp.shared.pool{metric=poolUsed,index=" . $net . ",host=" . $host . "}";
                            $response = json_decode(file_get_contents($query, false, $context), true);
                            if (isset($response) && !array_key_exists("error", $response)) {
                                // Drop empty datapoints before aggregating.
                                foreach ($response as $index => $responseEntry) {
                                    $response[$index]['dps'] = array_filter($responseEntry['dps']);
                                }
                                $min = min($response[0]['dps']);
                                $max = max($response[0]['dps']);
                                $avg = round(array_sum($response[0]['dps']) / count($response[0]['dps']));
                                if (empty($min)) { $min = 0; }
                                if (empty($max)) { $max = 0; }
                                foreach ($SharedNets as $Key) {
                                    if ($Key == $net) {
                                        $testArray[$Key]["Pool Size"] = $size;
                                        $testArray[$Key]["Max"]       = $max;
                                        $testArray[$Key]["Min"]       = $min;
                                        $testArray[$Key]["Average"]   = ceil($avg);
                                    }
                                }
                                $retries = 0;
                                continue;
                            }
                        }
                    } catch (Exception $e) {}
                }
            }
        } catch (Exception $e) {}
    }
}
//print_r($testArray);

$outputFile = fopen('/home/brian/ArrayCSVtest.csv', 'w');
$jsonTest   = json_encode($testArray);

// Generate Array.
$array = [ $testArray ];

// Clear the output buffer and send download headers.
ob_clean();
header("Content-type: text/x-csv");
header("Content-Transfer-Encoding: binary");
header("Content-Disposition: attachment; filename=csv" . date('YmdHis', strtotime('now')) . ".csv");
header("Pragma: no-cache");
header("Expires: 0");

// Generate CSV in memory (spills to a temp file past 12 MB).
$file = fopen('php://temp/maxmemory:' . (12 * 1024 * 1024), 'r+');

// Header row, then one row per shared network, to both memory and the file on disk.
fputcsv($file, array_keys(call_user_func_array('array_merge', $array)));
fputcsv($outputFile, array_keys(call_user_func_array('array_merge', $array)));
foreach ($testArray as $row) {
    fputcsv($file, $row);
    fputcsv($outputFile, $row);
}

// Fetch the CSV contents and echo them.
rewind($file);
$output = stream_get_contents($file);
fclose($file);
echo $output;

echo "Query execution time: " . round(microtime(true) - $query_time, 2) . " seconds\n\n";
echo "Cumulative execution time: " . round((microtime(true) - $time_start) / 60, 2) . " minutes\n\n";
echo "//////////////////////////////////////////////////////////////////////////////////////////////////////////////////\n\n";
fclose($outputFile);

This is what the $response array returns, edited for brevity (thousands of datapoints seemed excessive) and for identifying data, of course:

XXXXXXXXXXXXXXXXXXX
Addresses currently in use: 9
Pool size: 13

Array
(
    [0] => Array
        (
            [metric] => XXXXXXXXXXXXXXXXXXXXX
            [tags] => Array
                (
                    [index] => XXXXXXXXXXXXXXXXXX
                    [host] => XXXXXXXXXXXXXXXXXX
                    [metric] => poolUsed
                )

            [aggregateTags] => Array
                (
                )

            [dps] => Array
                (
                    [1428016023] => 9
                    [1428016143] => 9
                    [1428016264] => 9
                    [1428016384] => 9
                    [1428016503] => 9
                    [1428016624] => 9
                    [1428016744] => 9
                    [1428016863] => 9
                    [1428016983] => 9
                    [1428017104] => 9
                    [1428017224] => 9
                    [1428017344] => 9
                    [1428017464] => 9
                    [1428017583] => 9
                    [1428017703] => 9
                    [1428017824] => 9
                    [1428017943] => 9
                    [1428018063] => 9
                    [1428018184] => 9
                    [1428018329] => 9
                    [1428018423] => 9
                    [1428018544] => 9
                    [1428018663] => 9
                    [1428018783] => 9
                    [1428018904] => 9
                    [1428019023] => 9
                    [1428019143] => 9
                    [1428019263] => 9
                    [1428019383] => 9
                    [1428019503] => 9
                    [1428019624] => 9
                    [1428019743] => 9
                )

        )

    [1] => Array
        (
            [metric] => XXXXXXXXXXXXXXXXXXXXX
            [tags] => Array
                (
                    [index] => XXXXXXXXXXXXXXXXXX
                    [host] => XXXXXXXXXXXXXXXXXX
                    [metric] => poolUsed
                )

            [aggregateTags] => Array
                (
                )

            [dps] => Array
                (
                    [1428016023] => 9
                    [1428016143] => 9
                    [1428016264] => 9
                    [1428016384] => 9
                    [1428016503] => 9
                    [1428016624] => 9
                    [1428016744] => 9
                    [1428016863] => 9
                    [1428016983] => 9
                    [1428017104] => 9
                    [1428017224] => 9
                    [1428017344] => 9
                    [1428017464] => 9
                    [1428017583] => 9
                    [1428017703] => 9
                    [1428017824] => 9
                    [1428017943] => 9
                    [1428018063] => 9
                    [1428018184] => 9
                    [1428018329] => 9
                    [1428018423] => 9
                    [1428018544] => 9
                    [1428018663] => 9
                    [1428018783] => 9
                    [1428018904] => 9
                    [1428019023] => 9
                    [1428019143] => 9
                    [1428019263] => 9
                    [1428019383] => 9
                    [1428019503] => 9
                    [1428019624] => 9
                    [1428019743] => 9
                )

        )

)

365 days:
Min used: 9
Max used: 9
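On the actual question, deriving 30/180/365-day stats from one 365-day result: since the dps array is keyed by unix timestamp, one approach is to filter it against a cutoff per window and aggregate each slice. A hedged sketch (the function name and sample timestamps are mine, not from the post):

```php
<?php
// Given a dps array keyed by unix timestamp (the shape OpenTSDB returns),
// compute min/max/average for a look-back window of $days, so one 365-day
// query can feed every window instead of issuing one query per window.
function windowStats(array $dps, int $days, ?int $now = null): array
{
    $now    = $now ?? time();
    $cutoff = $now - $days * 86400;

    // Keep only the datapoints whose timestamp falls inside the window.
    $window = array_filter($dps, fn($ts) => $ts >= $cutoff, ARRAY_FILTER_USE_KEY);

    if ($window === []) {
        return ['Min' => 0, 'Max' => 0, 'Average' => 0];
    }

    return [
        'Min'     => min($window),
        'Max'     => max($window),
        'Average' => (int) ceil(array_sum($window) / count($window)),
    ];
}

// Usage with a tiny fabricated dps array (timestamps relative to $now):
$now = 1428019743;
$dps = [
    $now - 400 * 86400 => 50,   // older than a year: outside every window
    $now - 200 * 86400 => 102,  // inside 365d only
    $now -  90 * 86400 => 1429, // inside 180d and 365d
    $now -  10 * 86400 => 900,  // inside every window
];

foreach ([30, 180, 365] as $days) {
    $stats = windowStats($dps, $days, $now);
    echo "{$days}d: min={$stats['Min']} max={$stats['Max']} avg={$stats['Average']}\n";
}
```

With this in place, the per-period values can go straight into the $testArray structure shown earlier, and array_walk is not needed: array_filter with ARRAY_FILTER_USE_KEY does the timestamp slicing in one call.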