NotionCommotion Posted December 24, 2016

Any recommendations on how to check whether a variable is a two-element non-associative array, other than the following?

function check($a) {
    return is_array($a) && count($a) === 2 && isset($a[0], $a[1]);
}

check([1, 4]);
check(['a', 'b']);
check([null, 1]);
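One thing worth noting about the snippet above: isset() returns false for elements that hold null, so check([null, 1]) fails even though it is a two-element list. A minimal sketch of a variant that accepts null values, using array_key_exists() instead of isset():

// Variant of the check above that treats [null, 1] as valid:
// array_key_exists() sees keys whose value is null, isset() does not.
function checkNullSafe($a) {
    return is_array($a)
        && count($a) === 2
        && array_key_exists(0, $a)
        && array_key_exists(1, $a);
}

var_dump(checkNullSafe([1, 4]));               // bool(true)
var_dump(checkNullSafe([null, 1]));            // bool(true)
var_dump(checkNullSafe(['a' => 1, 'b' => 2])); // bool(false)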
kicken Posted December 24, 2016

Why do you need such a specific check?
bsmither Posted December 25, 2016

Not a recommendation, but another way to skin the cat:

function check($a, int $z) {
    $return = is_array($a) && (count($a) == $z); // so far so good
    if ($return) {
        foreach ($a as $k => $v) {
            $return = $return && is_int($k); // first non-integer key fails the whole thing
        }
    }
    // could also use a while loop to iterate the array and stop at the first false
    return $return;
}

I think your test arrays will always be non-associative, as you are using the short syntax listing only values. Also, there is the possibility that the integer indices will not be zero and one.
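A more compact way to express the same sequential-integer-key test (a sketch, not part of the post above) is to compare the array's keys against a generated range:

// Returns true only for arrays whose keys are exactly 0..$z-1 in order,
// i.e. a "list" in the PHP sense. Assumes $z >= 1.
function checkList($a, int $z) {
    return is_array($a)
        && count($a) === $z
        && array_keys($a) === range(0, $z - 1);
}

var_dump(checkList([1, 4], 2));               // bool(true)
var_dump(checkList([1 => 'a', 2 => 'b'], 2)); // bool(false) - keys are 1 and 2, not 0 and 1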
NotionCommotion Posted December 25, 2016

Why do you need such a specific check?

Validation of data received via JSON.
Psycho Posted December 27, 2016

Validation of data received via JSON.

What is the source of the data? Why are you concerned that the returned data may have indexes other than 0 and 1? And, if it does get returned with indexes other than 0/1, is there a possibility it is correct, yet the process changed to send back different indexes? Why would you want your application to break if that happens? Why aren't you more concerned with the values in the array?

FYI: There is no reason to use the strict comparison for the array length, count($a)===2. It's impossible for that function to return a string of "2". Just check that the value is a two-element array and force the indexes. Then, more importantly, validate the values in the array before you use them. Plus, you can make the length configurable so you can re-purpose it, if needed.

function checkArray($arr, $len = 2) {
    if (!is_array($arr) || count($arr) != $len) {
        return false;
    }
    // Validate the values . . .
}
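As a sketch of what the elided value-validation step might look like: the all-numeric rule below is an assumption based on the sample data later in this thread, not something specified in the post above. Swap in whatever rule actually fits the payload.

function checkArray($arr, $len = 2) {
    if (!is_array($arr) || count($arr) != $len) {
        return false;
    }
    // Assumed rule: every element must be numeric (matches the "d" arrays
    // posted later in this thread).
    foreach ($arr as $value) {
        if (!is_numeric($value)) {
            return false;
        }
    }
    return true;
}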
NotionCommotion Posted December 27, 2016

What is the source of the data? Why are you concerned that the returned data may have indexes other than 0 and 1? And, if it does get returned with indexes other than 0/1, is there a possibility it is correct, yet the process changed to send back different indexes? Why would you want your application to break if that happens? Why aren't you more concerned with the values in the array?

It is just a homemade JSON protocol transferred via sockets between two machines, both of which I control. I wish to limit the amount of network traffic by reducing the JSON string size. I plan on transferring something like the following, and need to retrieve the three values in the "d" arrays and interpret them based on their position in the array. I'm also concerned with their values, but first need to understand their meaning by their position. Make sense? Thanks.

[
    {"t": 123321, "d": [123.41, 4113.231, 45234.123]},
    {"t": 123324, "d": [143.41, 4213.231, 44234.123]},
    {"t": 123326, "d": [142.41, 4413.231, 42234.123]},
    {"t": 123329, "d": [153.41, 4313.231, 43234.123]}
]
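For a payload shaped like the sample above, the positional validation might look like the following sketch. The field rules (integer timestamp, exactly three numeric readings) are inferred from the example, not a stated spec:

// $json is the raw string received over the socket.
function validatePayload($json) {
    $rows = json_decode($json, true);
    if (!is_array($rows)) {
        return false;
    }
    foreach ($rows as $row) {
        if (!isset($row['t'], $row['d']) || !is_int($row['t'])) {
            return false;
        }
        // Position carries the meaning: d[0], d[1], d[2] are three readings.
        if (!is_array($row['d']) || count($row['d']) !== 3) {
            return false;
        }
        foreach ($row['d'] as $value) {
            if (!is_numeric($value)) {
                return false;
            }
        }
    }
    return true;
}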
Jacques1 Posted December 27, 2016

It is just a homemade JSON protocol

Looks like this is the real problem of all your recent threads.

I wish to limit the amount of network traffic by reducing the JSON string size.

Of course. Ever heard of compression? BSON? Or even crazier: measuring to see if the problem is real or just imaginary?
NotionCommotion Posted December 27, 2016

Ever heard of compression? BSON? Or even crazier: measuring to see if the problem is real or just imaginary?

No, I've never dealt with BSON. It seems to have been originally developed for MongoDB, but maybe it has applications outside of it? http://php.net/manual/en/function.bson-decode.php seems to be a bit premature.
Jacques1 Posted December 27, 2016

Hence the measuring part. Real optimization almost always comes at a price, which is why it's done for a specific goal, not because the programmer feels like it. So before you jump to any techniques, figure out the actual requirements and their priorities.
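In the spirit of "measure first", here is a quick sketch for comparing the raw and compressed sizes of a representative payload before deciding any of this matters. It uses gzencode() from PHP's bundled zlib extension; a BSON comparison would need the mongodb extension and is left out here.

// Build a representative payload and measure it both ways.
$rows = [];
for ($i = 0; $i < 1000; $i++) {
    $rows[] = ['t' => 123321 + $i, 'd' => [123.41, 4113.231, 45234.123]];
}

$json    = json_encode($rows);
$gzipped = gzencode($json, 6); // level 6: zlib's default speed/size trade-off

printf("raw JSON: %d bytes\n", strlen($json));
printf("gzipped:  %d bytes (%.1f%% of raw)\n",
    strlen($gzipped), 100 * strlen($gzipped) / strlen($json));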
maxxd Posted December 28, 2016

Also, if this is a home-spun solution, is there a reason you've determined that the best way to understand the meaning of the 'd' array values is by position instead of named index? I know it's done a lot, but if you're creating this yourself, why not make it a bit easier on everyone now and in the future by using logically named indexes?
NotionCommotion Posted December 28, 2016

Originally, I used named indexes, and my JSON looked like:

[
    {"timestamp": 123321, "data": {"value": 133.3, "units": "lbs"}},
    {"timestamp": 123324, "data": {"value": 123.2, "units": "lbs"}},
    {"timestamp": 123327, "data": {"value": 113.4, "units": "lbs"}},
    ....
    {"timestamp": 124329, "data": {"value": 153.1, "units": "lbs"}}
]

It just seemed like a lot of extra fluff when I could instead do:

[
    {"t": 123321, "d": [133.3, "lbs"]},
    {"t": 123324, "d": [123.2, "lbs"]},
    {"t": 123327, "d": [113.4, "lbs"]},
    ....
    {"t": 124329, "d": [153.1, "lbs"]}
]

Maybe it isn't an issue, and I might as well improve the readability and go back to the first approach. But then again, if user requirements change in the future, maybe it would be nice to have to deal with less data.
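One possible middle ground, sketched below under the assumption that the compact form is kept: leave the positional {"t": ..., "d": [...]} format on the wire, but map it to named keys immediately after decoding, so the rest of the code never deals with magic positions. The field names are taken from the verbose format above.

// Expand compact {"t": ..., "d": [value, units]} rows into the
// named structure the rest of the application works with.
function expandRows(array $compact) {
    $expanded = [];
    foreach ($compact as $row) {
        $expanded[] = [
            'timestamp' => $row['t'],
            'data' => [
                'value' => $row['d'][0],
                'units' => $row['d'][1],
            ],
        ];
    }
    return $expanded;
}

$rows = json_decode('[{"t":123321,"d":[133.3,"lbs"]}]', true);
print_r(expandRows($rows));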
Jacques1 Posted December 28, 2016

You really, really need to develop a more systematic and professional approach to problem solving. This is way too much randomly-fiddling-with-stuff-and-hoping-it-somehow-helps.

Is there a problem right now? Not in some hypothetical future. Now. If there isn't, stop those nano-optimization attempts that don't do anything but keep you busy for yet another week.

[...] I might as well improve the readability and go back to the first approach.

That's the right solution for the entire thread.

But then again, if user requirements change in the future, maybe it would be nice to have to deal with less data.

If the user requirements change in the future, you need to systematically identify the problems and solve those specific problems with the right tools. Adding random “optimizations” just in case not only makes no sense, it's objectively counter-productive. By trying to solve your imaginary traffic problems, you've created real problems: validation has become so obscure that you need multiple posts to even explain it; the data format is close to unreadable; and, worst of all, you're wasting your time instead of spending it on useful features, code quality, etc.

If there were a real problem and a specific goal (e.g., “My customer is sending 10 MiB of JSON data per second and needs to reduce traffic by 50%”), I would happily discuss all kinds of compression methods, alternative formats, custom protocols and whatnot to help you reach your 50% goal. But there's neither a problem nor a goal. It's all just made up.
Psycho Posted December 28, 2016

I would also add to Jacques' last post that readability of code is not a trivial concern. Using good practices when writing code saves many, many hours of wasted effort in varying ways. From least to most beneficial, I would rank it as follows:

1. When actively writing new code, you don't have to try to "think up" new acronyms for variable/function names that don't conflict with others that may already exist that you can't remember. If I need to define a variable for the first day of the current month in a reporting module, $fom is a bad choice, whereas $firstDayOfCurrentMonth is a much better choice. The amount of extra disk space or execution time is too small to even consider.

2. If you need to revise or debug existing code, you will spend an inordinate amount of time trying to understand what you wrote before, because the names are not intuitive. If you have a problem with a function with parameters such as $ac, $fuj, $tor, you will have to go back to where the function is called to understand the values that are passed to it - which may require you to go back even further to assess the logic behind how those values were derived.

3. Most importantly, when anyone else has to review/revise your code, they won't spend hours/days/weeks trying to figure things out. Heck, if you are properly naming things and using comments, you will be able to copy/paste a section of code into this forum with a description of your problem and what you want to achieve, and get a good answer quickly. I.e., no need for multiple back-and-forth posts to try and figure out what the heck your code is trying to do.

Lastly, don't be afraid to use comments liberally. If some code you wrote was not very simple and took some thought to develop, chances are you won't remember how or why you ended up with the final code. If, for example, you have to fix a bug where a division-by-zero error occurs, then spell it out where you implement that fix.
NotionCommotion Posted December 28, 2016

I give, I give! Thanks guys, I will go back to a more sensible approach. I actually do have a problem with the amount of content; trying to make it marginally smaller, however, is surely only a band-aid and not a fix.
Jacques1 Posted December 28, 2016

I actually do have a problem with the amount of content.

What problem?
NotionCommotion Posted December 28, 2016

What problem?

A little off topic, so I started a separate post.