Dareros Posted November 16, 2014

Hi; why doesn't this snippet of code measure the time required to include the file in milliseconds? The problem is unknown to me because the variable $diff always has the wrong value. Here is the code:

<?php
$start = microtime(true);
include('/path/to/my/script.php');
$end = microtime(true);
$diff = ($end - $start) * 1000;
echo 'The script took ' . $diff . 'ms to proceed.';
?>

The problem is that $diff comes out as a big number instead of a couple of milliseconds; for example it returns a value such as 1416110398494, which is illogical. Why?

Regards
requinix Posted November 16, 2014

Did you check the actual values of $start and $end?
Dareros Posted November 16, 2014 (Author)

Yes, strange behavior!

$start is: 1416165131.3015
$end is: 1416165131.4172
$diff is: 1416165131417.2

How can that happen?
Ch0cu3r Posted November 16, 2014

Is the code you posted the actual code you are testing? You'd only get that result if $end is being multiplied by 1000 and not $diff.
Barand Posted November 16, 2014

... or your $start contains 0 (have you misspelled $start in your code?). Given your start and end values, it should produce:

$start = 1416165131.3015;
$end = 1416165131.4172;
$diff = ($end - $start) * 1000;
echo $diff . ' ms'; // 115.70000648499 ms
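To illustrate Barand's point: in PHP, reading an undefined variable (e.g. a misspelled $start) yields null, which is coerced to 0 in arithmetic, so ($end - $start) * 1000 collapses to roughly $end * 1000 — exactly the 13-digit value the OP reported. A minimal self-contained sketch (usleep and the 100 ms delay are illustrative stand-ins for the include):

```php
<?php
// Correct timing: both variables are spelled consistently.
$start = microtime(true);        // e.g. 1416165131.3015
usleep(100000);                  // stand-in for the included script (~100 ms)
$end = microtime(true);          // e.g. 1416165131.4172

$diff = ($end - $start) * 1000;  // elapsed time in milliseconds
echo 'The script took ' . round($diff, 2) . ' ms to proceed.' . PHP_EOL;

// The buggy symptom: if $start were misspelled at the subtraction
// (e.g. $strat), PHP would warn about an undefined variable and
// treat it as null (0), so $diff would become ~$end * 1000,
// e.g. 1416165131417.2 — matching the value seen in the thread.
```

The fix is simply to make sure the variable assigned before the include and the variable used in the subtraction are the same name.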