Echo benchmark


448191


I did some testing on various ways of echoing strings, and came up with some interesting results.

First, the setting.

I'm testing using loops of 100,000; I do ten of those and take the average. So basically I'm doing 1,000,000 loops, but I can't do it in one page load because FF freaks out on a document that size.

In every loop iteration I run every test, so you could say I'm running the tests more or less simultaneously, eliminating any influence the order of testing might have.
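
Each test is timed with the same pattern throughout (a minimal sketch; the full scripts are further down):

[code]<?php
//The timing idiom used for every test below:
$r = array('test_name' => 0);      //accumulated time per test
$startTime = microtime(true);      //float timestamp (PHP 5+)
echo 'the string under test';
$r['test_name'] += microtime(true) - $startTime;
?>[/code]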

[b]The setup:[/b]
P4 3.0GHz, 512KB L2 cache
512 MB DDR 400MHz
WinXP SP2
PHP 5.1.4
Apache 2.0.50

[b]The tests:[/b]

[b]simple_single:[/b]

[code]<?php
echo 'Mary had a little lamb, its fleece was white as snow.';
?>[/code]

[b]simple_double:[/b]

[code]<?php
echo "Mary had a little lamb, its fleece was white as snow.";
?>[/code]

[b]con_single:[/b]

[code]<?php
echo $who.' had a '.$size.' lamb, its fleece was '.$color.' as snow.';
?>[/code]

[b]con_double:[/b]

[code]<?php
echo $who." had a ".$size." lamb, its fleece was ".$color." as snow.";
?>[/code]

[b]exp_double:[/b]

[code]<?php
echo "$who had a $size lamb, its fleece was $color as snow.";
?>[/code]

[b]exp_hdoc:[/b]

[code]<?php
echo <<<HDOC
$who had a $size lamb, its fleece was $color as snow.
HDOC;
?>[/code]

[b]arg_single:[/b]

[code]<?php
echo $who,' had a ',$size,' lamb, its fleece was ',$color,' as snow.';
?>[/code]

[b]arg_double:[/b]

[code]<?php
echo $who," had a ",$size," lamb, its fleece was ",$color," as snow.";
?>[/code]


[b]The results:[/b]
[pre]Array
(
    [simple_double] => 1.1737260580063
    [simple_single] => 1.1844212770462
    [con_single] => 1.3709901571274
    [con_double] => 1.4004232883453
    [exp_hdoc] => 1.560631275177
    [exp_double] => 2.1088073730469
    [arg_single] => 2.9638622045517
    [arg_double] => 4.3288885593414
)[/pre]

Before doing 10x100,000, I also looked at individual 1x100,000 results, and they always come out in this order, except sometimes, maybe 2 runs out of 10, 'simple_single' comes out a tiny bit faster than 'simple_double'.

A simple double quoted string coming out faster than a single quoted one surprised the heck out of me, but there is no denying it...  :-\

I only recently found out you can pass multiple strings to echo as arguments. Looking at the results above, I think I'll forget about it very quickly.. :P

This test once again proves the superior speed of string concatenation over expansion using double quotes. No surprise there.

Heredoc syntax is pretty damn fast. It comes pretty close to concatenation.

[b]The script:[/b]

[code]<?php
session_start();
set_time_limit(18000); //5 hour timeout, just in case.

//Set up:
$who = 'Mary';
$color = 'white';
$size = 'little';
$r['con_double'] = 0;
$r['con_single'] = 0;
$r['exp_double'] = 0;
$r['exp_hdoc'] = 0;
$r['arg_single'] = 0;
$r['arg_double'] = 0;
$r['simple_single'] = 0;
$r['simple_double'] = 0;

//Start testing:
for($i=0; $i<100000; $i++){

    //------------------------//
    $startTime = microtime(true);
    echo $who." had a ".$size." lamb, its fleece was ".$color." as snow.";
    $r['con_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo $who.' had a '.$size.' lamb, its fleece was '.$color.' as snow.';
    $r['con_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo "$who had a $size lamb, its fleece was $color as snow.";
    $r['exp_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo <<<HDOC
$who had a $size lamb, its fleece was $color as snow.
HDOC;
    $r['exp_hdoc'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true); //reset the timer for this test
    echo $who,' had a ',$size,' lamb, its fleece was ',$color,' as snow.';
    $r['arg_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true); //reset the timer for this test
    echo $who," had a ",$size," lamb, its fleece was ",$color," as snow.";
    $r['arg_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo 'Mary had a little lamb, its fleece was white as snow.';
    $r['simple_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo "Mary had a little lamb, its fleece was white as snow.";
    $r['simple_double'] += microtime(true) - $startTime;
}
foreach($r as $test=>$res){
    //Add new result to the session history
    $_SESSION[$test][] = $res;
    //Get avg result
    $runs = count($_SESSION[$test]);
    $avgs[$test] = array_sum($_SESSION[$test])/$runs;
}
echo '<hr/>';
echo 'Averages after run '.$runs.':';
asort($avgs);

echo '<pre>';
print_r($avgs);
echo '</pre>';
//If nr of repeats is below 10:
if($runs<10){
    //Refresh in 3 seconds:
    echo '<META HTTP-EQUIV="Refresh" CONTENT="3;url='.$_SERVER['PHP_SELF'].'">';
}
?>[/code]

Motivated by the suggestion that output buffering can influence the speed of echoing, I did another test, this time with output buffering turned on.

I couldn't do 100,000 loops per load, nor 10,000 (4MB), because buffering that much data seriously degrades performance (on the order of 100% slower). At least on my test server, which isn't a dedicated server with loads of RAM... ::)

So I did 1,000 loops per load (424,420 bytes), and did 100 of them.

Averages after run 100:

[pre]Array
(
    [simple_double] => 0.0041505265235901
    [simple_single] => 0.0041948556900024
    [con_double] => 0.0060157084465027
    [con_single] => 0.0060297346115112
    [exp_double] => 0.0099934267997742
    [exp_hdoc] => 0.010660362243652
    [arg_single] => 0.016861283779144
    [arg_double] => 0.021891829967499
)[/pre]

To compare the performance of output buffering against no buffering, I also ran the first script with 100x1000:

[pre]Averages after run 100:

Array
(
    [simple_double] => 0.010565533638
    [simple_single] => 0.011855113506317
    [con_single] => 0.012466101646423
    [con_double] => 0.012814438343048
    [exp_hdoc] => 0.015058526992798
    [exp_double] => 0.02208580493927
    [arg_single] => 0.028381476402283
    [arg_double] => 0.04067587852478
)[/pre]

Does that look familiar? It's the same ordering as the first test, and the times scale almost exactly with the loop count. :P

This also shows that output buffering can cut the time needed for your echoes in half!

Also, when using output buffering, double-quote expansion has moved up one place in the list; in this case it is faster than heredoc.


ob echo test script:

[code]<?php
ob_start();
session_start();
set_time_limit(18000); //5 hour timeout, just in case.

//Set up:
$who = 'Mary';
$color = 'white';
$size = 'little';
$r['con_double'] = 0;
$r['con_single'] = 0;
$r['exp_double'] = 0;
$r['exp_hdoc'] = 0;
$r['arg_single'] = 0;
$r['arg_double'] = 0;
$r['simple_single'] = 0;
$r['simple_double'] = 0;

//Start testing:
for($i=0; $i<1000; $i++){

    //------------------------//
    $startTime = microtime(true);
    echo $who." had a ".$size." lamb, its fleece was ".$color." as snow.";
    $r['con_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo $who.' had a '.$size.' lamb, its fleece was '.$color.' as snow.';
    $r['con_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo "$who had a $size lamb, its fleece was $color as snow.";
    $r['exp_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo <<<HDOC
$who had a $size lamb, its fleece was $color as snow.
HDOC;
    $r['exp_hdoc'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true); //reset the timer for this test
    echo $who,' had a ',$size,' lamb, its fleece was ',$color,' as snow.';
    $r['arg_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true); //reset the timer for this test
    echo $who," had a ",$size," lamb, its fleece was ",$color," as snow.";
    $r['arg_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo 'Mary had a little lamb, its fleece was white as snow.';
    $r['simple_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo "Mary had a little lamb, its fleece was white as snow.";
    $r['simple_double'] += microtime(true) - $startTime;
}
foreach($r as $test=>$res){
    //Add new result to the session history
    $_SESSION[$test][] = $res;
    //Get avg result
    $runs = count($_SESSION[$test]);
    $avgs[$test] = array_sum($_SESSION[$test])/$runs;
}
$l = ob_get_length();
ob_clean(); //discard the buffered test output, keep buffering
echo 'Size: '.$l.'<br/>';
echo 'Averages after run '.$runs.':';
asort($avgs);

echo '<pre>';
print_r($avgs);
echo '</pre>';
//If nr of repeats is below 100:
if($runs<100){
    //Refresh in 1 second:
    echo '<META HTTP-EQUIV="Refresh" CONTENT="1;url='.$_SERVER['PHP_SELF'].'">';
}
ob_end_flush();
?>[/code]

Triggered by Barand's test of nesting PHP in HTML, I added the following tests to the loop:

[code]
<?php
//Add these to the set up (the keys must exist before the loop):
$r['nesting_reg'] = 0;
$r['nesting_short'] = 0;
$r['interrupt'] = 0;
//Note: the short echo tags below require short_open_tag to be enabled.

//------------------------//
$startTime = microtime(true);
?><?php echo $who; ?> had a <?php echo $size; ?> lamb, its fleece was <?php echo $color; ?> as snow. <?php
$r['nesting_reg'] += microtime(true) - $startTime;

//------------------------//
$startTime = microtime(true);
?><?= $who; ?> had a <?= $size; ?> lamb, its fleece was <?= $color; ?> as snow. <?php
$r['nesting_short'] += microtime(true) - $startTime;

//------------------------//
$startTime = microtime(true);
?> Mary had a little lamb, its fleece was white as snow. <?php
$r['interrupt'] += microtime(true) - $startTime;
?>
[/code]

Results:

[pre]Averages after run 100:

Array
(
    [simple_single] => 0.0069151759147644
    [simple_double] => 0.0070700979232788
    [nesting_reg] => 0.0075448250770569
    [interrupt] => 0.0078230977058411
    [con_single] => 0.009110119342804
    [con_double] => 0.0093571543693542
    [exp_hdoc] => 0.016459901332855
    [exp_double] => 0.017165880203247
    [nesting_short] => 0.03273246049881
    [arg_single] => 0.051923093795776
    [arg_double] => 0.060914781093597
)[/pre]

It would seem that nesting PHP has an influence on the speed of the original test items, similar to (but smaller than) the effect of output buffering.

The performance rank of the original items stays more or less intact (pure single quotes performs a tiny bit better than double, but the difference is marginal). The difference between heredoc and double quotes is smaller, as it was with output buffering.

What surprises me is that regular nesting outperforms concatenation. Also, the fact that a single interrupt takes more time than multiple interrupts seems very strange. So I repeated the test.

The results stayed the same:

[pre]Averages after run 100:

Array
(
    [simple_double] => 0.0068991136550903
    [simple_single] => 0.0069989848136902
    [nesting_reg] => 0.0076926040649414
    [interrupt] => 0.0078476023674011
    [con_single] => 0.0088863825798035
    [con_double] => 0.0093597745895386
    [exp_hdoc] => 0.016374745368958
    [exp_double] => 0.017014832496643
    [nesting_short] => 0.032987234592438
    [arg_single] => 0.051187620162964
    [arg_double] => 0.060120923519135
)[/pre]

Considering that adding bytes to the document served by Apache to the browser takes processing, it would be fairer to divide the time it takes to flush the output buffer over the items tested, if we want a proper estimate of the benefits of output buffering.

So I created a new test script, which compensates for the time needed to flush the output buffer.

Output:

[pre]Flush time: 0.0015139579772949
Averages after run 100:

Array
(
    [simple_double] => 0.0043420025706291
    [simple_single] => 0.004475505053997
    [con_single] => 0.0062548300623894
    [con_double] => 0.006329879462719
    [exp_double] => 0.010338027179241
    [exp_hdoc] => 0.010864031016827
    [arg_single] => 0.017105240523815
    [arg_double] => 0.022256915271282
)[/pre]

[u]Critics, eat your heart out: output buffering improves the speed of echoing considerably![/u]


The new script:

[code]
<?php
ob_start();
session_start();
set_time_limit(18000); //5 hour timeout, just in case.

//Set up:
$who = 'Mary';
$color = 'white';
$size = 'little';
$r['con_double'] = 0;
$r['con_single'] = 0;
$r['exp_double'] = 0;
$r['exp_hdoc'] = 0;
$r['arg_single'] = 0;
$r['arg_double'] = 0;
$r['simple_single'] = 0;
$r['simple_double'] = 0;

//Start testing:
for($i=0; $i<1000; $i++){

    //------------------------//
    $startTime = microtime(true);
    echo $who." had a ".$size." lamb, its fleece was ".$color." as snow.";
    $r['con_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo $who.' had a '.$size.' lamb, its fleece was '.$color.' as snow.';
    $r['con_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo "$who had a $size lamb, its fleece was $color as snow.";
    $r['exp_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo <<<HDOC
$who had a $size lamb, its fleece was $color as snow.
HDOC;
    $r['exp_hdoc'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true); //reset the timer for this test
    echo $who,' had a ',$size,' lamb, its fleece was ',$color,' as snow.';
    $r['arg_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true); //reset the timer for this test
    echo $who," had a ",$size," lamb, its fleece was ",$color," as snow.";
    $r['arg_double'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo 'Mary had a little lamb, its fleece was white as snow.';
    $r['simple_single'] += microtime(true) - $startTime;

    //------------------------//
    $startTime = microtime(true);
    echo "Mary had a little lamb, its fleece was white as snow.";
    $r['simple_double'] += microtime(true) - $startTime;
}
//Compensate for the cost of flushing the output buffer:
$startTime = microtime(true);
ob_end_flush();
$flushTime = microtime(true) - $startTime;
$flushTimePerItem = $flushTime / count($r);

foreach($r as $test=>$res){
    //Add new result (plus its share of the flush time) to the session history
    $_SESSION[$test][] = $res + $flushTimePerItem;
    //Get avg result
    $runs = count($_SESSION[$test]);
    $avgs[$test] = array_sum($_SESSION[$test])/$runs;
}
echo 'Flush time: '.$flushTime.'<br/>';
echo 'Averages after run '.$runs.':';
asort($avgs);

echo '<pre>';
print_r($avgs);
echo '</pre>';
//If nr of repeats is below 100:
if($runs<100){
    //Refresh in 2 seconds:
    echo '<META HTTP-EQUIV="Refresh" CONTENT="2;url='.$_SERVER['PHP_SELF'].'">';
}
?>[/code]

I optimized the test script.

I was getting some strange results after removing the 'passing as argument' tests, which I figured had something to do with the order of the tests, as the last item to be tested took a performance hit for some reason.

So I modified it to run the tests in random order within the loop. To be clear, it executes 1,000 randomly chosen orderings. I guess it would be even better to execute every possible order once, but to be honest I don't have a clue how to script that. If anyone knows a way to do that, let me know.
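
For reference, one way to enumerate every ordering is a small recursive permutation function (an untested sketch; permutations() is not a built-in, it is defined here):

[code]<?php
//Return all orderings of $items as an array of arrays.
//For the 6 tests below this yields 6! = 720 orderings.
function permutations(array $items) {
    if (count($items) <= 1) {
        return array($items);
    }
    $result = array();
    foreach ($items as $i => $item) {
        $rest = $items;
        unset($rest[$i]);
        //Prepend $item to every permutation of the remaining items:
        foreach (permutations(array_values($rest)) as $perm) {
            array_unshift($perm, $item);
            $result[] = $perm;
        }
    }
    return $result;
}

//Sketch of use: replace shuffle() with one pass over every ordering.
//foreach (permutations($tests) as $ordering) {
//    foreach ($ordering as $test) { /* run the switch from the script below */ }
//}
?>[/code]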

[pre]Averages after run 100:

Array
(
    [simple_double] => 0.0091080117225647
    [simple_single] => 0.0093175983428955
    [con_single] => 0.010890257358551
    [con_double] => 0.010941817760468
    [exp_hdoc] => 0.017119417190552
    [exp_double] => 0.017233357429504
)[/pre]

The original order STILL remains intact, but tests that work similarly now all perform even more similarly.

You can easily divide the list into three groups, and the difference between items within those groups is very small.

The important thing is that it confirms you should prefer concatenation over expansion, preferably using single quotes. I was already doing that, but one thing I learned from this is that it is OK to use double quotes on strings, as long as you don't put any variables in there. I was really expecting single quotes to outperform double quotes on all fronts... Not the case, confirmed by a couple of million echoes. :P

test script:

[code]<?php
session_start();
set_time_limit(18000); //5 hour timeout, just in case.

//Set up:
$who = 'Mary';
$color = 'white';
$size = 'little';
$r['con_double'] = 0;
$r['con_single'] = 0;
$r['exp_double'] = 0;
$r['exp_hdoc'] = 0;
$r['simple_single'] = 0;
$r['simple_double'] = 0;
$tests = array_keys($r);

//Start testing:
for($i=0; $i<1000; $i++){
    //Randomly order the tests:
    shuffle($tests);
    foreach($tests as $test){
        switch($test){
            case 'con_single':
                $startTime = microtime(true);
                echo $who.' had a '.$size.' lamb, its fleece was '.$color.' as snow.';
                $endTime = microtime(true);
                $r['con_single'] += $endTime - $startTime;
                break;
            case 'con_double':
                $startTime = microtime(true);
                echo $who." had a ".$size." lamb, its fleece was ".$color." as snow.";
                $endTime = microtime(true);
                $r['con_double'] += $endTime - $startTime;
                break;
            case 'exp_double':
                $startTime = microtime(true);
                echo "$who had a $size lamb, its fleece was $color as snow.";
                $endTime = microtime(true);
                $r['exp_double'] += $endTime - $startTime;
                break;
            case 'exp_hdoc':
                $startTime = microtime(true);
                echo <<<HDOC
$who had a $size lamb, its fleece was $color as snow.
HDOC;
                $endTime = microtime(true);
                $r['exp_hdoc'] += $endTime - $startTime;
                break;
            case 'simple_single':
                $startTime = microtime(true);
                echo 'Mary had a little lamb, its fleece was white as snow.';
                $endTime = microtime(true);
                $r['simple_single'] += $endTime - $startTime;
                break;
            case 'simple_double':
                $startTime = microtime(true);
                echo "Mary had a little lamb, its fleece was white as snow.";
                $endTime = microtime(true);
                $r['simple_double'] += $endTime - $startTime;
                break;
        }
    }
}
foreach($r as $test=>$res){
    //Add new result to the session history
    $_SESSION[$test][] = $res;
    //Get avg result
    $runs = count($_SESSION[$test]);
    $avgs[$test] = array_sum($_SESSION[$test])/$runs;
}
echo '<hr/>';
echo 'Averages after run '.$runs.':';
asort($avgs);

echo '<pre>';
print_r($avgs);
echo '</pre>';
//If nr of repeats is below 100:
if($runs<100){
    //Refresh in 2 seconds:
    echo '<META HTTP-EQUIV="Refresh" CONTENT="2;url='.$_SERVER['PHP_SELF'].'">';
}
?>[/code]

I tried 'print' instead of 'echo' (3x100x1000):

[pre]
Array
(
    [simple_single] => 0.0090219950675964
    [simple_double] => 0.009040699005127
    [con_single] => 0.010739352703094
    [con_double] => 0.01075662612915
    [exp_hdoc] => 0.017144615650177
    [exp_double] => 0.017181129455566
)
Array
(
    [simple_double] => 0.010165274143219
    [simple_single] => 0.010450003147125
    [con_double] => 0.011748852729797
    [con_single] => 0.012105643749237
    [exp_hdoc] => 0.018087451457977
    [exp_double] => 0.018476240634918
)
Array
(
    [simple_double] => 0.0091826152801514
    [simple_single] => 0.0092255854606628
    [con_single] => 0.010673749446869
    [con_double] => 0.01071010351181
    [exp_hdoc] => 0.017119300365448
    [exp_double] => 0.017387700080872
)[/pre]

Too close to call.

I'm not convinced that running all of them within the same loops is a completely 'fair' test. All will be hindered/manipulated due to the sharing of memory between all of the functions.

However the results (bar ob_start()) were what I expected, having seen this benchmark performed by countless others.

[quote author=Jenk link=topic=108402.msg436437#msg436437 date=1158579331]
I'm not convinced that running all of them within the same loops is a completely 'fair' test. All will be hindered/manipulated due to the sharing of memory between all of the functions.

However the results (bar ob_start()) were what I expected, having seen this benchmark performed by countless others.
[/quote]

You might not EVER be convinced (knowing your stubbornness), but running them sequentially within the same loop, especially when using random ordering, IMPROVES the accuracy. Whether you run each test 1x1000 in the same script (multiplying the effects you noted accurately by 1000), or run each test in its own script (not knowing what factors of the environment might have changed through memory leaks or other factors), running the tests with as little time between them as possible will always be more accurate.

You want to compare the tests in the most similar environment possible, so one has to allow the environment to change as little as possible.

Like I said, only running every possible ordering once (or preferably a couple of times) would produce absolutely accurate averages, so if you know of a way to script that, let me know.

As for your noting that these results are similar to the results others have published, I won't deny that. I just had to experience it first-hand; also, I haven't come across a benchmark that compares ALL of these ways of outputting data in one go.

Also, there aren't that many resources on the effects of output buffering on output speed. I established that output buffering can cut the time needed for outputting in half, even when taking into account flushing the buffer, which takes just over one tenth of the time a regular echo test takes.
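
Using the numbers published above for the simple_double test (100x1000 loops):

[pre]without buffering: 0.010566
with buffering:    0.004342  (flush time already spread over the tests)
speedup:           0.010566 / 0.004342 = ~2.4x

flush time:        0.001514
0.001514 / 0.010566 = ~0.14, i.e. just over a tenth of an echo test[/pre]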

Mind you that using different hardware and/or different Apache/php versions might produce different results.

Daniel asked for a sum-up, so here is my interpretation of the tests performed (a short illustration follows the list):

[list]
[*]When not using variables, use double quotes.
[*]When using variables, use single-quote concatenation.
[*]Avoid embedding variables in double-quoted strings.
[*]When outputting a document of typical size, use output buffering, as it can seriously speed up output.
[/list]
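
A minimal sketch of that style advice in practice ($name is just an example variable):

[code]<?php
ob_start();                     //buffer the whole page's output

$name = 'Mary';
echo "A static string: double quotes are fine.";          //no variables
echo 'Hello '.$name.', concatenate with single quotes.';  //variables involved
//Avoid: echo "Hello $name, embedded variables are slower.";

ob_end_flush();                 //send the buffer in one go
?>[/code]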

Really... is there a point to this? If you're worried about split-second differences in a networked environment, it's your server and network config that's going to make the difference, not whether you used single or double quotes. Truly.

If you still need the speed, drop PHP and start coding your web applications in C.

[quote author=thorpe link=topic=108402.msg437096#msg437096 date=1158657154]
Really... is there a point to this? If you're worried about split-second differences in a networked environment, it's your server and network config that's going to make the difference, not whether you used single or double quotes. Truly.

If you still need the speed, drop PHP and start coding your web applications in C.
[/quote]

Yes, there is. It's about adapting your coding style to create better-performing applications. True, your server and network config (which are part of the environment I talked about) influence performance. I never denied (and would never deny) that. This is purely about which PHP coding style should be preferred performance-wise. I would suggest running this benchmark in your publishing environment; I doubt the relative performance would differ much.

If I were coding in C, I would be concerned about the performance of outputting data in C, but I'm not. I don't see your issue with benchmarking PHP performance.  ???

The reason thorpe mentions C is that benchmark results there are directly affected by programmer practices, unlike in PHP, where 80% of everything is done for you (malloc(), memory pointers and so forth, for a start, are all handled for you by the Zend engine).

He also asks "what is the point of this?" because he is absolutely spot on: it makes no significant difference.

In answer to 'my stubbornness': no, it's not my stubbornness getting in the way. Those factors you mention regarding a fair test are exactly my 'issues' with running the test in the way you have.

For every index of the array you have created, there is that little bit of extra memory used. After the many thousands of iterations you do, the script will be needing a fair-sized chunk of system resources. If you performed the same tests in separate scripts, it would be fairer, because the timings would reflect solely what the chosen method's implications are, not a mix of what all the above plus this one have done.
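
One way to put a number on that would be to sample memory usage around the loop (a sketch; note that on PHP builds before 5.2.1, memory_get_usage() is only available when PHP was compiled with --enable-memory-limit):

[code]<?php
//Sketch: measure how much memory the benchmark loop accumulates.
$before = memory_get_usage();
for ($i = 0; $i < 1000; $i++) {
    //...run the timed tests here...
}
$after = memory_get_usage();
echo 'Memory growth: '.($after - $before).' bytes';
?>[/code]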

[quote author=Jenk link=topic=108402.msg437864#msg437864 date=1158753872]
For every index of the array you have created, there is that little bit of extra memory used. After the many thousands of iterations you do, the script will be needing a fair-sized chunk of system resources. If you performed the same tests in separate scripts, it would be fairer, because the timings would reflect solely what the chosen method's implications are, not a mix of what all the above plus this one have done.
[/quote]

I agree that the tests influence each other's performance, which is exactly why I would prefer the tests to run in every possible ordering once or twice.

I guess you have a point though: the accumulated effects would be a bigger factor than any slight changes in memory usage (by non-relevant processes) that might occur between separate runs. You got me.  :P
