
[SOLVED] ECHO is slow?


xsaero00


Several times a day, a page on my website takes a very long time to generate. It is usually a page with lots of content (like a small book). The same page normally takes at most 3-5 seconds, but on these occasions it takes more than 100 seconds. After adding some debugging code that records script time, I narrowed it down to one statement: the echo statement that actually outputs the string containing the whole page.

 

On my site, to generate a page there is a bunch of inputs, and then the whole thing goes through a transformation which produces a single string containing the whole page. That string is then output using echo, and this is where the slowdown happens. As I mentioned, it does not happen all the time and I can't reproduce it. I tried replacing echo with print, but that did not help. Perhaps this is not even a PHP error.

 

Please help. I searched for related issues and I see that people sometimes have problems echoing long strings, but there is no solution posted.

 

 


You shouldn't be dumping an entire page into one variable and then echoing it.  Look into output buffering; that should solve all of your problems.  ob_start() would be a good function to start with and should lead you to the rest of them.
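A minimal sketch of that suggestion (the chunk size and the variable name are assumptions, not from this post):

```php
<?php
// Start output buffering with an 8 KB chunk size, so the accumulated
// output is handed to the SAPI in pieces rather than one giant write.
ob_start(null, 8192);

echo $page;       // $page here stands in for however the page is built

ob_end_flush();   // flush the buffered output to the client
?>
```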


Output buffering sounds good. (I had never seen that section of the PHP manual before.) However, it does not seem to make sense for me to use. As I mentioned, I use a transformation to get the final output in one step. Let me elaborate: first I build an extensive XML tree that may contain data from SQL, files, or user input. Then I transform that data into HTML using XSLT. The transformation is a single function call in PHP, and the output of that call gets dumped directly to Apache by echo. Would output_buffering help in such a situation?

 

Now I am trying to see if breaking the string into multiple chunks using strtok solves my problem. I'll see if this works and then try output buffering.


Your function in PHP is whack then, if it echoes something that long in a single output.

 

No single variable should be that large unless it's a file's content (which it is here, but not really).

 

I was actually talking about PHP's function transformToXML: http://us2.php.net/manual/en/xsltprocessor.transformtoxml.php. I don't think it is "whack". I believe templating is a pretty common approach to generating content on the web; it lets you separate logic from presentation. We use XSL templates that keep our pages looking consistent from page to page, plus we are able to do a lot of presentational computations in XSL, such as sorting. The output of virtually any template is always a big string, unless the templating engine is able to print as it goes.


Very peculiar. But saving a page into a single variable shouldn't cause a problem, and output buffering should be easily avoidable. I've had frameworks that save massive amounts of page output to a single variable and echo them without issue, and so quickly it wasn't worth measuring. How large is this string?


Output buffering should be avoidable, and easily so.

 

Why? Are you speaking of PHP output buffering? If so, why should it be avoided? I was actually thinking of giving it a try.

 

Splitting the string into smaller ones using strtok seems to be doing the trick. I haven't gotten the error since implementing it this morning. I'll keep this post updated.

 

 


Tokenizing the string did not help. It still lags:

global $runtimes;
$runtimes['xsl171'] = getExecuteTime();
//echo $this->Transform( $XSL_File );
$tok = strtok($this->Transform( $XSL_File ), "\n");
$runtimes['xsl174'] = getExecuteTime();
while ($tok !== false) {
    echo $tok."\n";
    $tok = strtok("\n");
}
$runtimes['xsl179'] = getExecuteTime();

 

Error:

Warning 
<br/>
<br/>03-04-2009 08:50:01 (PST)  -- (512) Message: Reporting from basepage.php. Extremely long run time of 119.80654001236 seconds for script: /coursecontent.php Times: Array
(
    [xsl171] => 0.70761203765869 
    [xsl174] => 0.94678807258606
    [xsl179] => 119.8064661026
)

 

Times are in seconds.

 

 


I honestly cannot believe that echo is the slow executing line, no matter how sizeable the value of the variable being echoed, especially not in comparison to the other techniques you're using:

  • build an extensive XML tree
  • data from SQL or file
  • transform that data into HTML using XSLT

 

If your original complaint was that

echo $this->Transform( $XSL_File );

is slow, what makes you think that it is echo that is slow rather than $this->Transform?

 

$runtimes['xsl171'] = getExecuteTime();
$toks = $this->Transform( $XSL_File );
$runtimes['xsl172'] = getExecuteTime();
echo $toks;
$runtimes['xsl173'] = getExecuteTime();

 


it has nothing to do with php

 

all php does is send it out, whether you output buffer, or break the string up.. think of it this way

 

the MORE operations you use, the slower it will run, in this case anyway..

 

if you have the strings

 

just echo them as you get em

 

the less work u do to the string, the less work it will take php, the less time it will take the client to receive the contents..

 

also note, if you have a 200kb image, it might take quite some time to load.. I output alot larger than 200kb sometimes, and it really doesn't do nothing different..

 

all streams, E.G. MP3 streams, Movie streams..

 

they're all output, through functions quite like echo or print

 

readfile() just outputs the data, mostly the same way echo or print would..

 

so its not a problem with echo at all, if anywhere, its elsewhere in the script.


I honestly cannot believe that echo is the slow executing line, no matter how sizeable the value of the variable being echoed, especially not in comparison to the other techniques you're using:

  • build an extensive XML tree
  • data from SQL or file
  • transform that data into HTML using XSLT

 

If your original complaint was that

echo $this->Transform( $XSL_File );

is slow, what makes you think that it is echo that is slow rather than $this->Transform?

 

You must be kidding. :-) Of course I tried separating

echo $this->Transform( $XSL_File );

onto a separate line and then measuring the time. In addition, I am measuring time inside the transform function. No doubt, transforming the content is one of the longest operations in the code. I know that, but on average it does not take more than a quarter of a second. It may get up to about 1.5 seconds for large content. But those are stable, reliable times and I account for them. It's the actual echoing of the content that lags for some reason. Keep in mind it does not happen all the time, only about 20 times a day. But when it happens, it effectively locks that particular user out of the site because of session locking.

 

At this point I strongly believe there is something wrong with my server and hope that a reinstall will fix it.


Reinstallation of the webserver did not help. I used newer versions of the software too. I use the packages shipped with SLES10 SP2, so that would be Apache/2.2.3 and PHP/5.2.5. I reinstalled the server but I still use the same config files as the other servers. I guess I can try tweaking the configs, but I don't know where to start.

 

Now I am trying to isolate the problem. I created a small PHP script that outputs about 200k of random data and I am timing its execution. I'll see if the problem happens there.

 

The only other thing I can suggest is check the memory usage after each stage, particularly paged memory.

How do you check the paged memory? memory_get_usage()?

 

 


The only other thing I can suggest is check the memory usage after each stage, particularly paged memory.

How do you check the paged memory? memory_get_usage()?

Nope, memory_get_usage() only shows the total memory currently used by the thread.

To get paged memory usage, you'd need to exec the appropriate *nix command for your process.

 

My suggestion is that the memory used for the echoed variable might be too large for real memory and is being swapped out... having to get it back from disk might be creating the overhead.
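One way to check that from inside the script, since memory_get_usage() only reports PHP's own allocator, is to ask the OS directly. A rough sketch, assuming a Linux box where ps supports the standard rss (resident KB) and vsz (virtual KB) fields:

```php
<?php
// Query this process's memory footprint via ps.
$pid = getmypid();
$out = trim(shell_exec("ps -o rss=,vsz= -p $pid"));
list($rssKb, $vszKb) = preg_split('/\s+/', $out);

// If vsz greatly exceeds rss, part of the process may be paged out.
echo "resident: {$rssKb} KB, virtual: {$vszKb} KB\n";
?>
```

Logging those two numbers right before the slow echo would show whether swapping lines up with the 100-second stalls.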


Okay, the reason you're getting this delay is the Nagle algorithm and how it affects TCP/IP transmission of small packets sent separately versus large packets.

 

Test example:

Display Length: 14201 characters.
Display Output: 0.00199 seconds.
+50 (no big deal)
Display Length: 14251 characters.
Display Output: 0.00159 seconds.
+50 (WTF!)
Display Length: 14301 characters.
Display Output: 0.12694 seconds.
+50 (etc)
Display Length: 14401 characters.
Display Output: 0.13424 seconds.
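For completeness: Nagle can be disabled per socket with the TCP_NODELAY option. A script running under mod_php cannot reach Apache's client socket, so this is only a sketch of the knob itself via PHP's sockets extension (assuming your PHP build defines the TCP_NODELAY constant), not a fix for the setup in this thread:

```php
<?php
// Sketch: the socket option that turns off the Nagle algorithm.
// Only applies to sockets the script itself owns, not Apache's.
$sock = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
socket_set_option($sock, SOL_TCP, TCP_NODELAY, 1); // disable Nagle
// ... connect and write as usual, then:
socket_close($sock);
?>
```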

 

Now you could write the data to a file and then use readfile(), and that will work,
or break it up into smaller chunks.

 

Here's a function to use in place of the echo construct:

 

<?php
function echobig($string, $bufferSize = 8192)
{
    $splitString = str_split($string, $bufferSize);
    foreach ($splitString as $chunk)
    {
        echo $chunk;
    }
}
?>

 

<?php
//replace 
echo $data;
//with
echobig($data);
?>
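If chunking alone doesn't change anything, a common variant (my own addition, not part of the post above) also flushes PHP's and the webserver's buffers after each chunk, so data leaves in steady pieces instead of accumulating:

```php
<?php
// Variant of echobig() that pushes each chunk out immediately.
function echobig_flush($string, $bufferSize = 8192)
{
    foreach (str_split($string, $bufferSize) as $chunk) {
        echo $chunk;
        if (ob_get_level() > 0) {
            ob_flush();       // empty PHP's output buffer, if one is active
        }
        flush();              // ask the SAPI (Apache) to send it onward
    }
}
?>
```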

 

Hope that helps

 

MadTechie :)


I created a simple script that just outputs a bunch of random data. Here it is:

<?php
register_shutdown_function('check_time');
$begin = microtime(true);
$times = array();
$times[] = "Begin: ".(microtime(true) - $begin)."<br/>"; 
$r[] = rand();
$r[] = rand();
$r[] = rand();
$r[] = rand();
$r[] = rand();
$r[] = rand();
$r[] = rand();

$s = '';
$s1 = '';
for( $j = 0; $j< 11; $j++)
{
    $s1 = $s1.$r[($j % 7)];
}
$times[] = "Begin Generation: ".(microtime(true) - $begin)."<br/>";
for( $i = 0; $i< 3000; $i++)
{    
    $s .= $s1."<br/>";
}
$times[] =  "End Generation: ".(microtime(true) - $begin)."<br/>";
$times[] =  "Begin Output: ".(microtime(true) - $begin)."<br/>";
echo "<html><body>".$s."</body></html>";
$times[] =  "End Output: ".(microtime(true) - $begin)."<br/>";

function check_time()
{
    global $begin;
    global $times;
    if ((microtime(true) - $begin) > 2)
        mail('mike.starov@netce.com', 'IT HAPPENED', 'TIME TO RUN:'.(microtime(true) - $begin)."\nTIMES: ".print_r($times, true));
}
?>

 

And I got a lot of outputs like so:

TIME TO RUN:29.461055040359
TIMES: Array
(
    [0] => Begin: 6.9141387939453E-6<br/>
    [1] => Begin Generation: 4.7922134399414E-5<br/>
    [2] => End Generation: 0.0015699863433838<br/>
    [3] => Begin Output: 0.0015778541564941<br/>
    [4] => End Output: 29.461009979248<br/>
)

 

and this is happening on the freshly installed server. 30 seconds to output ~250K of data? This is ridiculous.

 


Okay, the reason you're getting this delay is the Nagle algorithm and how it affects TCP/IP transmission of small packets sent separately versus large packets.

 

MadTechie :)

 

Thank you very much. This is the first sensible clue in a long while. I'll try your little function on the live website, but in the meantime can you tell me more about the Nagle algorithm (besides what Wikipedia has on it) and PHP, and whether there is a way to turn it off in PHP or system-wide? I won't mind turning it off system-wide since this is a dedicated webserver.

 


Now you could write the data to a file and then use readfile(), and that will work,
or break it up into smaller chunks.

 

Does readfile() split the data into appropriate chunks automatically? I do use it for outputting PDFs, and I think I had the same problem there, just not as often as on other pages.
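readfile() does stream the file to the output buffer in internal chunks rather than building one huge string first. The equivalent behaviour spelled out by hand (the 8192 buffer size is an assumption) looks like:

```php
<?php
// Manual equivalent of readfile(): stream a file in fixed-size chunks
// instead of loading it all into memory and echoing once.
function stream_file($path, $chunkSize = 8192)
{
    $fh = fopen($path, 'rb');
    if ($fh === false) {
        return false;
    }
    while (!feof($fh)) {
        echo fread($fh, $chunkSize);
    }
    return fclose($fh);
}
?>
```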


Okay, check the flush settings in php.ini (turn implicit_flush off and increase output_buffering to a higher amount, e.g. 8192).

 

As for the Nagle algorithm, it wouldn't be directly related to PHP itself but more to do with Apache!

Sending smaller packets from PHP should "trick" Apache into not waiting for clients to respond.

 

(basically, when Apache's output buffer is full, it requires an ack from the client before it continues..)

 

By increasing this, your Apache kernel will consume more memory but also require fewer acknowledgments.

 

SendBufferSize directive

Syntax: SendBufferSize bytes
Context: server config
Status: core

The server will set the TCP send buffer size to the number of bytes specified. Very useful to increase past standard OS defaults on high-speed, high-latency connections (i.e., 100ms or so, such as transcontinental fast pipes).
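As a sketch, the directive goes in the main server config; the 65536 value below is just an example for illustration, not a recommendation from this thread:

```apacheconf
# httpd.conf (server config context)
# Raise the TCP send buffer past the OS default for high-latency links.
SendBufferSize 65536
```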

 

You could also try mod_bw v0.8 (designed for bandwidth control, but it has some nice options):

3.6 - BandWidthPacket

    Probably you never need to touch this. It defaults to 8192, which is good for almost any speed.
    It must be a size between 1024 and 131072. A small packet will cause the top speed to be lower, and the mod will use more time splitting. If you use a size too big, the mod will adjust it to lower speeds.

 

Another option may be to pass some of the memory load to the clients by enabling compression of the HTML, i.e.:

output_handler = ob_gzhandler
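That line goes in php.ini. The same effect can be had per script (a sketch, assuming the zlib extension is compiled in and output_handler is not already set):

```php
<?php
// Compress this script's output with gzip when the client supports it.
ob_start('ob_gzhandler');

echo str_repeat("some large page content ", 10000); // stand-in payload

ob_end_flush(); // send the compressed buffer
?>
```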

 

 

Sorry for the almost random info (I was thinking and typing at the same time!)


So when I have a user with a slow connection, the PHP script run times are still going to be high, because Apache waits for acks to come back from the client, which in turn makes PHP wait. All this time I was under the impression that PHP just does its thing, hands the output off to Apache, and stops, and then Apache is responsible for splitting it into chunks and delivering it to the client.

 

I am still getting errors when splitting the output into chunks of 8192 bytes. I had my output buffering set to 4096, though. Now I set it to 8192, also set SendBufferSize to 8192, and am trying it out.


Still happening with all settings at 8192.

 

Here are the times stored into an array:

function echobig($string, $bufferSize = 8192)
{
    global $runtimes;
    $runtimes['xsl289_output'] = array();
    $splitString = str_split($string, $bufferSize);
    foreach ($splitString as $chunk)
    {
        $times = array();
        $times['start'] = getExecuteTime();
        echo $chunk;
        $times['finish'] = getExecuteTime();
        $runtimes['xsl289_output'][] = $times;
    }
}

[xsl289_output] => Array
        (
            [0] => Array
                (
                    [start] => 1.7512290477753
                    [finish] => 1.7514650821686
                )

            [1] => Array
                (
                    [start] => 1.7514731884003
                    [finish] => 1.751492023468
                )

            [2] => Array
                (
                    [start] => 1.7514970302582
                    [finish] => 1.751513004303
                )

            [3] => Array
                (
                    [start] => 1.751519203186
                    [finish] => 1.7515351772308
                )

            [4] => Array
                (
                    [start] => 1.7515490055084
                    [finish] => 7.7386651039124  // ???
                )

            [5] => Array
                (
                    [start] => 7.7386870384216
                    [finish] => 9.2678310871124
                )

            [6] => Array
                (
                    [start] => 9.2678511142731
                    [finish] => 9.4918291568756
                )

            [7] => Array
                (
                    [start] => 9.4918510913849
                    [finish] => 11.094405174255
                )

            [8] => Array
                (
                    [start] => 11.094428062439
                    [finish] => 11.094438076019
                )

        )


Another question: how come PHP does not time out? I have max_execution_time set to 60, but according to the measured times the script can be active for more than 100 seconds. Why doesn't it stop and quit? This causes my users to be locked out of the website (session locking) if they happen to stumble upon the "slow echo" problem.
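(On non-Windows systems, max_execution_time counts only time spent executing PHP itself; time spent blocked in system calls such as network writes is not counted, which is why the wall clock can exceed the limit.) As for the lockout, one mitigation, assuming the page doesn't write to $_SESSION after it is built, is to release the session lock before the slow output begins:

```php
<?php
session_start();

// ... read whatever session data the page needs ...
$html = build_page();   // build_page() is a hypothetical stand-in here

// Release the session file lock so other requests from the same user
// aren't blocked while this (possibly slow) output is being sent.
session_write_close();

echo $html;
?>
```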


PHP is handing the data to Apache, but you still have to deal with Apache's buffer: Apache will only accept a set amount from PHP, and when that's full, PHP must wait for it to be cleared. Remember, echo is not a PHP function, it's a language construct!

 

Okay, here's a very simple script that may help me explain.

 

Basically this either outputs to Apache or doesn't, all depending on ob_start().

 

So with ob_start("callback"); in use, PHP never gets told to send to the browser (in fact, never gets told to send to Apache!);

without it, well, it's like normal.

So you'd expect the times to be the same, considering it's the same code and all!

 

Well, no. With the callback, echo never gets to Apache, and thus there are no delays.

 

I ran this on my PC and the times are as below:

//Time:6.8108780384064 //Via Apache

//Time:0.067671060562134 //Via PHP ONLY

 

<?php
function callback($buffer)
{
    global $echobuffer;
    $echobuffer .= $buffer;
    return "";
}

function test($WithApache, $file)
{
    global $echobuffer;
    $echobuffer = "";
    if (!$WithApache) ob_start("callback"); // don't pass to Apache
    $c = 0;
    $begin = microtime(true);
    while ($c < 10000)
    {
        $c++;
        echo rand(1111111111, 9999999999)."<br>";
        ob_flush();
        flush();
    }
    ob_end_flush();
    $end = microtime(true);
    $time = "Time:".($end - $begin)."<br>\n";
    file_put_contents($file, "$time $echobuffer");
    echo $time;
}

test(false, "c:/test1.txt");
test(true, "c:/test2.txt");

?>

 

I increased the loop to 100000 and the times are as follows:

//timesout (after 30 seconds) //Via Apache

//Time:0.67965197563171 //Via PHP ONLY

 

 

Now if I change the code to do just one echo, i.e.

function test2($WithApache, $file)
{
    global $echobuffer;
    $echobuffer = "";
    if (!$WithApache) ob_start("callback"); // don't pass to Apache
    $c = 0;
    $begin = microtime(true);
    $str = "";
    while ($c < 100000)
    {
        $c++;
        $str .= rand(1111111111, 9999999999)."<br>";
        ob_flush();
        flush();
    }
    echo $str;
    ob_end_flush();
    $end = microtime(true);
    $time = "Time:".($end - $begin)."<br>\n";
    file_put_contents($file, "$time $echobuffer");
    echo $time;
}

 

Here are the times:

Time:0.80782103538513 //Via Apache

Time:0.75758194923401 //Via PHP ONLY

BUT! the page took 44 seconds to load.

So the reason there was no timeout was that PHP had in fact finished (but Apache still had work to do)..

 

Okay, well, I have to go eat.. I hope that helps explain a little better..

 

But I do believe this is more of an Apache issue!

 

--MadTechie :)

