-
I believe I found a solution. PHP output buffers are stackable, and it seems that the last buffer in the stack is bound to the PHP process and will halt execution of the script while flushing. Additional buffers do not behave this way.

<?php
ini_set('output_buffering', 1);
//ob_start();

// Build a few megabytes of random content.
$str = '';
$c = 0;
while ($c < 200000) {
    $c++;
    $str .= rand(1111111111, 9999999999) . "<br>\n";
}

$begin = microtime(true);
echo "<!--" . $str . "-->";
ob_end_flush();
echo 'Time: ' . (microtime(true) - $begin) . "\n<br/>";
?>

This code will output something like

Time: 5.2357218265533

but as soon as you uncomment ob_start() the output becomes

Time: 0.037539958953857

The page is still going to take about six seconds to load, but PHP will be done by then, which is what was important for me. I understand that there is no real way to speed up the transfer of large files over slow networks, but I want PHP to quit as soon as it is done generating content. When ob_start() is not there, the buffering still happens, but it is bound to the PHP process. When ob_start() is called, a new buffer gets opened that is independent of the PHP process. Hopefully this will help someone. Perhaps this was obvious to some, but I spent a week hunting it down. Thanks for all the help.
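To see the stacking for yourself, ob_get_level() reports how many buffers are currently open. A minimal sketch, assuming output_buffering is enabled in php.ini so a top-level, process-bound buffer already exists when the script starts (with it disabled, the levels would be 0 and 1 instead):

<?php
echo ob_get_level(); // 1: the top-level buffer created by php.ini

ob_start();          // push a user-level buffer on top of the stack
echo ob_get_level(); // 2: echoes now land in the user-level buffer

ob_end_flush();      // pass the user-level buffer down into the top-level one
?>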
-
I did not try your scripts from the CLI, but I tried them on my dev webserver. My results were inconsistent with yours.

For test1 I got:

Time: 1.8898890018463   // Do not output to Apache
Time: 10.101974010468   // Output through Apache

For test2 I got:

Time: 1.9051520824432   // Do not output to Apache
Time: 32.477254152298   // Output through Apache

Perhaps you have PHP deployed differently with Apache, but on my installation, whenever PHP starts to output data (echo or ob_end_flush()) to Apache, it locks up until Apache is done. I have PHP installed as mod_php in Apache.
-
With the setup mentioned in the previous post I am still getting the long run times, but now they are around ob_end_flush(), as expected.

$runtimes['before_flush'] = getExecuteTime();
ob_end_flush();
$runtimes['after_flush'] = getExecuteTime();

Result:

[before_flush] => 0.9801812171936
[after_flush] => 16.093120098114

Are you sure that PHP had in fact finished and quit? Wasn't it still running, waiting for Apache to take care of the data in the buffer? At least that is the case for me. The workaround for me now is to explicitly release all locks in a shutdown function before flushing. The better solution would be to set up a proxy or use lingerd. Thanks for all the help.

PS. It's official: echo is the slowest PHP statement! (not really)
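For reference, a minimal sketch of that workaround, assuming the locks in question are PHP session locks (session_write_close() is what releases them; the function name is my own):

function release_locks_then_flush()
{
    // Release the session lock first, so other requests from the same
    // user are not blocked while the output drains to Apache...
    session_write_close();

    // ...then flush whatever is still buffered. This may still block on a
    // slow client, but by now no locks are held.
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
}
register_shutdown_function('release_locks_then_flush');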
-
Thanks for that last post, MadTechie. I was suspecting this is how it works but just was not sure. So this is what I am going to try now (see the snippet at the end of this post):

1. Increase PHP buffering: set output_buffering = On.
2. Remove the SendBufferSize directive from Apache, leaving it at the system default, which is 108544 (cat /proc/sys/net/core/wmem_default).
3. Output the string as a whole; do not split it into chunks. I don't think the Nagle algorithm is a problem here.
4. In a shutdown function, call ob_end_flush(). This step is probably optional since PHP flushes the buffer implicitly at the end of execution.

I understand that this will not get rid of the slowdowns and will probably use more memory, but the plus here is that the PHP process will finish and release any locks. As for memory usage, it is probably going to be less with PHP buffering than with implementing a proxy/staging server that would accept all the output and then spoon-feed it to the client.
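Roughly, the configuration side of steps 1 and 2 (illustrative snippets only; values as described above):

; php.ini -- step 1: buffer the whole response inside PHP
output_buffering = On

# httpd.conf -- step 2: drop the explicit send buffer size so the OS
# default (108544 per /proc/sys/net/core/wmem_default) applies
# SendBufferSize 8192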
-
Another question: how come PHP does not time out? I have max_execution_time set to 60, but according to the measured times the script can be active for more than 100 seconds. Why does it not stop and quit? This causes my users to be locked out of the website (session locking) if they happen to stumble upon the "slow echo" problem.
-
Still happening with all settings at 8192. Here are the times stored into an array.

function echobig($string, $bufferSize = 8192)
{
    global $runtimes;
    $runtimes['xsl289_output'] = array();

    // Split the output into fixed-size chunks and time each echo.
    $splitString = str_split($string, $bufferSize);
    foreach ($splitString as $chunk) {
        $times = array();
        $times['start'] = getExecuteTime();
        echo $chunk;
        $times['finish'] = getExecuteTime();
        $runtimes['xsl289_output'][] = $times;
    }
}

[xsl289_output] => Array
(
    [0] => Array ( [start] => 1.7512290477753 [finish] => 1.7514650821686 )
    [1] => Array ( [start] => 1.7514731884003 [finish] => 1.751492023468 )
    [2] => Array ( [start] => 1.7514970302582 [finish] => 1.751513004303 )
    [3] => Array ( [start] => 1.751519203186 [finish] => 1.7515351772308 )
    [4] => Array ( [start] => 1.7515490055084 [finish] => 7.7386651039124 ) // ???
    [5] => Array ( [start] => 7.7386870384216 [finish] => 9.2678310871124 )
    [6] => Array ( [start] => 9.2678511142731 [finish] => 9.4918291568756 )
    [7] => Array ( [start] => 9.4918510913849 [finish] => 11.094405174255 )
    [8] => Array ( [start] => 11.094428062439 [finish] => 11.094438076019 )
)
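getExecuteTime() is never defined in these posts; presumably it is a small helper along these lines (my reconstruction, assuming a $begin timestamp captured at the top of the request):

$begin = microtime(true);

function getExecuteTime()
{
    // Seconds elapsed since the request started.
    global $begin;
    return microtime(true) - $begin;
}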
-
So when I have a user with a slow connection, the PHP script run times are still going to be high, because Apache waits for ACKs to come back from the client, which in turn makes PHP wait. All this time I was under the impression that PHP just does its thing, hands the output off to Apache, and stops, and then Apache is responsible for splitting it into chunks and delivering it to the client. I am still getting errors when splitting the output into chunks of size 8192. I had my output buffering set to 4096, though. Now I have set it to 8192, also set SendBufferSize to 8192, and am trying it out.
-
Does readfile() split data into appropriate chunks automatically? I do use it for outputting PDFs, and I think I had the same problem there, just not as often as on other pages.
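From what I can tell, readfile() does stream the file out in internal chunks rather than slurping the whole thing into memory (though with output buffering on, the chunks accumulate in the buffer anyway). If finer control over the chunk size is wanted, a manual loop is straightforward (readfile_chunked() is my own hypothetical name):

function readfile_chunked($filename, $chunkSize = 8192)
{
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        // Read and emit one chunk at a time.
        echo fread($handle, $chunkSize);
    }
    return fclose($handle);
}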
-
Thank you very much. This is the first sensible clue in a long while. I'll try your little function on the live website, but in the meantime can you enlighten me more about the Nagle algorithm (besides what Wikipedia has on it) as it relates to PHP, and whether there is a way to turn it off in PHP or system-wide? I won't mind turning it off system-wide since this is a dedicated webserver.
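As far as I know, nothing in PHP can change Nagle on the connection Apache already holds to the browser, and I am not aware of a single system-wide switch for it on Linux; the sockets extension can set TCP_NODELAY only on sockets the script opens itself, and only on builds where the TCP_NODELAY constant is defined. A sketch under those assumptions:

$sock = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);

// Setting TCP_NODELAY to 1 disables the Nagle algorithm on this socket only;
// it has no effect on the page-serving connection owned by Apache.
socket_set_option($sock, SOL_TCP, TCP_NODELAY, 1);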
-
I created a simple script that just outputs a bunch of random data. Here it is.

<?php
register_shutdown_function('check_time');

$begin = microtime(true);
$times = array();
$times[] = "Begin: " . (microtime(true) - $begin) . "<br/>";

// Seven random numbers, recycled to build the output string.
$r = array();
$r[] = rand();
$r[] = rand();
$r[] = rand();
$r[] = rand();
$r[] = rand();
$r[] = rand();
$r[] = rand();

$s = '';
$s1 = '';
for ($j = 0; $j < 11; $j++) {
    $s1 = $s1 . $r[($j % 7)];
}

$times[] = "Begin Generation: " . (microtime(true) - $begin) . "<br/>";
for ($i = 0; $i < 3000; $i++) {
    $s .= $s1 . "<br/>";
}
$times[] = "End Generation: " . (microtime(true) - $begin) . "<br/>";

$times[] = "Begin Output: " . (microtime(true) - $begin) . "<br/>";
echo "<html><body>" . $s . "</body></html>";
$times[] = "End Output: " . (microtime(true) - $begin) . "<br/>";

// Mail me the timings whenever the script took longer than two seconds.
function check_time()
{
    global $begin;
    global $times;
    if ((microtime(true) - $begin) > 2) {
        mail('[email protected]', 'IT HAPPENED',
             'TIME TO RUN:' . (microtime(true) - $begin) . "\nTIMES: " . print_r($times, true));
    }
}
?>

And I got a lot of outputs like so:

TIME TO RUN:29.461055040359
TIMES: Array
(
    [0] => Begin: 6.9141387939453E-6<br/>
    [1] => Begin Generation: 4.7922134399414E-5<br/>
    [2] => End Generation: 0.0015699863433838<br/>
    [3] => Begin Output: 0.0015778541564941<br/>
    [4] => End Output: 29.461009979248<br/>
)

And this is happening on the new, freshly installed server. 30 seconds to output ~250K of data? This is ridiculous.
-
Reinstallation of the webserver did not help. I used newer versions of the software too. I use the packages shipped with SLES10 SP2, so that would be Apache/2.2.3 and PHP/5.2.5. I reinstalled the server, but I still use the same config files as the other servers. I guess I can try tweaking the configs, but I don't know where to start. Now I am trying to isolate the problem: I created a small PHP script that outputs about 200K of random data, and I am timing its execution. I'll see if the problem happens there. How do you check the paged memory? memory_get_usage()?
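For what it's worth, memory_get_usage() reports the memory allocated to the PHP script itself, not OS-level paged memory; memory_get_peak_usage() gives the high-water mark. For example:

// Passing true reports the real size allocated from the system,
// including pages the allocator has not yet handed to the script.
echo memory_get_usage(true) . " bytes in use\n";
echo memory_get_peak_usage(true) . " bytes at peak\n";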
-
You must be kidding. :-) Of course I tried separating echo $this->Transform( $XSL_File ); onto its own line and then measuring the time. In addition, I am measuring time inside the Transform function. No doubt, transforming the content is one of the longest operations in the code. I know that, but on average it does not take more than a quarter of a second. It may get up to about 1.5 seconds for large content. But those are stable, reliable times and I account for them. It's the actual echoing of the content that lags for some reason. Keep in mind it does not happen all the time, only about 20 times a day. But when it happens, it effectively locks the particular user out of the site because of the session locking. At this point I strongly believe there is something wrong with my server and hope that a reinstall will fix it.
-
I tried calling ob_end_flush() before the output to disable buffering, but it still does not help. I am stumped. There is nothing left to do but reinstall the whole server.
-
Tokenizing the string did not help. It still lags.

global $runtimes;
$runtimes['xsl171'] = getExecuteTime();

//echo $this->Transform( $XSL_File );

// Echo the transformed content line by line instead of all at once.
$tok = strtok($this->Transform( $XSL_File ), "\n");
$runtimes['xsl174'] = getExecuteTime();
while ($tok !== false) {
    echo $tok . "\n";
    $tok = strtok("\n");
}
$runtimes['xsl179'] = getExecuteTime();

Error:

Warning
03-04-2009 08:50:01 (PST) -- (512)
Message: Reporting from basepage.php. Extremely long run time of 119.80654001236 seconds for script: /coursecontent.php
Times: Array
(
    [xsl171] => 0.70761203765869
    [xsl174] => 0.94678807258606
    [xsl179] => 119.8064661026
)

Times are in seconds.
-
Why? Are you speaking of PHP output buffering? If so, then why should it be avoided? I was actually thinking of giving it a try. Splitting the string into smaller ones using strtok seems to be doing the trick; I haven't gotten the error since implementing it this morning. I'll keep this post updated.