etrader Posted May 18, 2011

I have a simple PHP script with no MySQL database. Normally it takes less than 1 s to complete, but under high traffic (more than 10 concurrent processes) the execution time jumps to 10-30 s, which in turn drives the number of concurrent processes even higher. This behavior doesn't seem very common. Any ideas? By the way, can a huge error log slow the system down? I have numerous PHP notices, and the log keeps growing.

Link to comment: https://forums.phpfreaks.com/topic/236788-slow-php-process-for-concurrent-users/
pornophobic Posted May 18, 2011

Disk space generally has little bearing on system speed. But if your scripts throw tons of errors and you aren't fixing them, you could be slowing the system down because it's so busy logging them. Other things to consider: the quality of your host (what type of hosting is it? Free? Shared? VPS? Dedicated? Cloud?), whether the script has lots of includes or requires, whether it does a lot of file writing, etc.
etrader Posted May 18, 2011 (Author)

Thanks for your attention. I tried three different VPSs running CentOS 5.5 and Virtualmin. Testing with different PHP scripts, the main delay appears when they log these two types of errors:

PHP Notice: Undefined variable: ....
PHP Notice: Undefined offset: 0 .....
PFMaBiSmAd Posted May 18, 2011

PHP code shouldn't produce any errors, warnings, or notices during normal execution; only for abnormal things, like a legitimate visitor doing something your code didn't take into account, or a hacker trying to break into your script. Is there some reason you haven't corrected each problem that is producing an error? And realize that displaying or logging an error is only the last step in PHP's error response. PHP must still detect and handle each error as it occurs, and a statement that produces an error probably takes 10-20 times longer to execute than it would if you fixed the underlying problem.
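The notices above are typically eliminated by testing a value before using it, or supplying a default. A minimal sketch (the variable and array names are illustrative, not from the original script):

```php
<?php
// Each commented-out statement below raises a notice on every request;
// PHP has to build, handle, and log that notice before moving on.

$parts = explode(',', 'a,b');

// Notice-producing versions:
//   echo $parts[5];    // PHP Notice: Undefined offset: 5
//   echo $missing;     // PHP Notice: Undefined variable: missing

// Corrected: test with isset() before use, or supply a default.
$third = isset($parts[2]) ? $parts[2] : 'default';
echo $third, "\n";   // prints "default"

$name = isset($_GET['kw']) ? $_GET['kw'] : 'guest';   // no notice either way
echo $name, "\n";
```

Fixing each of these at the source removes both the log growth and the per-statement error-handling overhead.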
etrader Posted May 18, 2011 (Author)

The problem is exactly associated with this error:

[function.file]: failed to open stream: Connection timed out in ...

It comes from:

$file = "http://domain.com";
$content = file($file);

When I remove these lines, the execution time drops back to 1 s. So the problem is purely with file(); it seems not to work in my setup (probably something in Virtualmin's configuration).
PFMaBiSmAd Posted May 18, 2011

Is this file you are reading really a remote file on someone else's server, or is it a local file on your own server that you should be reading through the file system?
pornophobic Posted May 18, 2011

If you are using file() to open a URL, you need allow_url_fopen set to 1 (true) in php.ini. But even better than file() is file_get_contents(): it usually has a smaller memory footprint than file() and loads a remote file straight into a string (note that fetching a URL with either function requires allow_url_fopen). Also note that if your script loads remote files, it will not finish until it has received the remote content and done whatever you decide with the data collected.
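The "Connection timed out" error above suggests the remote fetch is also what is piling up requests under load. One way to bound the damage is a stream-context read timeout, so a slow remote host can't stall each request for PHP's default 60 seconds. A sketch, reusing the placeholder URL from the earlier post:

```php
<?php
// Fetch the remote file with an explicit timeout instead of letting
// file()/file_get_contents() block for the default socket timeout.
$url = 'http://domain.com';

$context = stream_context_create(array(
    'http' => array('timeout' => 3), // give up after 3 seconds
));

$content = @file_get_contents($url, false, $context);
if ($content === false) {
    $content = ''; // degrade to an empty widget instead of hanging
}
```

With the error suppressed and a fallback in place, a dead remote host costs at most 3 seconds per request rather than tying up processes until they snowball.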
etrader Posted May 18, 2011 (Author)

The file is a simple text file containing keywords, and I want to read them into an array; that's why I use file() instead of file_get_contents(). It's my own file, but not on the same server.

"Also note that if your script loads remote files, it will not finish until it has received the remote content and done whatever you decide with the data collected."

Is there a way to avoid this? I use the remote file only for a side widget, not the main content, and I don't want to delay the main content for it.
pornophobic Posted May 18, 2011

"The file is a simple text file containing keywords, and I want to read them into an array; that's why I use file() instead of file_get_contents(). It's my own file, but not on the same server."

Depending on how the text file is formatted, you could use something like explode() to split it into an array and still save memory and load time. If you really insist on using file(), you will need to change your php.ini (how to).

"Is there a way to avoid this? I use the remote file only for a side widget, not the main content, and I don't want to delay the main content for it."

You could add output control to your scripts, which takes some time and effort but is worth it in the long run; it gives you greater control over how your site loads. Or you can use AJAX to load the keywords. If the keywords were going into your <head> tags, that wouldn't be effective, since search-engine spiders don't read JavaScript-generated content; but you said it's for a widget, and AJAX is ideal for a widget that loads its keywords after/while the rest of the page loads. Servers usually have a faster line than most domestic connections, so if your text file is small it shouldn't take long to load at all. You could also cache the file locally, grabbing the keywords and refreshing them every so often, depending on what you want it to do. Those are all suggestions, but AJAX is the way I would go.
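The file-caching suggestion above can be sketched in a few lines. This is a minimal, assumption-laden example: the URL, cache path, and refresh interval are all hypothetical, and the remote fetch is skipped entirely whenever the cached copy is still fresh, so the widget never blocks the main page on the remote host:

```php
<?php
// Simple local cache for a remote keyword list: refetch at most once
// per $ttl seconds; on fetch failure, keep serving the old cached copy.
$url   = 'http://domain.com/keywords.txt';  // hypothetical remote list
$cache = '/tmp/keywords.cache';             // hypothetical cache path
$ttl   = 600;                               // refresh every 10 minutes

if (!file_exists($cache) || (time() - filemtime($cache)) > $ttl) {
    $fresh = @file_get_contents($url);      // may fail; that's OK
    if ($fresh !== false) {
        file_put_contents($cache, $fresh);
    }
}

// Build the keyword array from the local copy: split on newlines,
// trim whitespace, and drop empty lines.
$raw      = file_exists($cache) ? file_get_contents($cache) : '';
$keywords = array_filter(array_map('trim', explode("\n", $raw)));
```

Only one request per $ttl window ever touches the network; every other request reads a local file, which brings the widget back to local-filesystem speed.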