
Slow php process for concurrent users


etrader


I have a simple PHP script with no MySQL database. Normally it takes less than 1s to complete. But under high traffic (more than 10 concurrent processes), the execution time increases dramatically to 10-30s, which in turn drives the number of concurrent processes even higher. I don't think this behavior is very common. Any ideas?

 

Btw, can huge error logs make the system slow? I have numerous PHP notices and the log keeps growing.


Disk usage generally doesn't affect system speed. But if your scripts produce tons of errors and you aren't fixing them, you could be slowing the system down because it's so busy logging them.

Other things that you could consider are the quality of your host, what type of hosting is it? Free? Shared? VPS? Dedicated? Cloud?

Are there lots of includes or requires in the script?

Does the script do a lot of file writing?

etc.


Thanks for your attention. I tried on three different VPSs running CentOS 5.5 and Virtualmin. I tested with different PHP scripts; the main delay appears when they write these two types of errors:

 

PHP Notice:  Undefined variable: ....

PHP Notice:  Undefined offset:  0 .....


PHP code shouldn't produce any errors, warnings, or notices during its normal execution. Errors should only occur for abnormal situations, like a legitimate visitor doing something your code didn't take into account or a hacker trying to break into your script.

 

Is there some reason you haven't corrected each problem that is producing an error?

 

And you do realize that displaying or logging an error is only the last step in PHP's error-handling code. PHP must still detect and handle each error as it occurs, and a statement that produces an error probably takes 10-20 times longer to execute than it would if you corrected the underlying problem.
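To illustrate with a hypothetical sketch (not the poster's actual code): the two notice types quoted above are usually fixed by initializing variables before use and checking array offsets before reading them.

```php
<?php
// Hypothetical sketch: silencing the two notice types by fixing their causes.

// "PHP Notice: Undefined variable" -- initialize before first use:
$total = 0;          // without this line, $total++ below would raise the notice
$total++;

// "PHP Notice: Undefined offset" -- test the index before reading it:
$keywords = array("alpha");
$second = isset($keywords[1]) ? $keywords[1] : "";   // safe fallback, no notice

echo $second === "" ? "no second keyword\n" : $second . "\n";
```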


The problem is associated with exactly this error:

 

[function.file]: failed to open stream: Connection timed out in

 

The problem is related to

$file="http://domain.com";
$content=file($file);

 

When I remove these lines, the execution time drops to 1s.
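A "Connection timed out" stall like that usually means the script is inheriting PHP's default socket timeout (60 seconds by default) while waiting on the remote host. As a sketch, assuming allow_url_fopen is enabled: file() accepts a stream context, so you can cap the wait yourself and fall back gracefully.

```php
<?php
// Sketch: cap how long file() may wait on the remote host, so a slow
// connection cannot stall the whole page (URL is the one from the thread).
$context = stream_context_create(array(
    'http' => array('timeout' => 2.0),   // seconds; overrides default_socket_timeout
));
$file = "http://domain.com";
$content = @file($file, 0, $context);    // false on failure instead of a long hang
if ($content === false) {
    $content = array();                  // degrade gracefully: empty keyword list
}
```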

 

So the problem is solely with file(). It seems it does not work with my settings (probably Virtualmin's).

 


If you are using file() to open a URL, you need to set allow_url_fopen to On (1) in php.ini.
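For reference, the relevant php.ini line looks like this (the location of php.ini varies by setup; restart the web server after changing it):

```ini
; php.ini -- allow fopen wrappers such as file() and file_get_contents()
; to open http:// URLs
allow_url_fopen = On
```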

 

But even better than file() is file_get_contents().

It usually has a smaller memory footprint than file() and loads a remote file straight into a string. Note, though, that it relies on the same allow_url_fopen setting as file() for remote URLs.
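A side-by-side sketch of the difference, using a local temp file so it stands alone: file() returns an array of lines, file_get_contents() returns one string.

```php
<?php
// Side-by-side sketch with a local temp file (no network needed).
$path = tempnam(sys_get_temp_dir(), 'kw');
file_put_contents($path, "alpha\nbeta\n");

$text  = file_get_contents($path);   // one string: "alpha\nbeta\n"
$lines = file($path);                // array of lines, newlines kept

unlink($path);
```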

 

Also note that if your PHP script loads remote files, the page will not finish loading until the remote content has been received and processed.


The file is a simple text file containing keywords, and I want to put them into an array; this is the reason that I use file() instead of file_get_contents(). It's mine but not on the same server.

 

Also note that if your PHP script loads remote files, the page will not finish loading until the remote content has been received and processed.

 

Is there a way to avoid this? I use this remote file for creating a side widget, which is not the main content. I do not want to delay the main content for it.


The file is a simple text file containing keywords, and I want to put them into an array; this is the reason that I use file() instead of file_get_contents(). It's mine but not on the same server.

 

Depending on how the text file is formatted, you could use something like explode() to put the keywords into an array and still save memory and load time. If you really insist on using file(), you will need to change your php.ini as described above.
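A minimal sketch of that approach; the sample string stands in for the remote fetch:

```php
<?php
// Sketch: fetch once as a string, then split into the same kind of array
// file() would have produced. $raw stands in for file_get_contents($url).
$raw = "alpha\nbeta\ngamma";
$keywords = explode("\n", $raw);
// $keywords holds the three keywords, without trailing newlines
```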

 

 

Is there a way to avoid this? I use this remote file for creating a side widget, which is not the main content. I do not want to delay the main content for it.

 

You can add output control (output buffering) to your scripts, which requires some time and effort but is worth it in the long run. It gives you greater control over how your site loads.
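A rough sketch of the idea (names are illustrative): buffer and release the main content first, and only then run the slow widget fetch, so a remote timeout can never hold up the page body.

```php
<?php
// Sketch: build the main content first and hand it off before the slow
// widget fetch runs. In a real page you would echo $body and call flush()
// at that point instead of concatenating.
ob_start();
echo "main content\n";               // the heavy page body
$body = ob_get_clean();              // ready to send immediately

$widget = "keyword widget";          // stand-in for the slow remote file() call
$page = $body . $widget;
```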

 

OR

 

You can use AJAX to load the keywords. If the keywords are going into your <head> tags, though, you won't be able to do this: it just wouldn't be effective, since search-engine spiders don't read JavaScript-generated content.

You did say it was for a widget, though, so AJAX would be ideal for loading the keywords in after (or while) the rest of the page loads.

Servers usually have faster connections than most domestic lines, so if your text file is small, it shouldn't take long to load at all. You could also cache the keywords in a local file, grabbing and updating them every so often depending on what you want it to do.
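A sketch of that caching idea (function and file names are illustrative, not from the thread): keep a local copy of the remote keyword list and only re-download it when the copy is older than the TTL.

```php
<?php
// Illustrative cache: re-fetch the remote list only when the local copy
// is older than $ttl seconds; otherwise serve it from disk.
function cached_keywords($url, $cache, $ttl = 3600)
{
    if (!file_exists($cache) || time() - filemtime($cache) > $ttl) {
        $data = @file_get_contents($url);     // slow path, at most once per TTL
        if ($data !== false) {
            file_put_contents($cache, $data);
        }
    }
    return file_exists($cache) ? file($cache) : array();
}

// Usage with a pre-seeded cache, so no network is touched while it is fresh.
// The URL is a made-up example, not one from the thread.
$cache = tempnam(sys_get_temp_dir(), 'kwcache');
file_put_contents($cache, "alpha\nbeta\n");
$kw = cached_keywords("http://example.com/keywords.txt", $cache);
```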

Those are all suggestions, but AJAX would be the way I would go.

 

