Concurrent connections per file


Eren


Hey! I am planning to make a not-so-complicated online resource management game in PHP. I'm going to use shared hosting, and my hosting provider limits my website's concurrent connections to 100 (as a protection measure). They say that if there are, for example, 105 connections at the same time, the last 5 will have to wait for the other requests to complete. That's understandable. But I'm wondering if this applies to PHP pages as well.

 

For example, let's say I have a return_resources.php that returns data from MySQL in JSON format, and assume each request takes 0.2 seconds. What happens if there are 105 requests at the same time? Will it return the MySQL data for the first 100 users and then the remaining 5 (assuming there are no requests to other files)? Or will it return the data one by one (this is what I'm afraid of), which could make the last user wait 105 × 0.2 = 21 seconds?

 

If the second option is the answer, is there a way to make the code async so it can serve more users at the same time?

 

Thanks.


Let's start with the basics here. Limit of connections from what to what?

 

Who is your host, and where is this documented?

 

My best guess is that this refers only to connections from the InterWeb to your web server. So I would assume that means that they will allow you to have 100 TCP connections at any one moment.

 

When the 101st client connects, it is not clear what will happen, because we don't have any information on how this connection limit is handled. It could be that the client does not wait, but instead receives an error message stating their connection was refused.

 

You should look into load-testing tools and simulate this to find out exactly what will happen. For example, apache bench (ab) is a simple one, but there are probably a gazillion other options you can find with a little googling. Here's one written in Python, for example: https://github.com/tarekziade/boom
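With ab, something like this would do it (the URL is just a placeholder for wherever your script lives):

# 1000 requests total, 105 at a time, to see how the 100-connection limit behaves
ab -n 1000 -c 105 http://yoursite.example/return_resources.php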

 

There is nothing you are going to be able to do in code to get around the limitations your host is enforcing. With that said, having 100 concurrent users would mean you have a fairly successful game. It could be a long long time before you ever hit that threshold.


Thanks for your reply! Here is a direct translation of my host's documentation:

 

Entry Process limit

 

Entry Processes are a protection against DDoS and botnet attacks. This is also known as the "Apache concurrent connection limit". Each PHP process, cron job, or SSH connection counts as 1 process. The limit is 100; after 100 connections, further connections to the website will hang at loading and wait in a queue for other processes to finish.

 

Also, 100 concurrent connections are plenty for me. But I was wondering whether requests to the same file get queued behind one another or not.

 

Basically, what I'm asking is: if there are 100 requests to one specific file that returns data from MySQL, would the server execute the same function with different queries 100 times at the same time, or would it wait for each call to finish before starting the next?
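Something like this, simplified (the database, table, and column names are just placeholders):

<?php
// Simplified sketch of return_resources.php -- names are placeholders,
// and real code would authenticate the user first.
$pdo = new PDO(
    'mysql:host=localhost;dbname=game;charset=utf8mb4',
    'db_user',
    'db_pass',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
);

// Each request runs its own query; there is no shared state between requests.
$stmt = $pdo->prepare('SELECT wood, stone, gold FROM resources WHERE user_id = ?');
$stmt->execute(array((int) $_GET['user_id']));

header('Content-Type: application/json');
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));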


Given these criteria, no, things are not serialized. Each child process runs concurrently. A queue is inherently serialized, but if 3 processes complete and free up slots for 3 new PHP processes, those will all execute just as fast as they are able to, regardless of whether they are all running the same script.

 

I will say that there are a number of rules of thumb you should consider in your application design. For example:

 

  • You should try to cache and minimize the re-reading of data. That might be hard on a shared host, but it is why people use redis, memcached, or something similar (see the caching sketch after this list).
  • You should have a production opcode cache turned on. Modern versions of PHP have one built in (OPcache); in the older days most people used APC. Investigate what your host provides as a platform.
  • If you can, use InnoDB with MySQL, ideally with a substantial innodb_buffer_pool_size allocation. Since you are on a shared host you won't have control over this, but things may run better for you if the host has provisioned a good-sized buffer pool shared by everyone on the MySQL server. InnoDB caches table and index data in that buffer pool, and it is much better for MySQL to read from memory than to go to disk, especially if you can't utilize an intermediary cache.
  • Utilize queues to handle things that should be asynchronous. There are any number of these situations, but as a quick example: if your system sends emails to users for events like registration or lost passwords, those should be queued. That lets the response to the user be immediate and lets them get back to whatever they were doing. Of course, you need some sort of queue; in a highly limited environment the best you may be able to do is use the database itself (see the sketch after this list). There is a library that supports multiple queue platforms: https://github.com/bernardphp/bernard. Initially you would have to use its Doctrine/PDO driver, but later, once you have a VM or AWS, you could upgrade on a new host by simply switching the driver to another supported queue platform.
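
To make the caching point concrete, here's a rough sketch using the Memcached extension, if your host happens to offer it. The key name, the 30-second TTL, and the $pdo/$userId variables are just assumptions carried over from your example:

<?php
// Read-through cache sketch -- assumes the Memcached extension and a
// reachable memcached server; key name and TTL are arbitrary examples.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key  = 'resources_' . $userId;
$data = $mc->get($key);

if ($data === false) {
    // Cache miss: hit MySQL once, then store the row for later requests.
    $stmt = $pdo->prepare('SELECT wood, stone, gold FROM resources WHERE user_id = ?');
    $stmt->execute(array($userId));
    $data = $stmt->fetch(PDO::FETCH_ASSOC);
    $mc->set($key, $data, 30);
}

echo json_encode($data);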
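
And for the queue idea, a bare-bones illustration of using the database as the queue. This is plain PDO, not bernard's actual API, and the email_queue table is hypothetical:

<?php
// Web request side: enqueue the email instead of sending it inline.
$pdo->prepare('INSERT INTO email_queue (recipient, subject, body) VALUES (?, ?, ?)')
    ->execute(array($address, 'Welcome!', $welcomeBody));

// Worker side (run from cron): drain a batch and send.
$jobs = $pdo->query('SELECT id, recipient, subject, body FROM email_queue LIMIT 50');
foreach ($jobs as $job) {
    mail($job['recipient'], $job['subject'], $job['body']);
    $pdo->prepare('DELETE FROM email_queue WHERE id = ?')->execute(array($job['id']));
}

Keep in mind that, per the documentation you quoted, each cron run counts as an entry process too, so keep the worker lightweight and infrequent.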

Archived

This topic is now archived and is closed to further replies.
