fivestringsurf Posted September 6, 2019

Hi, I am doing some scraping and processing in PHP that is time intensive. On my local machine it was no big deal, but now that the project is live and has real-time users daily, I have some concerns. When PHP is processing a long request, does that mean other people can't connect to the server (i.e. they are queued until the process I am running finishes)? I hope not, but my local testing suggests it IS that way. Here is what I did: I ran a PHP script from my browser that took a few minutes to execute. While it was running, I tried accessing the same local site from another browser tab and it halted. If this is in fact how PHP works on the live server, how would I go about running roughly 6,000 processes daily that would consume ~1-4 hours of processing time? Thanks.
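For reference, something like the following is the kind of deliberately slow request being described; the file name, the session call, and the two-minute sleep are all made up for illustration, not taken from the original post:

```php
<?php
// slow.php - a deliberately slow request, used only to test whether one
// long-running PHP request holds up a second request from the same browser.

session_start();     // sessions matter for this test; see the session-locking explanation later in the thread

set_time_limit(0);   // don't let PHP abort the script at max_execution_time
sleep(120);          // stand-in for a couple of minutes of scraping/processing

echo "done after 120 seconds";
```

Requesting slow.php in one tab and then loading any other page of the same site in a second tab reproduces the "halted" behaviour; the replies below explain why the second tab can end up waiting.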
ginerjm Posted September 6, 2019

Perhaps you need to examine your long-running process for better ways to perform whatever task it is doing. Are you using repetitive queries instead of one overall one? Are you running a query inside a loop? These are things to avoid. A process that takes "a few minutes" to execute is either working on hundreds of thousands of records or is not written properly.
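As an illustration of the query-in-a-loop problem being described, a minimal sketch; the users table, the id column, and the $pdo connection are assumptions for the example, not anything from the thread:

```php
<?php
// Assumes $pdo is an existing PDO connection and $userIds is a list of ids.
$userIds = [1, 2, 3];
$rows = [];

// Anti-pattern: one query per item inside a loop (N round trips to the database).
foreach ($userIds as $id) {
    $stmt = $pdo->prepare('SELECT * FROM users WHERE id = ?');
    $stmt->execute([$id]);
    $rows[] = $stmt->fetch(PDO::FETCH_ASSOC);
}

// Better: one overall query that fetches everything in a single round trip.
$placeholders = implode(',', array_fill(0, count($userIds), '?'));
$stmt = $pdo->prepare("SELECT * FROM users WHERE id IN ($placeholders)");
$stmt->execute($userIds);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```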
fivestringsurf Posted September 7, 2019 (Author)

12 hours ago, ginerjm said:
Perhaps you need to examine your long-running process for better ways to perform whatever task it is doing. Are you using repetitive queries instead of one overall one? Are you running a query inside a loop? These are things to avoid. A process that takes "a few minutes" to execute is either working on hundreds of thousands of records or is not written properly.

Yes and yes. I've already broken the tasks down so that each one makes no more than a single HTTP request. But when I go to complete a bunch of tasks they can all stack up and take a while. The "few minutes" was me deliberately making a task take longer so I could test my theory locally about PHP locking out others' requests... which is still the question I have: when PHP is processing, does that mean other people can't connect to the server (i.e. they are queued until the process I am running finishes)?
requinix Posted September 7, 2019

15 hours ago, fivestringsurf said:
When PHP is processing, does that mean other people can't connect to the server (i.e. they are queued until the process I am running finishes)?

Not even remotely the case. Locally, were you using the built-in server that PHP provides? Don't. It's good for quick stuff, but it's not a real server. Set up your development environment to match your production environment as closely as possible.
fivestringsurf Posted September 7, 2019 (Author)

3 hours ago, requinix said:
Not even remotely the case. Locally, were you using the built-in server that PHP provides? Don't. It's good for quick stuff, but it's not a real server. Set up your development environment to match your production environment as closely as possible.

Oh, that's good to know. Locally I was using the Apache that ships with macOS. I'm currently running the local Laravel server (via http://127.0.0.1:8000), but I also remember seeing the exact same behaviour when running XAMPP. As long as the actual LAMP stack on the server can handle concurrent requests, though, I don't think I have much to worry about. (As an aside, I tried looking this up on Google but found it really hard to word the question in a way that turned up the info I was after!) Thank you.
kicken Posted September 7, 2019

You probably saw the behavior you did because of sessions. When you start a session, PHP locks the session data so that it's not affected by other processes. That lock is held until you either call session_write_close or the process ends. So if your long-running process doesn't need to update any session data, call session_write_close before starting the heavy work.

That said, lots of concurrent long-running processes could block a server. Your HTTP server handles each request using a thread or a worker process (depending on its configuration). It will only spin up a certain number of these, based on that configuration, and if the limit is reached it stops responding to new requests. Your long-running processes would tie up some of those threads. The number of threads available on a real server will probably be relatively high, though, so unless you expect a lot of these processes to run concurrently it likely won't be an issue. If the server is set up with something like PHP-FPM or a CGI setup, the number of allowed PHP instances may be smaller; your effective limit is whichever is smaller, the HTTP server's limit or PHP's limit.

If you want to keep your site responsive, the way to manage this is to offload the work to a background process so that your website can continue to respond to requests. The user goes to the page that triggers the processing and you respond with a message like "We're working on your request, check back in a bit". When the process is complete, give the user the results. There are many ways to accomplish this, such as using services like Redis, Gearman, or Beanstalkd, or simply adding records to your database and having a background service check for new records periodically.
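A minimal sketch of both suggestions, assuming an existing PDO connection in $pdo and a hypothetical jobs table; none of the table or column names come from the thread:

```php
<?php
// Web request: release the session lock, queue the work, respond immediately.

session_start();
$userId = $_SESSION['user_id'] ?? null;   // read whatever session data is needed first
session_write_close();                    // lock released; other requests from this user proceed

// Record a job for a background worker instead of doing the slow work here.
// The "jobs" table and its columns are made up for this sketch.
$stmt = $pdo->prepare(
    'INSERT INTO jobs (user_id, payload, status, created_at)
     VALUES (?, ?, ?, NOW())'
);
$stmt->execute([$userId, json_encode(['task' => 'scrape']), 'pending']);

echo "We're working on your request, check back in a bit.";
```

A background script run from cron (or as a daemon) would then pick up pending jobs and process them outside the web request:

```php
<?php
// worker.php - hypothetical background worker polling the jobs table.

$stmt = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' ORDER BY created_at LIMIT 10");
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $job) {
    // ... do the actual scraping/processing for this job here ...

    $done = $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?");
    $done->execute([$job['id']]);
}
```

Redis, Gearman, or Beanstalkd would replace the jobs table with a proper queue, but the polling approach above is enough to move the 1-4 hours of daily processing out of the web requests.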