benphelps Posted July 18, 2009
I have a script that forks out PHP processes, and each one makes a new MySQL connection. I fork the processes because each job can run anywhere from 0.5 s to 4-5 min, and I need them all to complete as fast as possible; I can't wait for the longer-running jobs to finish in a while loop. Each batch starts 80-200 processes, and each process makes about 10 MySQL changes. When I start the forked processes, MySQL's RAM usage jumps to about 300 MB. Would using MySQLi speed up the processes or lower memory usage? Any insight will be appreciated.
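As background on the fork-per-job setup described above, here is a minimal sketch of a capped worker pool using `pcntl_fork` (assumes CLI PHP with the pcntl extension; the job callables and the commented-out credentials are placeholders, not anything from the original script). The key detail is that every child must open its own MySQL connection rather than reuse one inherited from the parent, since two processes sharing one socket will corrupt the protocol:

```php
<?php
// Sketch: fork a capped pool of workers. Each child would open its
// OWN mysqli connection -- never reuse the parent's handle.

function run_jobs(array $jobs, int $maxChildren): array
{
    $running = [];   // pid => job id
    $status  = [];   // job id => exit code

    foreach ($jobs as $id => $job) {
        // Throttle: wait for a free slot before forking the next child.
        while (count($running) >= $maxChildren) {
            $pid = pcntl_wait($exit);
            $status[$running[$pid]] = pcntl_wexitstatus($exit);
            unset($running[$pid]);
        }

        $pid = pcntl_fork();
        if ($pid === -1) {
            throw new RuntimeException('fork failed');
        }
        if ($pid === 0) {
            // Child: fresh connection per process (placeholder creds).
            // $db = new mysqli('localhost', 'user', 'pass', 'dbname');
            exit($job());            // job returns its exit code
        }
        $running[$pid] = $id;        // parent bookkeeping
    }

    // Reap the stragglers.
    while ($running) {
        $pid = pcntl_wait($exit);
        $status[$running[$pid]] = pcntl_wexitstatus($exit);
        unset($running[$pid]);
    }
    return $status;
}
```

The cap matters for the memory question: with 80-200 simultaneous children each holding a connection, both PHP and MySQL memory scale with concurrency, so limiting the pool size directly limits peak RAM.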
redarrow Posted July 18, 2009
I think the way you're updating the information is always going to take a long time. Is the project using cURL?
benphelps Posted July 18, 2009
Yes, it uses cURL.
redarrow Posted July 18, 2009
I think a flat file would make no difference, because all of the info still needs collecting. The only thing I can think of is faster bandwidth. It sounds like each fork the program makes keeps the bandwidth busy, causing your program to slow down?
benphelps Posted July 18, 2009 Author Share Posted July 18, 2009 The server is on a 100/100mb/s connection and most of the pages are blank or have little output, the server might peak at about .5-1mB/s when the batch runs. I'm just trying to improve the performance of the system. Quote Link to comment Share on other sites More sharing options...
redarrow Posted July 18, 2009 Share Posted July 18, 2009 Can i no why you need to use curl in this instance please. looks like your using curl for a unknown reason. Quote Link to comment Share on other sites More sharing options...
benphelps Posted July 18, 2009 Author Share Posted July 18, 2009 Simulated cron jobs: http://cronless.com/ I'm up to about 6,000 jobs (about 6,500 users) and I'm noticing a performance drop and looking to speed it up. Quote Link to comment Share on other sites More sharing options...
nadeemshafi9 Posted July 18, 2009
If you're doing any SELECT statements, put an index on the tables. Create a singleton connection class, store the instance in the session, and control all the processes' connections from that one method if you can. Try to do any processing in PHP rather than SQL, and reduce regex use. You could cluster some servers and connect to different ones using the singleton class; that way you'd be working with different MySQL servers that all hold the same data. In the connection class, check whether there is already a connection to one server, and if there is, connect to cluster server 2 instead, etc. You could also limit the number of processes to see if that helps, limit the number of queries made, and put them on the back burner.
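The "singleton connection class" suggestion above can be sketched as follows (credentials are placeholders, and the class name is made up for illustration). The connection is created lazily on first use, so repeated calls in one request reuse one handle; note that this is per-process only, so each forked child would still get its own instance:

```php
<?php
// Sketch: one shared mysqli handle per process, created on first use.

class DB
{
    private static ?DB $instance = null;
    private $conn = null;            // mysqli handle, or null until needed

    private function __construct() {}   // block `new DB()` from outside

    public static function getInstance(): DB
    {
        if (self::$instance === null) {
            self::$instance = new DB();
        }
        return self::$instance;
    }

    public function conn(): mysqli
    {
        if ($this->conn === null) {
            // Connect lazily, only when a query actually needs it.
            // Placeholder credentials:
            $this->conn = new mysqli('localhost', 'user', 'pass', 'dbname');
        }
        return $this->conn;
    }
}
```

Usage would be `DB::getInstance()->conn()->query(...)` everywhere, so the process never opens a second connection by accident.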