
digitalenviron


Everything posted by digitalenviron

  1. Yes, this looks a lot like what I've been looking for. I'll start by using this and see if I can get further. I'm obviously inexperienced with OOP, which is why I asked this way. Thank you very much for your help. Don't worry about the connection pooling part; that was just something I was wondering about, not really important here. Thanks again.
  2. Hello, I have a question about the architecture of a PHP/MySQL application (on Fedora Core 7), with the goal of improving database access performance. The application I maintain has high traffic (about 25k unique visitors per day) and my goal is to scale it out. I've set it up with remote database access and have run into intermittent problems with database connections and the open-files limit. My question is specifically about the architecture of accessing the database through a class that establishes the connection link. The db object code looks like this:

        class dbd {
            var $db;   // Database type.
            var $dsn;  // Data source.
            // ...etc.

            /* Constructor. */
            function dbd($dsn, $dbLink = NULL, $path = "") {
                $this->conn = mysql_connect($this->db[4], $this->db[2], $this->db[3]);
                mysql_select_db($this->db[6]);
                if (mysql_errno()) {
                    die("Database '".$this->db[6]."' connection failed.");
                }
                if ($this->conn) {
                    return 1;
                }
                return 0;
            }

            /* Destructor. */
            function __destruct() {
                mysql_close($this->conn);
            }
        }

     The db class is then used in many functions throughout the application, always like this:

        $_DB = new dbd(CFG_DSN);
        $_DB->query(...);

     The effect is that for each page request, hundreds of database connections are opened and then closed. Watching the MySQL log, I see this happening about 200 times per page:

        nnn Connect user@host on
        nnn Init DB my_db
        nnn Quit

     Yet the page may only perform 10-20 queries, so most of those connections are wasted. This is no problem when MySQL runs locally, but with MySQL on a dedicated server the connections soon start to fail under load (I get client error code 2003, cannot connect; the web server logs also frequently report that it is unable to open any more files). I've raised the system open-files limits (this is on Fedora Core 7) and tried some hacks that retry the connect when it fails, but really I'm not sure the application's db access code is soundly implemented. The problems I need to address are:

     1. Is the db object implementation visibly wrong?
     2. How can db access be performed more efficiently? What I'd like is for only one (or just a few) connections to be open per session, destroyed at session end.
     3. Is there a way to pool these connections, so that if there is a burst of new sessions the old ones can be closed automatically?

     If someone has some quick pointers, please let me know. Otherwise, if someone has concrete experience in this area, please contact me so we can work out a solution. TIA
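[Editor's note] A common way to avoid the connect/disconnect churn described above is to share one link per request instead of constructing a fresh `dbd` (and thus a fresh `mysql_connect()`) in every function. The sketch below is hypothetical, not the poster's code: the class name `DbRegistry` and the injectable `$connector` callback are invented for illustration, and the demonstration uses a stub connector so it runs without a database. In the real application the callback would wrap `mysql_connect()`/`mysql_select_db()` (or, in modern PHP, create a `mysqli`/PDO handle), and the closure syntax requires PHP 5.3+.

```php
<?php
// Minimal sketch of a per-request shared connection (lazy singleton).
// DbRegistry and $connector are hypothetical names for illustration.
class DbRegistry
{
    private static $conn = null;

    // Return the one shared link, creating it on first use only.
    public static function get($connector)
    {
        if (self::$conn === null) {
            self::$conn = call_user_func($connector);
        }
        return self::$conn;
    }

    // Optional: forget the link (e.g. at end of request) so the
    // next get() reconnects.
    public static function reset()
    {
        self::$conn = null;
    }
}

// Demonstration with a stub "connector" that counts how often it runs;
// a real connector would return a mysql link resource instead.
$attempts = 0;
$connector = function () use (&$attempts) {
    $attempts++;
    return "fake-db-link";
};

$a = DbRegistry::get($connector);
$b = DbRegistry::get($connector);   // reuses the existing link, no reconnect
```

With this pattern, each page request opens at most one connection no matter how many places in the code ask for the database. MySQL's persistent connections (`mysql_pconnect()`) are a complementary option: they let each Apache worker process keep its link open across requests, which is the closest thing the old mysql extension offers to the pooling asked about in question 3.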
