
thepip3r

Modification on Query Statement...


PHP 5, MySQL 4.?

I've written a site that tracks weekly technician information. It's simple but effective. The meat of the site is managed by three tables: ids, users, and jobs. When a new job is entered and multiple technicians worked on it, multiple records are entered into the "ids" table, something like jobID 1 / userID 35, jobID 1 / userID 50. Right now I query by selecting all jobs for a specific timeframe (specified by the user-built query), which grabs every job in a given Unix-timestamp range. After that, I loop through those jobs to grab all of the users who may have worked on them and display the results on the search-results page.
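
(To make the layout concrete, here is a hypothetical pair of rows for a two-technician job, using the column names from the table dump further down:)

INSERT INTO ids (jobID, userID) VALUES (1, 35);
INSERT INTO ids (jobID, userID) VALUES (1, 50);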

My problem lies with growth. It isn't a problem now, but I can see it becoming one. Information is organized by office name, and a couple of offices besides mine have asked to use the site. That's fine, except that when a job timeframe is queried, the loop walks every office's jobs just to find the users from the appropriate office. I CAN do it with complex joins:

SELECT * FROM ids RIGHT JOIN users ON users.id=ids.userID RIGHT JOIN jobs ON jobs.id=ids.jobID WHERE users.officeName='Variable'

The only problem is that I get multiple records back for the same job when multiple users worked on that job. Does anyone know a better way to do it? If that's the best way, my quandary is finding an efficient way to gather the duplicate user information for each job so I can present it in a quasi-readable format. Any and all help is greatly appreciated.
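
(If it helps to visualize: since every row in ids should match both a user and a job, plain INNER JOINs work here, and an ORDER BY on the job id makes the duplicate rows for each job arrive together so the display loop can merge them. A sketch only; note the prose above says users.officeName, while the dump further down calls the table user and the column officeSymbol:)

SELECT jobs.id, jobs.title, users.fname, users.lname
FROM ids
INNER JOIN users ON users.id = ids.userID
INNER JOIN jobs ON jobs.id = ids.jobID
WHERE users.officeName = 'Variable'
ORDER BY jobs.id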

I have just scanned through what you have written. I'd take a more detailed look if you listed your tables so I could see the structure of the data. Could you not use GROUP BY to get rid of the duplicates?

Well, I tried the GROUP BY clause using this:

SELECT * FROM ids RIGHT JOIN user ON user.id=ids.userID RIGHT JOIN jobs ON jobs.id=ids.jobID WHERE user.officeSymbol='SCBNA' AND jobs.entryDate>='1158537600' AND jobs.entryDate<='2159228800' GROUP BY jobs.id LIMIT 173, 30

and the problem is that it doesn't display every user per job; it only displays the last user. I need to know ALL users who worked on each job...
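
(For what it's worth: if the server is MySQL 4.1 or newer, GROUP_CONCAT can keep the GROUP BY and still list every technician. A sketch, assuming the tables dumped below:)

-- one row per job, with all technicians collapsed into one column
SELECT jobs.id, jobs.title,
  GROUP_CONCAT(CONCAT(user.fname, ' ', user.lname) SEPARATOR ', ') AS technicians
FROM ids
INNER JOIN user ON user.id = ids.userID
INNER JOIN jobs ON jobs.id = ids.jobID
WHERE user.officeSymbol = 'SCBNA'
AND jobs.entryDate >= '1158537600'
AND jobs.entryDate <= '2159228800'
GROUP BY jobs.id
LIMIT 173, 30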

Here is the dump of my MySQL tables:

#
# Table structure for table `ids`
#

CREATE TABLE ids (
  id int(11) NOT NULL auto_increment,
  jobID int(11) NOT NULL default '0',
  userID int(11) NOT NULL default '0',
  PRIMARY KEY  (id)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
# --------------------------------------------------------

#
# Table structure for table `jobs`
#

CREATE TABLE jobs (
  id int(11) NOT NULL auto_increment,
  title varchar(75) NOT NULL default '',
  category varchar(22) NOT NULL default '',
  entryDate varchar(20) NOT NULL default '',
  description text NOT NULL,
  duration int(11) NOT NULL default '0',
  PRIMARY KEY  (id)
) ENGINE=MyISAM DEFAULT CHARSET=latin1;
# --------------------------------------------------------

#
# Table structure for table `user`
#

CREATE TABLE `user` (
  id int(11) NOT NULL auto_increment,
  username varchar(75) NOT NULL default '',
  rank varchar(5) NOT NULL default '',
  fname varchar(20) NOT NULL default '',
  lname varchar(20) NOT NULL default '',
  base varchar(20) NOT NULL default '',
  organization varchar(10) NOT NULL default '',
  officeSymbol varchar(20) NOT NULL default '',
  firstAccessDate varchar(30) NOT NULL default '',
  active tinyint(1) NOT NULL default '1',
  admin tinyint(1) NOT NULL default '0',
  PRIMARY KEY  (id)
) ENGINE=MyISAM DEFAULT CHARSET=latin1;
