
I'm finding more need to paginate results and allow for ascending/descending column sorting.

 

I know how to do this with SQL (LIMIT and ORDER BY).

 

The problem I'm facing is that most of the returned values have calculations applied to them in a PHP loop, so I can't sort by them at the SQL level.

 

For Example:

 

$output = array();
foreach ($db_results as $row) {
    // The calculated column can't be produced (or sorted) in the query itself.
    $row['column'] = fancyCalculation($row['some_column']);
    $output[]      = $row;
}

 

The "fancyCalculation" funciton includes other SQL calls and bunch of stuff that would be difficult to incorporate into a JOIN or built in MySql math functions.

 

I've been able to sort SOME columns using huge SQL queries that are around 30 lines long, have a ton of JOINs, and take forever to run. So it's not impossible, just highly impractical.

 

In my OOP world, I like keeping things simple. I'd rather run functions on returned DB values than maintain huge SQL queries.

 

I've experimented with processing the entire result set and then sorting the resulting array with usort(), but on huge result sets it takes forever, so that's also impractical.
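For reference, that sort-in-PHP experiment looks roughly like this (the column names and page numbers are just placeholders):

// Calculate the derived column for every row, then sort and slice in PHP.
$output = array();
foreach ($db_results as $row) {
    $row['column'] = fancyCalculation($row['some_column']);
    $output[]      = $row;
}

// Sort by the calculated column, descending.
usort($output, function ($a, $b) {
    if ($a['column'] == $b['column']) {
        return 0;
    }
    return ($a['column'] < $b['column']) ? 1 : -1;
});

// Paginate: page 3 at 25 rows per page.
$perPage  = 25;
$page     = 3;
$pageRows = array_slice($output, ($page - 1) * $perPage, $perPage);

The whole result set still has to be calculated before a single page can be shown, which is where all the time goes.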

 

My overarching question is: does anybody have a philosophy for dealing with this kind of thing?

Hmm, that will solve at least one of my problems. I'll still need to run calculations on the entire dataset, making the first page load very time-consuming, but I don't think there's any way around that.
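Roughly what I have in mind for softening that first hit is to cache the fully calculated array and page from the cache afterwards (the cache file location and lifetime here are just placeholders):

$cacheFile = '/tmp/report_cache.dat'; // placeholder location
$cacheTtl  = 3600;                    // rebuild at most once an hour

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $cacheTtl) {
    // Subsequent page loads skip the expensive loop entirely.
    $output = unserialize(file_get_contents($cacheFile));
} else {
    // The first page load still pays the full calculation cost.
    $output = array();
    foreach ($db_results as $row) {
        $row['column'] = fancyCalculation($row['some_column']);
        $output[]      = $row;
    }
    file_put_contents($cacheFile, serialize($output));
}

// Sorting and slicing the cached array is cheap from here on.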

 

I'll play around with it, thanks!

There's a lot of foreign key stuff, like the sum of everything from one table multiplied by a percentage found in another table. If you're a SQL genius you might be able to do all of this in a single query with a whole lot of JOINs, sub-queries, and MySQL's math functions, but it ends up being a monster query that takes forever to run and isn't very organized. And there are still some calculations that I absolutely cannot figure out how to do in SQL alone.
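To make the "sum from one table times a percentage from another" case concrete, the single-query version ends up looking something like this (table and column names here are made up):

// Hypothetical single-query version: total per customer, scaled by a rate
// from another table, then sorted and paginated on the calculated value.
$sql = "
    SELECT c.id,
           c.name,
           SUM(o.amount) * r.percentage AS calculated_total
    FROM   customers c
    JOIN   orders    o ON o.customer_id = c.id
    JOIN   rates     r ON r.customer_id = c.id
    GROUP  BY c.id, c.name, r.percentage
    ORDER  BY calculated_total DESC
    LIMIT  25 OFFSET 50
";

One or two of those are fine; it's when every sortable column needs its own chain of JOINs and sub-queries that it turns into the 30-line monster.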

You know, you're going to have to fine-tune this one. It isn't going to be solved by doing everything in PHP or everything in MySQL. One of the best things you can do, if the data is fairly static, is to pre-calculate part of the dataset and store it: everything older than a day (so that it doesn't change) gets calculated and the values go into a table, and then you adjust the values for what is happening today on the fly. Cron a job to update the pre-calc tables daily.
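A rough sketch of the idea, with the table, columns, and connection details all made up:

// precalc_cron.php -- run once a day from cron (names are placeholders).
// Pre-calculates everything older than today so that page loads only have
// to run fancyCalculation() on today's rows.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

$result = $mysqli->query(
    "SELECT id, some_column FROM source_table WHERE created_at < CURDATE()"
);
$stmt = $mysqli->prepare(
    "REPLACE INTO precalc_totals (row_id, calculated_value) VALUES (?, ?)"
);

while ($row = $result->fetch_assoc()) {
    $value = fancyCalculation($row['some_column']);
    $stmt->bind_param('id', $row['id'], $value);
    $stmt->execute();
}

The page-load code then reads mostly from precalc_totals and only calculates today's rows on the fly.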

 

There is no easy solution for this one, and you are going to need to fail a few times before you get going in the right direction. I'd spend some time with a whiteboard personally. Get the whole thing mapped out and start seeing if there is anything that can be done to optimize it. Go to a user group (bring the whiteboard) and show them what you've got. I'm sure there will be someone with insight, but this might be too much for an internet forum. It doesn't hurt to post a few specifics, though.

Thanks for the info; pre-calculating with a cron job or caching the dataset could definitely help with some of these. As you say, I'm going to have to whiteboard this one out.

 

I'll post an example here of what I'm trying to accomplish later today. It's a bit long-winded and may take some time to put together in an understandable way, even in code form.

 

Thanks for the help, guys.
