Okay, cheers
Update:
So are there *any* instances where it would be more efficient to grab all the data (and move the processing into PHP) in a large-scale, frequently accessed system? I've seen it done for some search cases, so it must be beneficial sometimes?
And would it be appropriate to split up a data table with at least tens of thousands of rows in order to speed up data access?
e.g. by splitting a users table into 26 tables (users_a, users_b, users_c, ... users_z), or even into 676 tables (users_aa, users_ab, ... users_ba, users_bb, ... users_zz), or is this bad practice?
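To be concrete about the scheme I mean: each query would be routed to a table chosen from the first letter(s) of the username. A minimal sketch of that routing logic (the `shard_table` helper and table names are hypothetical, just to illustrate the idea):

```python
def shard_table(username: str, letters: int = 1) -> str:
    """Map a username to its shard table name.

    e.g. 'Alice' -> 'users_a' with letters=1,
         'bob'   -> 'users_bo' with letters=2.
    """
    prefix = username.lower()[:letters]
    # Only a-z prefixes map to a table in this scheme; anything else
    # (digits, empty names, non-ASCII) would need a catch-all table.
    if len(prefix) < letters or not prefix.isalpha():
        raise ValueError("username must start with %d letters" % letters)
    return "users_" + prefix

print(shard_table("Alice"))      # users_a
print(shard_table("bob", 2))     # users_bo
```

So every read and write first computes the table name, and any query spanning users (searches, JOINs, counts) would have to touch all 26 or 676 tables — which is part of what I'm asking about.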