
any way to ignore a value from $result = select * from 'tableName'; ?


schalfin


OK, I'm trying to learn PHP and I sometimes get overwhelmed and frustrated and will overlook the obvious. If that's the case here, I apologize in advance.

 

I want to display entries from a table but I do not want the most recent entry to display. I'm not sure how to query my database table to ignore the most recent entry.

 

Basically my thinking is that I want to do a $result = @mysql_query("select id, entry_date, entry_text from myTable WHERE <and some parameters to ignore the most recent entry in table>");

 

I'm sure there is a way to identify the most recent entry by id or entry_date, or to ignore an array key value, but I haven't been able to do that successfully just yet. Any information that might be useful would be appreciated. One more note: the number of entries will constantly increase as the table gets updated with information.

 

Thanks

Scott

 

 


$sql = "SELECT * FROM table ORDER BY Date DESC"; // Most recent first
$query = mysql_query($sql) or die("Error101");
while ($row = mysql_fetch_array($query))
{
    // read & display data here, e.g. echo $row['Surname'];
}

 

If you don't want the first row, which is the most recent, then just skip it inside the loop.

 

 


Sorry, if you want to skip the first record:

$sql = "SELECT * FROM table ORDER BY Date DESC"; // Most recent first
$query = mysql_query($sql) or die("Error101");
$first = 0;
while ($row = mysql_fetch_array($query))
{
    if ($first == 1)  // ignore first record
        // read & display data here, e.g. echo $row['Surname'];
    $first = 1;
}

 


Sorry, if you want to skip the first record:

$sql = "SELECT * FROM table ORDER BY Date DESC"; // Most recent first
$query = mysql_query($sql) or die("Error101");
$first = 0;
while ($row = mysql_fetch_array($query))
{
    if ($first == 1)  // ignore first record
        // read & display data here, e.g. echo $row['Surname'];
    $first = 1;
}

 

What you've done here actually ignores the second row. See, you set $first = 0, so the first loop iteration will execute normally (which is also the first record that is supposed to be ignored); then (if $first++ existed) the second iteration of the loop would be true, since $first now equals 1, and would ignore what is now the second row.
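Either way, the usual idiom is to skip while the flag is unset and set it on the way out of the first iteration. A minimal sketch of that skip-the-first-row loop, illustrated here in Python against an in-memory SQLite table (since the thread's mysql_* API is long deprecated; the table and column names are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myTable (id INTEGER PRIMARY KEY, entry_text TEXT)")
conn.executemany("INSERT INTO myTable (entry_text) VALUES (?)",
                 [("oldest",), ("middle",), ("newest",)])

shown = []
first = True
for row in conn.execute("SELECT id, entry_text FROM myTable ORDER BY id DESC"):
    if first:          # this is the most recent row: skip it, clear the flag
        first = False
        continue
    shown.append(row[1])

print(shown)  # ['middle', 'oldest']
```

The flag is tested before the display and cleared inside the skip branch, so exactly one row (the most recent) is dropped no matter how many rows follow.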

How about:

select id, entry_date, entry_text from myTable WHERE 
entry_date < (SELECT MAX(entry_date) FROM myTable)

 

If you don't want certain data, tell the database not to send it.  Don't try to code around it.  Of course, this will ignore ALL rows that have the greatest entry_date.  If two rows end up with the same entry_date, they will both be ignored.  You could use

 

select id, entry_date, entry_text from myTable WHERE 
id < (SELECT MAX(id) FROM myTable)

if the id column is an auto-increment column.  It's not best-practice, but it produces exactly the result you specified.
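Both subquery variants are easy to sanity-check. Here's a quick illustration in Python with an in-memory SQLite database standing in for MySQL (the subquery semantics are the same); the sample rows are made up for the demo, including a deliberate tie on the greatest entry_date:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE myTable (id INTEGER PRIMARY KEY, entry_date TEXT, entry_text TEXT)"
)
conn.executemany(
    "INSERT INTO myTable (entry_date, entry_text) VALUES (?, ?)",
    [("2009-01-01", "a"), ("2009-01-02", "b"),
     ("2009-01-03", "c"), ("2009-01-03", "d")],  # two rows tie for newest date
)

# entry_date version: drops BOTH rows sharing the greatest date
by_date = conn.execute(
    "SELECT entry_text FROM myTable "
    "WHERE entry_date < (SELECT MAX(entry_date) FROM myTable)"
).fetchall()

# id version: drops exactly one row, the one with the highest id
by_id = conn.execute(
    "SELECT entry_text FROM myTable WHERE id < (SELECT MAX(id) FROM myTable)"
).fetchall()

print(by_date)  # [('a',), ('b',)]
print(by_id)    # [('a',), ('b',), ('c',)]
```

As the output shows, the date-based filter excludes every row tied for the newest date, while the id-based filter excludes exactly one row.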


Hi

 

If you have them in descending order you can just use limit

 

$result = @mysql_query("select id, entry_date, entry_text from myTable ORDER BY entry_date DESC LIMIT 1,9999999");

 

If you used a WHERE condition and had 2 most recent records with the same entry date, you would ignore both. The above would ignore just one of them (an arbitrary one).
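The MySQL "LIMIT 1, 9999999" trick (offset 1, then some huge row count) can be sketched like this; the demo uses Python with SQLite, where the equivalent spelling is LIMIT -1 OFFSET 1 (a negative limit means no upper bound), and the table contents are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myTable (id INTEGER PRIMARY KEY, entry_text TEXT)")
conn.executemany("INSERT INTO myTable (entry_text) VALUES (?)",
                 [("oldest",), ("middle",), ("newest",)])

# MySQL would write "LIMIT 1, 9999999"; SQLite spells the same idea
# "LIMIT -1 OFFSET 1" without needing a magic big number.
rows = conn.execute(
    "SELECT entry_text FROM myTable ORDER BY id DESC LIMIT -1 OFFSET 1"
).fetchall()

print(rows)  # [('middle',), ('oldest',)]
```

The offset drops exactly one row, the most recent, regardless of date ties.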

 

Or you could just do:-

 

$sql = "SELECT * FROM table ORDER BY Date DESC"; // Most recent first
$query = mysql_query($sql) or die("Error101");
if ($row = mysql_fetch_array($query))
{
    while ($row = mysql_fetch_array($query))
    {
        // Do processing
    }
}
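The same fetch-and-discard idea in modern terms: read one row to throw away the most recent entry, then loop over whatever remains on the cursor. A sketch in Python with an in-memory SQLite table (names invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, entry_text TEXT)")
conn.executemany("INSERT INTO t (entry_text) VALUES (?)",
                 [("oldest",), ("middle",), ("newest",)])

cur = conn.execute("SELECT entry_text FROM t ORDER BY id DESC")
processed = []
if cur.fetchone() is not None:     # pull and discard the most recent row
    for (text,) in cur:            # the loop handles whatever is left
        processed.append(text)

print(processed)  # ['middle', 'oldest']
```

No flag variable is needed because the cursor itself remembers that one row has already been consumed.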

 

All the best

 

Keith


Why waste resources (database and PHP) retrieving and processing data that is going to be ignored? It is much more efficient not to send the data "across the wire" in the first place. Yes, the code is perfectly capable of skipping a row, and for a one-time query it probably makes no difference. But in a page that could be requested by any number of users all over the world concurrently, we should write the most efficient code possible. Database engines are designed to store and retrieve data, and they do it very efficiently. Use the correct tool for the job.


True.  In this particular case.

 

But why not get in the habit of writing efficient code, so when a case comes up that requires 50 records to be ignored, you won't even consider using the front end to skip over them?

 

I've been in the programming and database world for more years than I care to mention, and one of my pet peeves is lazy programmers. In fact this whole world of interpreted languages that allow everybody and their brother to write code drives me nuts. I blame Bill Gates! Macros were fine until he got hold of them, turned them into a language, and made everyone think they could write code. People who have no idea what logic is are writing sloppy code and thinking they are "programmers". One of my philosophies of life is:

You only have to write the code once. The user has to use it repeatedly. Write efficient code, not quick, sloppy code.

 

Of course, if you own M$ or Intel, you love bloated code, because the users will have to buy more computing power to run it, which you are glad to sell, with a bloated O/S that requires more computing power ...

 

... climbs down off soapbox :-\

 


Hi

 

Fair point, although I would wonder whether the cost of interpreting and processing a LIMIT clause might well exceed the saving from avoiding the extra fetch. As such, using LIMIT could be seen as the "lazy" way to code it ;) .

 

In a similar way to your feelings, I am not that happy with using the limit statement with some random high value for the 2nd parameter. Just seems like a bodge and plain wrong.

 

No solution is particularly nice for this, you just have to decide which suits a particular situation.

 

All the best

 

Keith


Agreed -- if the LIMIT offset is more than a few percent of the total number of records returned, it's a waste.

 

Also, while I agree that it's great to program for the future (and I've been programming and working with databases for about 15 years), if you look too far into the future, nothing ever gets done.

