Posts posted by asmith

  1. Hi,

     

     I have this table that holds records like posts. Each post (row) has the time it was posted, which was the value of PHP's time() when the row was inserted.

     

     I want to get some stats about the posts, like how many were posted on each day. So I'm thinking of a simple GROUP BY and counting the rows. Is it possible to group rows whose values are not identical, but fall between a value a and a value b?

     

    How would you write a query for that?

    Thanks for your time
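
     For reference, a common pattern is to convert the Unix timestamp to a calendar date and group on that (a sketch; `posts` and `postTime` are placeholder names for the table and timestamp column):

     ```sql
     -- Count how many posts were made on each calendar day.
     -- postTime holds the PHP time() value (a Unix timestamp).
     SELECT DATE(FROM_UNIXTIME(postTime)) AS postDay,
            COUNT(*) AS numPosts
     FROM posts
     GROUP BY postDay
     ORDER BY postDay;
     ```

     Grouping on the column alias works in MySQL; in other databases, repeat the DATE(FROM_UNIXTIME(...)) expression in the GROUP BY clause.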

  2. There's a submit button in your form that the user clicks to upload the file. It looks like this:

     

    <input type="submit" name="submitButton" value="upload file" />

     

    To check if the form has been submitted:

     

     if (isset($_POST['submitButton'])) {
         if (empty($_FILES['userfile']['name'])) {
             echo 'No file was selected.';
         }
     }
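
     If you also want to catch failed uploads, you can check the error code as well (a sketch; 'userfile' is assumed to be the name attribute of your file input):

     ```php
     <?php
     if (isset($_POST['submitButton'])) {
         if (empty($_FILES['userfile']['name'])) {
             echo 'No file was selected.';
         } elseif ($_FILES['userfile']['error'] !== UPLOAD_ERR_OK) {
             // The form was submitted but the upload itself failed.
             echo 'Upload failed (error code ' . $_FILES['userfile']['error'] . ').';
         } else {
             echo 'Received ' . htmlspecialchars($_FILES['userfile']['name']);
         }
     }
     ?>
     ```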
    

  3. Hi,

     

    I have a search form like this:

    <form action="" method="post" accept-charset="UTF-8" style="margin: 0;">

    <input type="text" name="search" style="width: 190px;" value="'.$_POST['search'].'" />

     

     When I type a certain non-ASCII character in it, it searches the database and apparently finds rows containing that character. But it displays the character as €  (it gets converted when the page reloads for the search).

     

     Why is this happening, and how can I prevent it?
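
     That symptom is usually a charset mismatch: the bytes are UTF-8, but somewhere they are interpreted as Latin-1. A sketch of the usual fixes (the function names are real; `$db` being your connection resource is an assumption about your code):

     ```php
     <?php
     // 1. Tell the browser the page itself is UTF-8.
     header('Content-Type: text/html; charset=utf-8');

     // 2. Tell MySQL the connection uses UTF-8 (old mysql extension).
     mysql_set_charset('utf8', $db);

     // 3. Escape the submitted value before echoing it back into the form.
     $search = htmlspecialchars($_POST['search'], ENT_QUOTES, 'UTF-8');
     echo '<input type="text" name="search" value="' . $search . '" />';
     ?>
     ```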

  4. I don't recommend this way of coding.

     

     But you can modify it so that it gives you what you need.

     I'll call the checkbox field "chkField":

     

     

     <?php
     if (!empty($form_id)) {
         $sql = "UPDATE " . $form . " SET ";
         foreach ($_POST as $key => $value) {
             if (($key != 'submit_' . $form) && ($key != 'password2') && ($key != 'applicant_id')) {
                 // Escape each value so quotes in the input can't break the query.
                 $sql .= $key . " = '" . mysql_real_escape_string($value) . "',";
             }
         }

         // Unchecked checkboxes never appear in $_POST, so clear the field explicitly.
         if (!isset($_POST['chkField']) || $_POST['chkField'] == '')
             $sql .= "chkField = '',";

         $sql = substr($sql, 0, -1); // drop the trailing comma
         $sql .= " WHERE applicant_id = " . (int)$applicant_id;
         $result = mysql_query($sql, $db) or die(mysql_error() . "<br />SQL: $sql");
     }
     ?>

  5. You haven't shown us the part where it changes the value of the field in the database.

     

    But the confirmation will be like this:

     

     if ($screenstatus == "Declined") {
         print "<br /><b>Screening: <font color=\"red\">$screenstatus</font></b>&nbsp;&nbsp;&nbsp;<input type=\"submit\" name=\"approve\" value=\"Approve\" onclick=\"return confirm('Are you sure you want to approve?');\"><hr />\n";
     }

     

     Notice that the confirmation is JavaScript, so you can't fully rely on it (it can be disabled).

     

     

  6. I assume the id field refers to another table, i.e. the id is the post id of that table?

     

    table2 = the table which contains the posts

    table1 = this table you showed us

     

     SELECT t2.*, COUNT(t1.submittedby) AS numberOfPosts FROM table2 AS t2 INNER JOIN table1 AS t1 ON t2.id = t1.id GROUP BY t1.submittedby

  7. Hi,

     

     I have this table, which records the time each member has achieved in a sport:

     

     ID_MEMBER  ID_SPORT  hisTime
     12         32        45.25
     14         51        41.52

     

     For the member page, I want to run a query to find out whether the specified member ranks 1st, 2nd, or 3rd in EACH ID_SPORT (kind of like checking whether he has any medals).

     

     It's possible that a member has two or more times per ID_SPORT.

     

     For example, if the member has the 1st and 2nd fastest times in an ID_SPORT, he is counted as 1st (but not 2nd), and the 3rd fastest time counts as 2nd (if it belongs to another member).

     

     This table is going to be big: over 10,000 rows.

    How can I write the fastest query possible?

     

    Thanks for your time 
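
     One approach (a sketch; the table is assumed to be named `times`, lower time = better, and :memberId is a placeholder for the member's id): take the member's best time per sport, then count how many distinct other members have a strictly faster best time.

     ```sql
     -- Rank the member's best time in each sport: rank = 1 + number of
     -- distinct members with any strictly faster time in that sport.
     SELECT b.ID_SPORT, b.bestTime,
            (SELECT COUNT(DISTINCT o.ID_MEMBER)
               FROM times AS o
              WHERE o.ID_SPORT = b.ID_SPORT
                AND o.hisTime < b.bestTime) + 1 AS rank
     FROM (SELECT ID_SPORT, MIN(hisTime) AS bestTime
             FROM times
            WHERE ID_MEMBER = :memberId
            GROUP BY ID_SPORT) AS b;
     ```

     COUNT(DISTINCT ...) makes several faster times by the same member count once, which matches the medal rule above. An index on (ID_SPORT, hisTime) keeps the correlated subquery fast.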

     

     

  8. Your table structure is not well suited to this. However, I'll show you a query for your current layout, and then offer a better way.

     

    Your way:

    You can simply do a long where clause query:

    SELECT * FROM TABLE WHERE P1ID>0 AND P2ID>0 AND P3ID>0 AND P4ID>0 AND P5ID>0 AND P6ID>0 AND P7ID>0 AND P8ID>0 AND P9ID>0 AND P10ID>0  
    

     

     I don't know all of your table's fields, but:

    Better way:

    Make another table with 3 fields:

     

    ID_FIRST_TABLE_ROW

    P_IDs

    P_IDs_VALUE

     

     Now insert each P_ID into the second field and its value into the third, one row per P_ID.
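
     With that layout, the original "all ten P_IDs set" check becomes a count per first-table row (a sketch; `second_table` is a placeholder name):

     ```sql
     -- Rows of the first table whose ten P_IDs all have a value > 0.
     SELECT ID_FIRST_TABLE_ROW
     FROM second_table
     WHERE P_IDs_VALUE > 0
     GROUP BY ID_FIRST_TABLE_ROW
     HAVING COUNT(DISTINCT P_IDs) = 10;
     ```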

  9. Hi,

     

     In a table I have some data about members' documents, where each document's status is 1 or 0.

     I want to get each member plus how many accepted and non-accepted documents they have:

     

     SELECT
       m.ID_MEMBER, m.name,
       IFNULL(COUNT(d1.ID_DOC), 0) AS accepted,
       IFNULL(COUNT(d2.ID_DOC), 0) AS notAccepted
     FROM members AS m
     LEFT JOIN documents AS d1 ON d1.ID_MEMBER = m.ID_MEMBER AND d1.isAccepted = 1
     LEFT JOIN documents AS d2 ON d2.ID_MEMBER = m.ID_MEMBER AND d2.isAccepted = 0

     

     

     It is giving me wrong results.

     I narrowed it down to one member (with a WHERE clause) and saw it showed '6' for both accepted and notAccepted, while that member actually has 3 accepted and 2 non-accepted documents.

     

    Any idea how to fix it?
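
     The two LEFT JOINs combine: every accepted row pairs with every non-accepted row, so 3 × 2 = 6 shows up in both counts. A common fix is a single join with conditional aggregation (a sketch using the table names from the query above):

     ```sql
     SELECT m.ID_MEMBER, m.name,
            SUM(CASE WHEN d.isAccepted = 1 THEN 1 ELSE 0 END) AS accepted,
            SUM(CASE WHEN d.isAccepted = 0 THEN 1 ELSE 0 END) AS notAccepted
     FROM members AS m
     LEFT JOIN documents AS d ON d.ID_MEMBER = m.ID_MEMBER
     GROUP BY m.ID_MEMBER, m.name;
     ```

     A member with no documents still appears, with both counts at 0, because the CASE falls through to ELSE 0 for the NULL row produced by the LEFT JOIN.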

  10. Hi,

     

    Just wondering if something like this is possible:

     

    SELECT field1 FROM table1, table3.member as membersField

    INNER JOIN (SELECT member FROM table2 ON table1.ID = table2.ID) as table3 ...

     

     So that instead of getting multiple rows, one per member, it concatenates the member field into one value for each row of table1.

     

    Something like this: (For result)

     

    field1    membersField

    d1        tom, john, alex

    d2        richard, jack

     

    instead of

    d1  tom

    d1  john

    d1  alex

    d2  richard

    d2  jack

     

    Is it possible?
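
     In MySQL, yes: GROUP_CONCAT aggregates a column into one comma-separated value per group (a sketch, assuming table1.ID matches table2.ID as in the snippet above):

     ```sql
     SELECT t1.field1,
            GROUP_CONCAT(t2.member SEPARATOR ', ') AS membersField
     FROM table1 AS t1
     INNER JOIN table2 AS t2 ON t1.ID = t2.ID
     GROUP BY t1.field1;
     ```

     Note that the result is silently truncated at the group_concat_max_len server setting (1024 bytes by default), so raise it for long member lists.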
