Barand

Moderators

  • Posts: 24,563
  • Joined
  • Last visited
  • Days Won: 822
Everything posted by Barand

  1. Do you mean

     $a = array(
         'AA_BB_CC' => 1,
         'AA_BB_DD' => 2,
         'AA_BB_EE' => 3,
         'AA_CC_DD' => 4,
         'AA_CC_EE' => 5,
         'AA_DD_FF' => 6
     );
     $b = array();
     foreach ($a as $k => $v) {
         $keys = explode('_', $k);
         $b[$keys[0]][$keys[1]][$keys[2]] = $v;
     }
     echo '<pre>', print_r($b, true), '</pre>';

     giving:

     Array
     (
         [AA] => Array
             (
                 [BB] => Array
                     (
                         [CC] => 1
                         [DD] => 2
                         [EE] => 3
                     )
                 [CC] => Array
                     (
                         [DD] => 4
                         [EE] => 5
                     )
                 [DD] => Array
                     (
                         [FF] => 6
                     )
             )
     )
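     To read a value back out of the nested result, index each level in turn. A quick sketch using the same key-splitting loop (the lookup at the end is my addition, not from the post):

     ```php
     <?php
     // Same technique as the post: split each key on '_' and nest.
     $a = array('AA_BB_CC' => 1, 'AA_BB_DD' => 2, 'AA_DD_FF' => 6);
     $b = array();
     foreach ($a as $k => $v) {
         $keys = explode('_', $k);
         $b[$keys[0]][$keys[1]][$keys[2]] = $v;
     }

     // Index each level in turn to read a value back out.
     echo $b['AA']['BB']['DD']; // 2
     ```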
  2. If you only fetch one result then that is all that will be displayed. Use a while loop to loop through the returned records:

     $today = date("m/d");
     $res = mysql_query("SELECT * FROM members WHERE dob LIKE '$today%'");
     while ($rw = mysql_fetch_array($res)) {
         echo "{$rw['dob']}<br/>{$rw['name']}<br/>$today<br/><br/>";
     }
  3. If your date is stored as 12/17/2014, why would you expect 12-17 to find a match? Try finding those that begin with "12/17":

     $today = date("m/d");
     $result = mysql_query("SELECT * FROM members WHERE dob LIKE '$today%'");

     Dates should be stored in yyyy-mm-dd format in DATE-type fields for maximum functionality; you cannot even sort dates like yours correctly.
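     To follow that advice, a US-style string can be converted to ISO format before it is stored. A minimal sketch (the helper name toIsoDate is mine, not from the thread):

     ```php
     <?php
     // Minimal sketch: convert an m/d/Y string to the Y-m-d format
     // a MySQL DATE column expects. toIsoDate() is a hypothetical
     // helper name, not something from the thread.
     function toIsoDate(string $usDate): string
     {
         $dt = DateTime::createFromFormat('m/d/Y', $usDate);
         return $dt->format('Y-m-d');
     }

     echo toIsoDate('12/17/2014'); // 2014-12-17
     ```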
  4. Unix timestamps number from 1970-01-01. Let's try days instead of seconds:

     SELECT FROM_DAYS(AVG(TO_DAYS(playerDOB))) as dob
  5. Yes, it was luck. I added a load more test dates and it failed. Try

     SELECT DATE(FROM_UNIXTIME(AVG(UNIX_TIMESTAMP(playerDOB)))) as dob
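     The same averaging can be sketched in PHP by averaging Unix timestamps, mirroring the AVG(UNIX_TIMESTAMP(...)) idea; averageDate() is a hypothetical helper, not from the thread:

     ```php
     <?php
     date_default_timezone_set('UTC');

     // Hypothetical helper: average dates via their Unix timestamps,
     // the same idea as DATE(FROM_UNIXTIME(AVG(UNIX_TIMESTAMP(col)))).
     function averageDate(array $dates): string
     {
         $stamps = array_map('strtotime', $dates);
         $avg = (int) round(array_sum($stamps) / count($stamps));
         return date('Y-m-d', $avg);
     }

     echo averageDate(['2014-05-22', '2014-07-24']); // 2014-06-22
     ```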
  6. Use LEFT JOINs. Where there is no match you get null values. E.g.

     SELECT table1.order_no as t1
          , table2.order_no as t2
          , table3.order_no as t3
     FROM table1
          LEFT JOIN table2 ON table1.order_no = table2.order_no
          LEFT JOIN table3 ON table1.order_no = table3.order_no

     +----+------+------+
     | t1 | t2   | t3   |
     +----+------+------+
     |  1 |    1 |    1 |
     |  2 | NULL |    2 |
     |  3 |    3 | NULL |
     +----+------+------+
  7. Well, the definition of test data is "that data for which the program works". Perhaps I was just lucky. Can you attach a dump of your player table?
  8. Did some experimenting and, amazingly, this seems to work:

     SELECT * FROM testtime;
     +----+------------+
     | id | start      |
     +----+------------+
     |  1 | 2014-05-22 |
     |  2 | 2014-07-24 |
     +----+------------+

     SELECT CAST(ROUND(AVG(start),0) as DATE) as average FROM testtime;
     +------------+
     | average    |
     +------------+
     | 2014-06-23 |
     +------------+
  9. If those values are in the database you can do it in the query:

     SELECT start
          , type
          , start + INTERVAL type MINUTE as end
     FROM testtime;

     +---------------------+------+---------------------+
     | start               | type | end                 |
     +---------------------+------+---------------------+
     | 2014-05-22 09:16:24 |   30 | 2014-05-22 09:46:24 |
     | 2014-05-22 09:56:24 |   20 | 2014-05-22 10:16:24 |
     +---------------------+------+---------------------+
  10. Also, you only need two connections if the databases are on different servers (as a connection is to a server).
  11. You appear to have a structure like this:

      +--------------+                           +--------------+
      |    player    |                           |    match     |
      +--------------+                           +--------------+
             | playerid                      matchid |
             |                                       |
             |       +-------------------+           |
             +------<|    appearances    |>----------+
             |       +-------------------+           |
             |                                       |
             |       +-------------------+           |
             +------<|   substitutions   |>----------+
                     +-------------------+

      So you're going to need all the players (from appearances and substitutions) that were involved in the match. This will require a UNION and not a JOIN. (You could consider combining these tables into one table with an extra column denoting Appearance or Substitution.)

      SELECT P.PlayerDOB AS dob,
             CONCAT(P.PlayerFirstName, ' ', P.PlayerLastName) AS name,
             P.PlayerID AS id
      FROM tplss_players P
           INNER JOIN (
               SELECT AppearancePlayerID as PlayerID
                    , AppearanceMatchID as MatchID
               FROM tplss_appearances
               UNION
               SELECT SubstitutionPlayerID as PlayerID
                    , SubstitutionMatchID as MatchID
               FROM tplss_substitutions
           ) as total USING (PlayerID)
           INNER JOIN tplss_matches M USING (MatchID)
      WHERE M.MatchDateTime = '$matchdate'
      ORDER BY dob ASC
      LIMIT 0,1
  12. Cut and paste. EDIT: Here's the spoonfeeding answer (the $wvb_number line is moved up to before it is first used):

      //Split a filename by .
      $filenames = explode(".", $r->swishdocpath);

      $wvb_number = substr($filenames[1],1,12);    // <-- MOVED UP
      //get 3 chars from $filenames to $country
      $country = substr($filenames[1],1,3);
      echo 'Country: '.$companies[$wvb_number]['country']."<br />";
      //echo 'Country Name: '.$country."<br />";

      $year = substr($filenames[2],0,4);
      echo 'Year: '.$year."<br />";

      echo 'WVB Number: '.$wvb_number."<br />";
      echo 'Company Name: '.$companies[$wvb_number]['company'];
  13. You need to get the wvbnumber before you use it to access the country name
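      The ordering point can be shown with a runnable fragment; the sample path and $companies data below are hypothetical stand-ins for the thread's real values:

      ```php
      <?php
      // Hypothetical sample data standing in for the thread's real values.
      $companies = ['USA123456789' => ['company' => 'Acme Corp', 'country' => 'USA']];
      $swishdocpath = 'docs.xUSA123456789.2014'; // invented filename layout

      $filenames = explode('.', $swishdocpath);

      // Extract the WVB number FIRST, then use it for the lookups.
      $wvb_number = substr($filenames[1], 1, 12);
      echo 'Country: ' . $companies[$wvb_number]['country'] . "<br />";
      echo 'Company Name: ' . $companies[$wvb_number]['company'];
      ```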
  14. The data model makes this assumption: a section is part of a class containing a group of pupils who all take the same subjects and therefore have a common timetable schedule. The principal table in the model is period, which is a part of a day where a subject is taught by a teacher to a section (group of pupils) in a room. This is the main table that will be queried to produce your timetable (with joins to the other tables to collect subject names, teacher names etc). The advantage of such a model is that you can now do far more than just timetables, e.g.
      • which pupils are in a class
      • which subjects does each pupil take
      • which pupils take a particular subject
      • which teachers teach a particular subject
      • which pupils does a teacher teach
      • when is a room available
      • where is a pupil or teacher at a particular day/time
      • which are the most/least popular subjects
      • etc.
  15. $file = '/var/www/html/active_colist.csv';
      $fh = fopen($file, 'r');
      $companies = array();
      $row = fgetcsv($fh, 1024);      // ignore header
      while ($row = fgetcsv($fh, 1024)) {
          $companies[$row[0]] = array('company' => $row[1], 'country' => $row[3]);   // CHANGED LINE
      }
      fclose($fh);

      Then, to display the company it will now be

      echo 'Company: ' . $companies[$wvbnumber]['company'];

      and for the country

      echo 'Country: ' . $companies[$wvbnumber]['country'];
  16. Your db table design is more like a spreadsheet than a database table. You should also learn about "data normalization" so you can correctly design your db tables. What is "section" in your data? Is it a section of a class?
  17. $minprice = 5000;     // would come from user input
      $maxprice = 10000;
      $sql = "SELECT DISTINCT Prop.PropertyId
                   , Title
                   , ImageUrl
                   , Location
                   , Bedrooms
                   , Bathrooms
                   , Parking
                   , Price
              FROM PROPERTIES as Prop
                   LEFT JOIN IMAGES as Img ON Img.PropertyId = Prop.PropertyId
              WHERE Price BETWEEN $minprice AND $maxprice";
  18. http://www.php.net/manual/en/features.file-upload.multiple.php
  19. True. Begs the question: what is the code, and what is its relationship to the two entities?
  20. Given those sample questions I would say the code belongs in the question table

      adventure_question
      ------------------
      q_id (PK)
      q_code
      q_number
      question

      and

      users_adventures
      ----------------
      ua_id (PK)
      ua_username
      q_id (FK)
      ua_score
  21. Of course you can. Just don't forget to include a purchase order and billing address. If you keep it in the forums, the advice is free.
  22. echo 'WVB Number: '.$wvbnumber."<br />";            // <-- add
      echo 'Company Name: ' . $companies[$wvb_number];    // <-- change

      Also check the $companies array was created OK:

      echo "<pre>", print_r($companies, 1), "</pre>";

      If it's empty it could be an error in the file path.
  23. Storing the csv data would be done only once so that can go anywhere before you need the data. Echoing the company name would go, well, where you want to echo the company name.
  24. You could store the csv data in an array

      $file = '/var/www/html/active_colist.csv';
      $fh = fopen($file, 'r');
      $companies = array();
      $row = fgetcsv($fh, 1024);      // ignore header
      while ($row = fgetcsv($fh, 1024)) {
          $companies[$row[0]] = $row[1];
      }
      fclose($fh);

      then, when you have the $wvb_number

      echo $companies[$wvb_number];
  25. Must get a new battery for my crystal ball.

      ORDER BY upvotes - downvotes
