Everything posted by Barand

  1. Like this?

     SELECT id as ids
          , job_number
          , line_item
     FROM production_data
     WHERE line_item = ''
     UNION
     SELECT GROUP_CONCAT(id separator ', ') as ids
          , job_number
          , line_item
     FROM production_data
     GROUP BY job_number, line_item
     HAVING COUNT(*) > 1;

     [edit] If you want the ids in their own rows, then

     SELECT id
          , job_number
          , line_item
     FROM production_data
     WHERE line_item = ''
     UNION
     SELECT p.id
          , dupes.job_number
          , dupes.line_item
     FROM production_data p
          JOIN (
                 SELECT job_number
                      , line_item
                 FROM production_data
                 GROUP BY job_number, line_item
                 HAVING COUNT(*) > 1
               ) dupes USING (job_number, line_item);
  2. Add them at the end of the loop so you add the data each time, not just the last set after the loop finishes
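     A minimal sketch of the idea (the data and field names here are made up, since the original loop isn't shown):

     <?php
     // Sketch only - made-up data and field names.
     $source_rows = [
         ['qty' => 2, 'price' => 3.50],
         ['qty' => 1, 'price' => 9.99],
     ];

     $all_rows = [];
     foreach ($source_rows as $row) {
         $row['total'] = $row['qty'] * $row['price'];   // per-row processing
         $all_rows[] = $row;                            // append at the end of the loop body
     }
     // $all_rows now holds every processed row, not just the last one.
     print_r($all_rows);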
  3. Wow! So many "Deadly Sins" in a single block of code.
     - Thou shouldst not use global
     - Thou shouldst not run queries inside loops
     - Thou shouldst not split data of the same entity type across multiple tables
     - Thou shouldst not use "SELECT *"
     Why don't you use fetch_assoc() instead of fetch_row()? Then the loop becomes simply

     while ($rowtime = $result_time->fetch_assoc()) {
         $temparray[] = $rowtime;
     }
  4. I guess it's incomplete documentation. DateTime is full of surprises. One I found to be useful is setting a DateInterval to "next weekday" to get a list of N working days

     $dt = new DateTime('next monday');
     $di = DateInterval::createFromDateString('next weekday');
     $dp = new DatePeriod($dt, $di, 9);
     foreach ($dp as $d) {
         echo $d->format('D M jS').'<br>';
     }

     gives

     Mon Dec 6th
     Tue Dec 7th
     Wed Dec 8th
     Thu Dec 9th
     Fri Dec 10th
     Mon Dec 13th
     Tue Dec 14th
     Wed Dec 15th
     Thu Dec 16th
     Fri Dec 17th
  5. There is a compromise solution using file() and that is to use str_getcsv() instead of explode().

     $data = [];
     $data_list = '';
     $heads = ['Number', 'Name', 'Type', 'List number'];

     $array = imp_open('Medikamente.csv');
     foreach ($array as $line) {
         $rec = str_getcsv($line);
         $data[] = array_combine($heads, $rec);
     }

     foreach ($data as $rec) {
         foreach ($rec as $k => $v) {
             $data_list .= "<label>$k</label> $v<br>";
         }
         $data_list .= "<br>\n";
     }

     function imp_open($pfad) {
         // read the data and store it
         $content = file($pfad, FILE_IGNORE_NEW_LINES|FILE_SKIP_EMPTY_LINES);
         return $content;
     }
  6. Yes, you could, but your explode above isn't giving the array that you need. You would still have to
     - loop through your array lines and remove the newline from the end of each line
     - explode each individual line
     - trim off the quotes

     $data = [];
     $data_list = '';
     $heads = ['Number', 'Name', 'Type', 'List number'];

     $array = imp_open('Medikamente.csv');
     foreach ($array as $line) {
         $rec = explode(',', $line);
         $rec = array_map(function($v) { return trim($v, ' "'); }, $rec);
         $data[] = array_combine($heads, $rec);
     }

     foreach ($data as $rec) {
         foreach ($rec as $k => $v) {
             $data_list .= "<label>$k</label> $v<br>";
         }
         $data_list .= "<br>\n";
     }

     function imp_open($pfad) {
         // read the data and store it
         $content = file($pfad, FILE_IGNORE_NEW_LINES|FILE_SKIP_EMPTY_LINES);
         return $content;
     }

     When PHP provides a function specifically for handling CSV data that does all that for you, why not use it?
  7. Then your csv data is not as you posted, with 4 items of data in each row. I used the data you gave

     Medikamente.csv...

     9999900001,"zebra","Noropr","159"
     9999900002,"coco1","Noropr","999998"
     9999900003,"coco12","Noropr","78"
     99999000099999,"coco1123","Noropr","33"
     9999900005,"coco198","Noropr","79"
     9999900006,"coco111","Noropr","66"
     9999900007,"coco1456","NoroprNoropr","2999996"
     9999900008,"coco1sss","Salbe","55"
     9999900009,"coco90","Salbe","90"
     9999900010,"coco111111","Tabletten","102"
     9999900011,"coco178989","Noropr","999998"
     9999900012,"coco18283838383","Noropr","59"
     9999900013,"coco17874738774","Tabletten","899999"
     99999000199999,"Tannosynt","Salbe","71"
     9999900015,"Vomex A","Noropr","699999"
     9999900016,"Vomex A","Noropr","35"

     ... and my results were etc.
  8. try

     <?php
     $data = [];
     $data_list = '';
     $heads = ['Number', 'Name', 'Type', 'List number'];

     $csv = fopen('Medikamente.csv', 'r');
     while ($line = fgetcsv($csv)) {
         $data[] = array_combine($heads, $line);
     }
     fclose($csv);

     foreach ($data as $rec) {
         foreach ($rec as $k => $v) {
             $data_list .= "<label>$k</label> $v<br>";
         }
         $data_list .= "<br>\n";
     }
     ?>
     <!DOCTYPE html>
     <html lang="en">
     <head>
     <title>Test</title>
     <meta charset="utf-8">
     <style type='text/css'>
         label {
             display: inline-block;
             width: 120px;
             background-color: #E0E0E0;
             color: black;
             padding: 8px;
             border: 1px solid white;
         }
     </style>
     </head>
     <body>
         <?= $data_list ?>
     </body>
     </html>
  9. You could try something like this example

     <?php
     if (isset($_GET['ajax'])) {
         if (rand(0,1)) {
             exit("Bar Code Incorrect");
         } else {
             exit("Everything OK");
         }
     }
     ?>
     <!DOCTYPE html>
     <html lang="en">
     <head>
     <title>Test</title>
     <meta charset="utf-8">
     <script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
     <script type='text/javascript'>
         $().ready( function() {
             $("#test").click( function() {
                 $.get(
                     "",
                     {"ajax":1},
                     function(resp) {
                         if (resp=="Bar Code Incorrect") {
                             let err = $("<div>", {"id":"error", "class":"alert alert-warning", "html":resp})
                             $("#output").html(err)
                             $("#error").fadeOut(4000)
                         }
                         else {
                             $("#output").html(resp)
                         }
                     }
                 )
             })
         })
     </script>
     <style type='text/css'>
         .alert {
             background-color: red;
             color: white;
             padding: 16px;
         }
     </style>
     </head>
     <body>
         <button id='test'>Test</button>
         <div id='output'></div>
     </body>
     </html>
  10. Before you go any further you should normalize those skills and not have a comma-separated list. EG

      +--------------+                              +--------------+
      | person       |                              | skill        |
      +--------------+                              +--------------+
      | person_id PK |--+                        +--| skill_id PK  |
      | name         |  |                        |  | skill_name   |
      | etc          |  |                        |  +--------------+
      +--------------+  |                        |
                        |                        |
                        |  +-----------------+   |
                        |  | person_skill    |   |
                        |  +-----------------+   |
                        +-<| person_id PK    |   |
                           | skill_id PK     |>--+
                           | date_achieved   |
                           +-----------------+

      In the SQL tutorial in my signature there is a similar example application which uses pupils/subjects rather than person/skills. (A sketch of the corresponding table definitions follows below.)
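      A possible set of CREATE TABLE statements for that design (a sketch only; column types and sizes are assumptions):

      -- Sketch: names follow the diagram above, types are assumed.
      CREATE TABLE person (
          person_id INT AUTO_INCREMENT PRIMARY KEY,
          name      VARCHAR(50)
      );

      CREATE TABLE skill (
          skill_id   INT AUTO_INCREMENT PRIMARY KEY,
          skill_name VARCHAR(50)
      );

      CREATE TABLE person_skill (
          person_id     INT,
          skill_id      INT,
          date_achieved DATE,
          PRIMARY KEY (person_id, skill_id),
          FOREIGN KEY (person_id) REFERENCES person (person_id),
          FOREIGN KEY (skill_id)  REFERENCES skill (skill_id)
      );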
  11. Did you mean something like this?

      $data = [ 'Hot'  => [ 'A' => 5,  'B' => 20, 'C' => 15, 'D' => 10, 'E' => 8 ],
                'Dead' => [ 'A' => 15, 'B' => 30, 'F' => 55, 'C' => 40, 'G' => 60 ]
              ];

      echo '<pre> BEFORE ' . print_r($data, 1) . '</pre>';    // original array

      //
      // GET ALL KEYS
      //
      $keys = [];
      foreach ($data as $k1 => $v1) {
          $keys = array_merge($keys, array_keys($v1));
      }
      $keys = array_unique($keys);
      $blank_values = array_fill_keys($keys, null);

      //
      // INSERT MISSING KEYS INTO THE SUBARRAYS
      //
      foreach ($data as $k => &$subarray) {
          $subarray = array_merge($blank_values, $subarray);
      }

      echo '<pre> AFTER ' . print_r($data, 1) . '</pre>';     // array after processing

      which gives...

      BEFORE Array
      (
          [Hot] => Array
              (
                  [A] => 5
                  [B] => 20
                  [C] => 15
                  [D] => 10
                  [E] => 8
              )

          [Dead] => Array
              (
                  [A] => 15
                  [B] => 30
                  [F] => 55
                  [C] => 40
                  [G] => 60
              )
      )

      AFTER Array
      (
          [Hot] => Array
              (
                  [A] => 5
                  [B] => 20
                  [C] => 15
                  [D] => 10
                  [E] => 8
                  [F] =>
                  [G] =>
              )

          [Dead] => Array
              (
                  [A] => 15
                  [B] => 30
                  [C] => 40
                  [D] =>
                  [E] =>
                  [F] => 55
                  [G] => 60
              )
      )
  12. https://www.php.net/manual/en/features.file-upload.post-method.php
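      Roughly, the pattern that manual page describes, as a minimal sketch (the field name "userfile", the script name and the target directory are assumptions):

      <!-- upload.html : minimal form -->
      <form action="upload.php" method="post" enctype="multipart/form-data">
          <input type="file" name="userfile">
          <button type="submit">Upload</button>
      </form>

      <?php
      // upload.php : sketch of the receiving script
      if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
          $target = __DIR__ . '/uploads/' . basename($_FILES['userfile']['name']);
          move_uploaded_file($_FILES['userfile']['tmp_name'], $target);
      }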
  13. Then move all the processing to the same place - one or the other
  14. Erm! @gizmola Note that in the English speaking parts of the world we have "speciality" and not "specialty" 😀
  15. You will need to add GROUP BY sch.id
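      A stripped-down sketch of why (identifiers borrowed from the fuller query further down; the join condition here is an assumption): aggregates such as GROUP_CONCAT() collapse rows, so each schedule row you want kept separate needs its own group.

      -- Minimal sketch only: without GROUP BY, GROUP_CONCAT() collapses
      -- everything into one row; grouping on sch.id gives one row per game.
      SELECT sch.id
           , GROUP_CONCAT(p.namefirst SEPARATOR ', ') AS players
      FROM schedule sch
           JOIN player p ON p.schoolid = sch.home_id   -- assumed join condition
      GROUP BY sch.id;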
  16. Where does $row get its values?
  17. Perhaps something like this (my identifiers do not exactly match yours)

      SELECT h.schoolname as home
           , a.schoolname as away
           , sch.preview
           , GROUP_CONCAT(DISTINCT h.name, ', ', h.ht, ' ', h.pos, '; ', h.grade
                          SEPARATOR ' / '
                         ) as home_team
           , GROUP_CONCAT(DISTINCT a.name, ', ', a.ht, ' ', a.pos, '; ', a.grade
                          SEPARATOR ' / '
                         ) as away_team
      FROM schedule sch
           JOIN (
                  SELECT s.schoolname
                       , concat(p.namefirst, ' ', p.namelast) as name
                       , concat(p.feet, '\'', p.inches, '"') as ht
                       , pos.description as pos
                       , g.description as grade
                       , s.school_id
                  FROM school s
                       JOIN player p ON s.school_id = p.schoolid
                       JOIN grade g USING (grade)
                       JOIN position pos USING (position)
                ) h ON sch.home_id = h.school_id
           JOIN (
                  SELECT s.schoolname
                       , concat(p.namefirst, ' ', p.namelast) as name
                       , concat(p.feet, '\'', p.inches, '"') as ht
                       , pos.description as pos
                       , g.description as grade
                       , s.school_id
                  FROM school s
                       JOIN player p ON s.school_id = p.schoolid
                       JOIN grade g USING (grade)
                       JOIN position pos USING (position)
                ) a ON sch.away_id = a.school_id

      PS, output

      home:       School 10
      away:       School 11
      preview:    Game on!
      home_team:  Andrew Baker, 6'2" Point Guard; JR / Bernard Cook, 6'5" Shooting Guard; JR / Charles Draper, 6'1" Small Forward; JR / David Potter, 6'4" Power Forward; JR / Eric Fletcher, 6'0" Centre; JR
      away_team:  Frank Gardener, 5'11" Point Guard; JR / Graham Hatter, 6'6" Shooting Guard; JR / Harry Joiner, 6'7" Small Forward; JR / Ian Miller, 6'4" Power Forward; JR / Julie Archer, 5'6" Centre; JR
  18. If it's an array, you need print_r(), not echo
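      For example (the contents of $row here are made up):

      <?php
      $row = ['id' => 1, 'name' => 'Alice'];          // made-up example data

      echo $row;                                      // outputs just "Array" (plus a notice)
      echo '<pre>' . print_r($row, 1) . '</pre>';     // outputs the keys and values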
  19. With the two tables I suggested included in your query with joins it would be job done.
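      A sketch of what those joins look like (identifiers borrowed from the query in item 17 above; they may not match the real schema):

      -- Sketch: pulling the lookup descriptions into the player rows.
      SELECT concat(p.namefirst, ' ', p.namelast) AS name
           , pos.description AS pos
           , g.description AS grade
      FROM player p
           JOIN position pos USING (position)
           JOIN grade g USING (grade);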
  20. Is there any relationship between a player's age and grade (Senior/Junior)?
  21. I'm guessing that queries with expressions like that above, which contain grade references, will have to be edited every year. If that's true, you really should be looking for another way.
  22. Do you have tables for position and grade?

      +-----------+-------------------+      +-----------+-------------------+
      | position  | description       |      | grade     | description       |
      +-----------+-------------------+      +-----------+-------------------+
      | 1         | Point guard       |      | 22        | Senior            |
      | 2         | shooting guard    |      | 23        | Junior            |
      | etc       |                   |      | etc       |                   |
      +-----------+-------------------+      +-----------+-------------------+
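      If not, a minimal sketch of creating and seeding them (column types and sizes are assumptions):

      -- Sketch only: names follow the diagram above, types are assumed.
      CREATE TABLE position (
          position    INT PRIMARY KEY,
          description VARCHAR(30)
      );

      CREATE TABLE grade (
          grade       INT PRIMARY KEY,
          description VARCHAR(30)
      );

      INSERT INTO position (position, description) VALUES
          (1, 'Point guard'),
          (2, 'Shooting guard');

      INSERT INTO grade (grade, description) VALUES
          (22, 'Senior'),
          (23, 'Junior');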
  23. That line is wrong. It should be either

      <img class="d-block w-100" src="amin/thumb/<?php echo $row['imgpath']; ?>" alt="First slide">

      // or

      <img class="d-block w-100" src="amin/thumb/<?= $row['imgpath']; ?>" alt="First slide">
  24. Plan B...

      echo '<pre>' . print_r($line_item->price->metadata->keys(), 1) . '</pre>';
      echo '<pre>' . print_r($line_item->price->metadata->values(), 1) . '</pre>';