
Klyx99

Members
  • Content Count: 28
  • Joined

  • Last visited

Community Reputation: 0 Neutral

About Klyx99

  • Rank: Member
  1. OK, here is the source (for anyone interested). I got the online games to use the player count table, thus eliminating that horrible script in the previous post LOL. Much faster. JOIN was tricky to learn, but it works perfectly here.

         $sql = "SELECT *
                 FROM onlinegames AS og
                 INNER JOIN playercount AS pc ON og.gameid = pc.gameid
                 WHERE gamename LIKE CONCAT('%', :needle, '%')
                 ORDER BY pc.playercount DESC";
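     EDIT: in case it helps anyone following along, here is roughly how I run that through PDO with the :needle placeholder bound. It uses the same $db->connect() helper as my other posts; the search term below is just an example.

         // Rough sketch: run the JOIN search with a bound placeholder.
         $stmt = $db->connect()->prepare($sql);
         $stmt->execute([':needle' => 'mahjong']);   // 'mahjong' is only an example term
         $games = $stmt->fetchAll(PDO::FETCH_ASSOC); // rows arrive sorted by playercount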
  2. OK, so JOINing works (makes my previous post and example look trashy LOL). Meh, learning process - thanks for the JOIN tip. Keep in mind, I'm old school from the '80s: QB, GB, Pascal, BASIC and VB 1-5. In the last several years I chose PHP/MySQL as my favorite language 🙂 But it's a hobby and I love learning it.
  3. In that particular script, I first load all the player counts into $onlinecounts, then loop through that array by gameid, looping through all the online games and updating the player count from the online counts. I didn't think about JOINing - let me try that. Agreed on the connect(); I am still cleaning up the code - that is on my list 🙂
  4. Well, this is goofy. I realized I need to add a player count (which comes from another feed) to the online games. I currently store the gameid and player count in another table, and the player count from that table should be copied into onlinegames.playercount. This script takes 3-5 minutes vs. the 30 seconds for the other 7 tables. Any ideas what I could do to speed this up? Normally this is not an issue, but it is during a search, as I want to sort by player count (note: player count is NOT in the online games feed, it is in a separate feed for some weird reason - I say weird because only the online games use the player count).

         $sql = "SELECT * FROM playercount";
         $stmt = $db->connect()->query($sql);
         $onlineCount = $stmt->fetchAll(PDO::FETCH_ASSOC);

         foreach ($onlineCount as $onlinegame) {
             $online_update = "UPDATE onlinegames
                               SET playercount = :playercount
                               WHERE gameid = :gameid";
             $check = $db->connect()->prepare($online_update);
             $check->execute($onlinegame);
         }
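     EDIT: a single multi-table UPDATE with a JOIN might replace that whole loop - a rough sketch, untested against my schema:

         UPDATE onlinegames AS og
         INNER JOIN playercount AS pc ON og.gameid = pc.gameid
         SET og.playercount = pc.playercount;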
  5. ok, CRON done. 7 feeds into 7 tables in 43 seconds (missing one table I will add later for Mac - can't leave them out; they're worse than Linux users when it comes to games LOL). Incredible - and here I thought I was rockin' when I got it down to 9 min!!! LOL
  6. omg I am so dumb, I forgot I could do all this in PHP, der!! LOL. I found a similar one from a few of my old sites (2010) using curl - pretty fast on the download, but bottlenecked by the old way of processing. Now I'll put it together with the LOAD command (rough sketch of the download half below). Minor change in my maintenance job.
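     EDIT: the curl half is roughly this (the feed URL and local file name are placeholders, not the real ones):

         // Rough sketch: fetch the XML feed with curl and save it for the LOAD step.
         $ch = curl_init('https://example.com/feeds/games.xml'); // placeholder URL
         curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
         curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
         $xml = curl_exec($ch);
         curl_close($ch);

         if ($xml !== false) {
             file_put_contents('/path/to/games.xml', $xml); // placeholder path
         }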
  7. Ya, I think that is the way to go. With that speed I can now do all 9+ RSS feeds that pertain to the overall collection. My test showed about 5 min for 8 tables, where before it was around 15 min tops. HUGE diff; when I put it on my server (a little beefier) I am expecting all the tables to process in around 1-3 min tops. That was a very nice tip! Currently making a batch file to run it all (except I'm having issues with wget grabbing the XML file, but that is for a different thread - I'll figure it out LOL).
  8. yes, that is exactly what I did. Very surprised at how fast that processed it. I'm at work, so I'm using the old tried-and-true ThinkPad W530 (20 GB RAM) LOL, and both times it took roughly 9-15 seconds. So, thinking outside the box: instead of using cron.php to run a long and grueling import, I'll write a batch file (will post it here for others) to process all the data. Thanks for that tip!! Was extremely helpful!!! Not sure if there is a SOLVED button, but if not, I'll edit the title.
  9. WELL, I just tried the LOAD command - and BAM, 9 seconds on my OLD THINKPAD!!!! Think that might be the way to go! EDIT: forgot to mention I did that at the cmd prompt, but that would be an easy batch file to make and run at midnight.
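     EDIT 2: for anyone curious, MySQL's LOAD XML can pull the records straight out of the feed by tag - something along these lines (untested as written; the file path and table name are placeholders for my setup):

         LOAD XML LOCAL INFILE '/path/to/games.xml'
         INTO TABLE games
         ROWS IDENTIFIED BY '<game>';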
  10. Been an affiliate for 9 years; the last few years I've been tinkering with creating my own site vs. the ugly stock one they have. Great learning process.
  11. sure! Here you go. There are about 5,000 more of them, but I felt one was enough LOL

          <gamexml>
            <title>Big Fish Games - Game Archive</title>
            <description>A New Game Every Day</description>
            <copyright>© 2018, Big Fish Games, Inc.</copyright>
            <pubdate>Fri, 16 Nov 2018 05:15:32 -0800</pubdate>
            <game>
              <gameid>17915</gameid>
              <gamename>1 Moment of Time: Silentville</gamename>
              <family>1 Moment of Time: Silentville</family>
              <familyid>7237</familyid>
              <productid>13231</productid>
              <genreid name="Puzzle">4</genreid>
              <allgenreid>4,15,21,27,119,123,124,125,126,143,147</allgenreid>
              <shortdesc>Uncover a mystery and save a town from a curse.</shortdesc>
              <meddesc>Once upon a time in the small, undistinguished town of Silentville people started disappearing. Help uncover the mystery behind these disappearances and save the town from a curse!</meddesc>
              <longdesc>Uncover a mystery and save a town from a curse in 1 Moment Of Time: Silentville! Once upon a time in the small, undistinguished town of Silentville people started disappearing - one by one. Even the visitors suddenly disappeared without a trace. Airplanes and birds, trains and pedestrians, drivers, tourists and even bicyclers - all of them have vanished! Will you be able to solve the mystery and release Silentville from its cursed non-existence?</longdesc>
              <bullet1>Beautiful graphics</bullet1>
              <bullet2>Original mini-games</bullet2>
              <bullet3>Intriguing story</bullet3>
              <bullet4>Check out our Blog Walkthrough</bullet4>
              <foldername>en_1-moment-of-time-silentville</foldername>
              <price>$ 6.99 USD</price>
              <hasdownload>yes</hasdownload>
              <macgameid>27210</macgameid>
              <hasvideo>yes</hasvideo>
              <hasflash>no</hasflash>
              <hasdwfeature>no</hasdwfeature>
              <gamerank>2401</gamerank>
              <releasedate>2012-06-17 00:00:00</releasedate>
              <gamesize>528691151</gamesize>
              <systemreq>
                <pc>
                  <sysreqos>Windows XP/Windows Vista/Windows 7/8</sysreqos>
                  <sysreqmhz>1.8 GHz</sysreqmhz>
                  <sysreqmem>512</sysreqmem>
                  <sysreqdx>8.1</sysreqdx>
                  <sysreqhd>518</sysreqhd>
                </pc>
              </systemreq>
            </game>
  12. I'm still in the process of learning PDO, so my question then would be: the reason I have the repeated preparing is because the XML is dynamic. Will preparing ONCE account for the changes? Can you give an example of the repeated code and how you would improve it? That code posted is very old, and since it works, I have never changed it. I use a combination of two XML functions to read the file, since it is fairly large:

          $xml = new XMLReader();
          $xml->open($xml_file);  // $xml_file is the XML file you want to parse
          while ($xml->read() && $xml->name != 'game') {
              ; // skip past the header to the first record ('game' in my case)
          }

      Unless you are referring to this section: I just loop through the fields and create the prepared statement. If this is the section you mean - if I put this outside of the loop, will the execute command function as needed? Maybe this is my bottleneck. I am still new to prepares and how they respond to an execute command. But if this is the bottleneck and the execute will work fine, I could grab the field names and build this string OUTSIDE of the while loop and then run it; might that speed it up?

          /* This looks confusing, but it is not. There are over 20 fields, and
             instead of typing them all out, I just build the string. */
          $sql = "INSERT INTO $table (";
          foreach ($gameRec as $field => $game) {
              $sql .= " $field,";
          }
          $sql = rtrim($sql, ",");
          $sql .= ") VALUES (";
          foreach ($gameRec as $field => $game) {
              $sql .= " :$field,";
          }
          $sql = rtrim($sql, ",");
          $sql .= ") ON DUPLICATE KEY UPDATE ";

          // The online games feed doesn't have a gamerank - not my choice LOL - so I adjust for that here.
          switch ($os) {
              case 'pc':
                  $sql .= "gamerank = " . $gameRec['gamerank'];
                  break;
              case 'mac':
                  $sql .= "gamerank = " . $gameRec['gamerank'];
                  break;
              case 'pl':
                  $sql .= "playercount = " . $gameRec['playercount'];
                  break;
              case 'og':
                  $playercount = $this->getPlayerCount($gameRec['gameid']);
                  $sql .= "playercount = " . $playercount['playercount'];
                  break;
          }

      Also, MY POST is under JAKE STONE. I am not the OP there, just in case you read a different post. Thanks for checking that out.
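      EDIT: here is a rough sketch of what I mean by preparing ONCE outside the loop (untested). It assumes every <game> record has the same field set; nextGame() is a made-up stand-in for my XMLReader loop, and the ON DUPLICATE KEY clause only shows the gamerank case, using MySQL's VALUES() so nothing gets concatenated into the SQL:

          // Rough sketch, untested. Assumes each record has the same fields.
          $fields = array_keys($gameRec);              // names from the first record
          $cols   = implode(", ", $fields);
          $params = ":" . implode(", :", $fields);

          // VALUES(gamerank) reuses the bound value instead of concatenating it in.
          $sql = "INSERT INTO $table ($cols) VALUES ($params)
                  ON DUPLICATE KEY UPDATE gamerank = VALUES(gamerank)";

          $stmt = $db->connect()->prepare($sql);       // prepared once, outside the loop

          do {
              $stmt->execute($gameRec);                // only the execute repeats
          } while ($gameRec = $this->nextGame($xml));  // nextGame() = hypothetical XMLReader helper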
  13. hmm, I might write a quick script to test that - the XML file is up to 5k records. The bottleneck comes in when migrating the data to SQL; that's why I just do the following to import it. Fairly fast, but not as fast as if I were reading it from the XML straight to the page. I posted that yesterday for someone looking for a way to just import XML to SQL. What I might tinker with is a way to pull a little and cache it, but that could be problematic with a high level of users. What do you think?