MySQL_Narb

Members, 250 posts

Everything posted by MySQL_Narb

  1. Sorry, somehow the HTML I copied in must have gotten cut off. Even when there is a closing tag, it does not get a match. I'm actually doing this in JavaScript, and as far as I know it has no built-in HTML parser like PHP does. This was the only regex-specific section I saw, so I just posted it here. And I'll look for the "s" modifier, but I thought that was basically the same as "m".
  2. I need a regex to capture everything in between the body tags, but it doesn't seem to work: http://regexr.com/38n6r Any ideas?
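For what it's worth, a hedged sketch of that capture in PHP (the sample HTML is illustrative, not from the thread); the `s` modifier is what lets the match cross line breaks:

```php
<?php
// The i modifier ignores case; the s modifier lets "." match newlines,
// so the lazy group (.*?) can span a multi-line body.
$html = "<html><head><title>t</title></head>\n<body>\n<p>Hello</p>\n</body>\n</html>";

if (preg_match('~<body[^>]*>(.*?)</body>~is', $html, $m)) {
    $inner = trim($m[1]);
}
echo $inner;
```

In JavaScript, where older engines lack the `s` flag, `/<body[^>]*>([\s\S]*?)<\/body>/i` does the same job. For anything beyond a quick scrape, an HTML parser (PHP's DOMDocument, or the browser's own DOM in JavaScript) is more reliable than a regex.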
  3. Here is bossScenario:

     var bossScenario = { 'active': false, 'scenario_ID': 0, 'boss': {} };

     Here is a list of scenarios:

     var scenarios = {
         1: {
             'func': function(bossDifficulty){
                 /*var me = bossScenario['boss'][1];
                 var vars = me['vars'];
                 vars[rbSoldiers] = bossDifficulty*100000;
                 vars[reward] = bossDifficulty;*/
                 alert('test');
                 //randBossBattle(vars);
             },
             'vars': { rbSoldiers: 0, reward: 0 }
         }
     };

     In bossScenario, I will assign a scenario to the 'boss' property, and then I will run the code below:

     bossScenario['boss'][sID]['func'](bossDifficulty);

     The above code successfully alerts "test"; however, the next time I go to run the function, I get this error in the console:

     Uncaught TypeError: Object #<Object> has no method 'func'

     It's as if the 'func' property and its function are being removed from the bossScenario object after it is run. Why is this?
  4. Why doesn't

     setrawcookie('saved_game', $data[0]['data'], time() + (10 * 365 * 24 * 60 * 60), '/');

     overwrite the existing saved_game cookie? It simply refuses to overwrite the existing cookie data. The cookie is being set here, in JavaScript:

     document.cookie = 'saved_game=' + saveCookie + '; expires=Sun, 25 Dec 2020 20:47:11 UTC; path=/';
  5. Wow, you are a life saver! Thank you!
  6. SELECT SUM(`money_earned`), SUM(`army_strength`), SUM(`worker_opm`) FROM `highscores` ORDER BY SUM(`money_earned`) DESC LIMIT $start,$per_page

     Above is my query pulling data from highscores. In the highscores table, there is a field called "group" to indicate the group each row belongs to. The purpose of this is to get the total money of each group by summing all the money from its members' submitted scores, and then ordering by the summed `money_earned` DESC. Is there a way to do this? I.e., get the sums of `money_earned` where all group numbers equal each other, e.g. the total $$$ for everyone in group 4, the total $$$ for everyone in group 3, etc.
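A hedged sketch of the missing piece: GROUP BY collapses the rows of each group to one row, so the SUM()s are computed per group. Column names are assumed from the post, an in-memory SQLite table stands in for the MySQL one, and `group` must stay backquoted because GROUP is a reserved word:

```php
<?php
// In-memory SQLite stands in for the MySQL `highscores` table here;
// column names are assumed from the post.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE highscores (`group` INTEGER, money_earned INTEGER)');
$db->exec('INSERT INTO highscores VALUES (3, 100), (3, 200), (4, 500)');

// GROUP BY collapses each group to one row, so SUM() totals per group.
// `group` needs backticks because GROUP is a reserved word.
$rows = $db->query(
    'SELECT `group`, SUM(money_earned) AS total
       FROM highscores
   GROUP BY `group`
   ORDER BY total DESC'
)->fetchAll(PDO::FETCH_ASSOC);

print_r($rows);
```

In MySQL, the same query shape with `LIMIT $start,$per_page` appended should return one row per group, highest total first.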
  7. I still couldn't figure out how to make these parallel:

     foreach($servers as $server){
         $ip = explode(':', $server['ip']);
         $port = (!isset($ip[1]) || empty($ip[1])) ? '43594' : $ip[1];
         $ip = $ip[0];
         $socket = @fsockopen($ip, $port, $errNo, $errStr, 3);
         if(!$socket){
             //offline
         }else{
             //online
         }
     }
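One way to avoid paying the 3-second timeout once per server is to start every connect non-blocking and then poll them all together, so the total wait is about one timeout rather than one per server. A sketch under the assumption that plain TCP connectability counts as "online" (the function name and details are illustrative, not from the thread):

```php
<?php
// Sketch: start all connects asynchronously, then wait on them together
// with stream_select(). Addresses are "host:port" strings.
function checkServers(array $addrs, float $timeout = 3.0): array
{
    $status  = [];
    $pending = [];
    foreach ($addrs as $addr) {
        // Unlike fsockopen(), this returns immediately without blocking.
        $s = @stream_socket_client(
            "tcp://$addr", $errNo, $errStr, $timeout,
            STREAM_CLIENT_ASYNC_CONNECT | STREAM_CLIENT_CONNECT
        );
        if ($s) {
            $pending[(int)$s] = [$addr, $s];
        } else {
            $status[$addr] = false;   // failed before the connect began
        }
    }

    $deadline = microtime(true) + $timeout;
    while ($pending && microtime(true) < $deadline) {
        // A socket reports writable once its connect attempt completes.
        $write = array_map(fn($p) => $p[1], array_values($pending));
        $read = $except = [];
        if (@stream_select($read, $write, $except, 0, 100000) > 0) {
            foreach ($write as $s) {
                [$addr] = $pending[(int)$s];
                unset($pending[(int)$s]);
                // Writable usually means connected; a refused connect can
                // also look writable on some platforms, so a production
                // checker should follow up with a protocol-level probe.
                $status[$addr] = true;
                fclose($s);
            }
        }
    }
    foreach ($pending as [$addr, $s]) {   // still unconnected: offline
        $status[$addr] = false;
        fclose($s);
    }
    return $status;
}
```

With this shape, checking hundreds of servers from a 10-minute cron costs roughly one timeout window total instead of hundreds of sequential ones.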
  8. I'll have to read back over the manual for fsockopen. Thanks! I'm not sure cURL is what I need here; the servers I'm making requests to aren't HTTP servers.
  9. I have a site where server owners can advertise their own servers, and I want to display these servers' online/offline statuses. When testing just 3 servers with fsockopen, it takes a couple of seconds. When my site grows, it's going to have hundreds of servers. Is there a faster method? I plan on running a cron every 10 minutes to update server statuses.
  10. I have a toplist site that allows users to submit their private game servers to my site, and one of the new features I plan on implementing is online status and uptime percentage.

      I plan to make a cron that runs every 10 minutes, goes through each server registered in the database, and updates its online status (it will have to attempt a connection to each server), then sets the status shown on my website to the online/offline result. On top of that, I will need to keep a record of the offline/online statuses every 10 minutes in a MySQL table in order to calculate the uptime percentage.

      Now, is there a more efficient way to do this? Won't having to make a connection to each server registered in our database be a major resource hog? Here are the stats of my VPS:

      Number of CPUs: 1
      CPU speed: 1000MHz
      Memory: 1GB
      Disk space: 10GB
      Bandwidth limit: 1TB
  11. I'm allowing my users to use a very basic WYSIWYG editor to make their posts a bit fancier; however, I assume this gives them the ability to put raw HTML into their posts. So how would I limit the HTML to only what the editor supports (e.g. images, font color, bold, italics, and strike)?
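A minimal sketch of tag whitelisting with strip_tags() (the tag list is assumed from the post). One caveat worth flagging: strip_tags() keeps attributes on the allowed tags, so on its own it does not stop onclick= or style= payloads; a dedicated filter such as HTML Purifier also sanitizes attributes:

```php
<?php
// Whitelist roughly what the editor emits (tag list assumed).
// CAVEAT: strip_tags() keeps attributes on allowed tags, so it is not
// a complete XSS defense by itself; a real filter (e.g. HTML Purifier)
// cleans attributes too.
$allowed = '<img><b><i><strike><font>';
$input   = '<b>hi</b><script>alert(1)</script><u>under</u>';

$clean = strip_tags($input, $allowed);
echo $clean;  // <script> and <u> tags are stripped, <b> survives
```

Note that strip_tags() removes disallowed tags but keeps their inner text, so the text between a stripped pair still appears in the output.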
  12. Not sure how this is going to help, but O.K.:

      if(!in_array($proxy_split[1], $this->banned_ports)){
          $checked[] = $proxy;
          $this->curl->addSession('http://www.google.com', array(
              CURLOPT_PROXY => $proxy,
              CURLOPT_FOLLOWLOCATION => true,
              CURLOPT_PROXYTYPE => CURLPROXY_HTTP,
              CURLOPT_TIMEOUT => 120,
              CURLOPT_USERAGENT => 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:14.0) Gecko/20100101 Firefox/14.0.1',
              CURLOPT_RETURNTRANSFER => true
          ));
      }
  13. I have a script that tests a compiled list of HTTP proxies to see if they can connect to a specified website. If they connect, and the correct page contents are returned, they are added to a list of working proxies; however, even if I test about 30,000 proxies at a time, none of them come back working. Yet when I check a random selection of them in a proxy checker, quite a large portion of them come back working. Even when I specify the proxy type as HTTP, cURL never manages to make a connection to the webpage and return the webpage contents. Note: I am setting a user-agent. As you can see, no results are returned; the contents of the webpage, if any are retrieved, should be posted in the textbox.
  14. I don't really have any knowledge regarding proxies, but can cURL use a SOCKS4/5 proxy to execute requests? I'm currently using HTTP proxies, and I can never find more when I need them (usually only 1 out of 1000 connects to my specified sites).
  15. It seems as if the cURL option for the proxy just doesn't work at all. I tried this with just a regular setup:

      <?php
      $ch = curl_init('http://www.google.com');
      curl_setopt($ch, CURLOPT_PROXY, '123.103.23.106:20400');
      curl_setopt($ch, CURLOPT_TIMEOUT, 20);
      curl_exec($ch);
      print_r(curl_getinfo($ch));
      curl_close($ch);
      ?>

      And it still returned nothing.
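When curl_exec() comes back empty, the handle's error state says why; a small sketch (the proxy address below is a deliberately dead placeholder, chosen so the failure path runs):

```php
<?php
// When a proxied request "returns nothing", curl_errno()/curl_error()
// report the reason. The proxy here is a placeholder expected to fail.
$ch = curl_init('http://www.google.com');
curl_setopt($ch, CURLOPT_PROXY, '127.0.0.1:1');     // assumed-dead proxy
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);

$page  = curl_exec($ch);    // false on failure with CURLOPT_RETURNTRANSFER
$errNo = curl_errno($ch);   // e.g. 7 = CURLE_COULDNT_CONNECT
$err   = curl_error($ch);   // human-readable reason
curl_close($ch);

if ($page === false) {
    echo "cURL error $errNo: $err\n";
}
```

A non-zero errno (couldn't connect, timeout, couldn't resolve proxy) narrows down whether the proxies themselves are dead or the options are wrong.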
  16. Sorry, usually as soon as I post my problems I come up with a fix afterwards (should probably wait longer, but yeah). I'm using the built-in error function in the cURL class. Added this bit:

      $errors = $this->curl->error();
      print_r($errors);

      And that was my interesting result. There's no way these are all failing; I just double-checked in a proxy checking program, and quite a few came back working.
  17. (Note: cURL is installed on my webhost.) Useful cURL class I'm using: http://paste2.org/3weaghkm

      public function testNewProxies($file, $max_time = 5){
          $proxies = file_get_contents($file);
          $proxies = explode(PHP_EOL, $proxies);
          $fresh = array();

          foreach($proxies as $proxy){
              $this->curl->addSession('http://www.google.com', array(
                  CURLOPT_TIMEOUT => 10,
                  CURLOPT_RETURNTRANSFER => true,
                  CURLOPT_PROXY => $proxy
              ));
          }

          $results = $this->curl->exec();
          $this->curl->clear();
          print_r($results);

          //check for fresh proxies
          for($i = 0; $i < count($proxies); $i++){
              if(strpos($results[$i], 'removed'))
                  $fresh[] = $proxies[$i];
          }

          //write fresh proxies to new file
          $fh = fopen('proxies/proxies_checked_'.time().'.txt', 'w+');
          foreach($fresh as $proxy){
              fwrite($fh, $proxy.PHP_EOL);
          }
          fclose($fh);

          return count($fresh);
      }

      The above function will run through a file full of proxies and test them to see if they're good for use; however, nothing is ever returned. No errors or any data at all come back in $results. I always get this: or Any ideas and/or solutions? Thank you.
  18. EDIT: The below problem seems to be specific to XAMPP. I tried this code on my webhost, and it worked just fine. Weird.

      Because this forum has an annoying policy regarding editing your posts, I guess I'll make another post. I'm using the following cURL class: http://paste2.org/4kZz3pUg

      Whenever I run one session, cURL works fine and the requested results load instantly. But if I add more than one session (even if it's just 2), my page never loads; the favicon is just the Chrome loading symbol.

      So this loads instantly/works:

      <?php
      class test {
          private $url = 'http://www.google.com';
          private $curl;

          function __construct(CURL $curl){
              $this->curl = $curl;
              $opts = array(
                  CURLOPT_RETURNTRANSFER => true,
                  CURLOPT_FOLLOWLOCATION => true
              );
              $curl->addSession($this->url, $opts);
              $results = '<textarea>'.$curl->exec().'</textarea>';
              $curl->close();
              echo $results;
          }
      }
      ?>

      This does not work; the page never loads:

      <?php
      class test {
          private $url = 'http://www.google.com';
          private $curl;

          function __construct(CURL $curl){
              $this->curl = $curl;
              $opts = array(
                  CURLOPT_RETURNTRANSFER => true,
                  CURLOPT_FOLLOWLOCATION => true
              );
              $curl->addSession($this->url, $opts);
              $curl->addSession($this->url, $opts);
              $results = '<textarea>'.$curl->exec().'</textarea>';
              $curl->close();
              echo $results;
          }
      }
      ?>

      I'm running XAMPP.
  19. From the PHP manual page, it looks like you can only set the cURL options for the handles you're including once, so the multiple cURL sessions will only do what you told them to before you added them to the multi_curl handle. But what if I need EVERY handle I added to the multi handle to request more than one page (I want them all to do the same thing!)? If the above isn't put into words properly: how do you go about doing something like this with multi_curl?

      curl_setopt($currenthandle, CURLOPT_URL, 'http://www.google.com');
      curl_exec($currenthandle);

      curl_setopt($currenthandle, CURLOPT_URL, 'http://www.pageineedtopostto.com');
      curl_setopt($currenthandle, CURLOPT_POST, true);
      curl_setopt($currenthandle, CURLOPT_POSTFIELDS, array('param' => 'value'));
      curl_exec($currenthandle);
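For what it's worth, options live on each easy handle rather than on the multi handle, so a handle can be reconfigured between runs. A hedged sketch of one multi pass (the helper name is illustrative, not from the thread); to chain a second request per handle, set new options on each handle after the first pass and drive another loop:

```php
<?php
// Sketch of a plain curl_multi run. Options belong to each easy handle,
// so every handle can be configured (and later reconfigured) on its own.
function fetchAll(array $urls, int $timeout = 10): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive every transfer to completion.
    do {
        curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh, 1.0);
        }
    } while ($running > 0);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        // To chain a second request on this same handle (e.g. a POST),
        // set CURLOPT_URL / CURLOPT_POST / CURLOPT_POSTFIELDS here,
        // re-add the handle, and run the do/while loop again.
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

So the GET-then-POST sequence from the post becomes two multi passes: one for all the GETs, then a reconfigure of every handle, then one for all the POSTs.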
  20. $page = curl_exec($ch);
      if(curl_errno($ch) != 0){
          return false;
      }

      curl_setopt($ch, CURLOPT_URL, '');
      $page = curl_exec($ch);
      curl_close($ch);
  21. curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      $page = curl_exec($ch);

      As you can see, I want cURL to follow any redirects a website throws at me, and I want $page to contain the HTML of the last page loaded; however, $page seems to give me just the very first page and none of the following redirects. Is there any way I can go about this differently to solve my problem?
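With CURLOPT_FOLLOWLOCATION set, curl_exec() should already return the body of the last page in the HTTP redirect chain; redirects done via meta refresh or JavaScript are not followed, which would explain still seeing the first page. A sketch (the helper name is illustrative) that also reports where cURL actually ended up:

```php
<?php
// With CURLOPT_FOLLOWLOCATION, curl_exec() returns the body of the LAST
// page in the chain. Only HTTP redirects (a 3xx status plus a Location
// header) are followed; meta-refresh or JavaScript "redirects" are not.
function fetchFinal(string $url): array
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);

    $info = [
        'body'      => $body,
        'final_url' => curl_getinfo($ch, CURLINFO_EFFECTIVE_URL),
        'redirects' => curl_getinfo($ch, CURLINFO_REDIRECT_COUNT),
    ];
    curl_close($ch);
    return $info;
}
```

If 'redirects' comes back 0, the site never issued an HTTP-level redirect, and whatever sent the browser onward was happening client-side, which cURL cannot follow.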