
iarp

Members
  • Content Count

    326
  • Joined

  • Last visited

Community Reputation

4 Neutral

About iarp

  • Rank
    Advanced Member

Profile Information

  • Gender
    Male
  1. I've given up on PharData. It hasn't been a good experience; I'm back to using exec with tar czf and checking $results !== 0 for errors. The issues so far were the 100-character filename limitation, the tmp folder piling up during script execution, and other annoyances I don't care to deal with anymore.

     # Compression
     exec("tar czf $tgz_file -C $source .", $output, $results);

     # Decompression
     exec("tar xzf $tgz_file -C $output_folder", $output, $results);

     Thanks for your help.
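     The exec calls above can be wrapped up as in this sketch; the helper names are illustrative, and escapeshellarg() is added here so paths with spaces don't break the command:

     ```php
     <?php
     // Sketch of the exec()-based tar approach described above.
     // tar exits non-zero on any error, so $results is all we need to check.

     function tarCompress(string $source, string $tgz_file): bool
     {
         // -C changes into $source first so archived paths stay relative (and short)
         exec(sprintf('tar czf %s -C %s .',
             escapeshellarg($tgz_file),
             escapeshellarg($source)
         ), $output, $results);

         return $results === 0;
     }

     function tarExtract(string $tgz_file, string $output_folder): bool
     {
         exec(sprintf('tar xzf %s -C %s',
             escapeshellarg($tgz_file),
             escapeshellarg($output_folder)
         ), $output, $results);

         return $results === 0;
     }
     ```

     Unlike PharData, tar streams straight to the .tgz, so nothing accumulates in /tmp while the script runs.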
  2. It would have to store it somewhere, and memory isn't a good location when it could be a very large archive by the time it's done. I think I'm going to have to find another way without PharData. It's not the compression, it's PharData itself: if I remove the compress call, it still creates and holds onto the file in tmp.

     Originally I started by calling exec on 'tar czvf $tgz_file $source', but I couldn't properly get the status or any error messages back. One error I'm constantly fighting is file paths that are too long: some of these archives are failing because some file paths are 100+ characters, which PharData craps out on beyond 100.

     I've temporarily modified my script so that it exec's back onto itself; through argv it receives the folder it needs to work on and does its thing. When that standalone PHP call finishes, its tmp files are cleared.
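     The "exec back onto itself" workaround might look something like this sketch; the backup destination and folder list are illustrative, not from the original script:

     ```php
     <?php
     // Child mode: invoked as `php backup.php /path/to/folder`.
     // PharData's /tmp scratch file is released as soon as this
     // short-lived process exits.
     if (isset($argv[1])) {
         $source = rtrim($argv[1], '/');
         $tar = sys_get_temp_dir() . '/' . basename($source) . '.tar';
         $a = new PharData($tar);
         $a->buildFromDirectory($source);
         $a->compress(Phar::GZ, 'tgz');
         exit(0);
     }

     // Parent mode: spawn one child per folder so /tmp never holds more
     // than one scratch file at a time.
     foreach (['/var/www/site1', '/var/www/site2'] as $folder) {
         exec('php ' . escapeshellarg(__FILE__) . ' ' . escapeshellarg($folder),
             $out, $status);
         if ($status !== 0) {
             error_log("Backup of $folder failed with status $status");
         }
     }
     ```

     The non-zero exit status from the child is what gives the parent the error reporting that a bare tar exec made awkward.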
  3. That's what I'm trying to figure out, unlinkArchive only gets rid of the first .tar that it creates. Both unset($a) and $a = null; do nothing.
  4. I have the following code that creates a tar file and then compresses it into a .tgz file:

     $a = new PharData($tar_file);
     $a->buildFromDirectory($source);
     $a->compress(Phar::GZ, 'tgz');
     @unlink($tar_file);

     The script backs up a number of different folders depending on time, day, etc. I can understand why PharData is creating the temporary files in /tmp/; they're all named things like ./php8VMsXV and ./phpaZsDDn and are slightly larger than the uncompressed .tar file, so I'm guessing PharData uses each as a temporary unnamed tar file, copies the contents into a properly named tar file, and then compresses the .tar into the .tgz as I've specified above.

     The issue is that PharData doesn't delete those /tmp/ files as it goes; it leaves them all in /tmp/ and only deletes them when the script finishes. Sometimes I have 35 directories that need to be archived and then uploaded to Amazon. This is killing hard drive space on the cloud machines we have: we may have 20gb free, but 35 tmp files take up the entire hard drive and then all the scripts start failing.

     I've tried unset($a) just after the compress and a couple other things that I've forgotten now. I can only find one thing online that seems to mention a potential issue with it: https://bugs.php.net/bug.php?id=70417 Any ideas?
  5. Nowhere in there are you checking to see if anything was submitted. Typically you'd want to wrap the form processing code in something like:

     if (isset($_POST['submit']))

     which checks whether the submit button was clicked, at which point you know it's time to process the form. Otherwise it means someone landed on the page without submitting anything, so there's no reason to run the form processing.
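     As a minimal sketch of that gate (the "name" field and error messages are illustrative, not from the poster's form):

     ```php
     <?php
     // Only validate and process when the submit button was actually clicked.
     function processForm(array $post): array
     {
         $errors = [];

         if (!isset($post['submit'])) {
             return $errors; // plain page view: just render the form
         }

         $name = trim($post['name'] ?? '');
         if ($name === '') {
             $errors[] = 'Name is required.';
         }
         // ... process the rest of the form here ...

         return $errors;
     }
     ```

     Calling it with $_POST keeps the processing path and the render-the-form path cleanly separated.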
  6. All you need to do is modify the time given in the setCookie() call. You can see it currently says 24*60*60*1000.
  7. Your issue is within function checkTarget(e). If you really want it to pop up that often, remove the if statement wrapping the popup code. You could also get away with just commenting out the setCookie line.
  8. What exactly are you trying to accomplish that requires processing after the redirection? As long as no other headers or output have been sent yet, that header will send the user on their way; just follow it with exit so the rest of the script doesn't keep running.
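     The usual pattern, sketched here with an illustrative target URL:

     ```php
     <?php
     // header() queues the redirect in the response; it does not stop PHP.
     // The immediate exit guarantees nothing below it runs.

     header('Location: /login.php'); // must come before any output is sent
     exit; // without this, the remainder of the script still executes
     ```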
  9. You might want to have a look at what the following code outputs; I highly doubt it's doing what you think it's doing:

     echo stripslashes($_GET['source_file']);
  10. Have you had a look at the browser's console to see if any JavaScript errors are occurring? In Chrome, press F12 and go to the Console tab.
  11. $sql .= " LIMIT $start, " . NUMBER_PER_PAGE;

      Where does $start come from and what does it contain? Where does NUMBER_PER_PAGE come from and what does it contain? The first value given to LIMIT is the offset into the results; the second is how many rows to return.
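      For context, $start and NUMBER_PER_PAGE typically fit together like this sketch; the table and query are illustrative, but the variable names follow the snippet above:

      ```php
      <?php
      define('NUMBER_PER_PAGE', 10);

      // Page number from the query string, defaulting to the first page.
      $page  = max(1, (int)($_GET['page'] ?? 1));

      // Offset of the first row to return: page 1 -> 0, page 2 -> 10, ...
      $start = ($page - 1) * NUMBER_PER_PAGE;

      $sql  = "SELECT id, title FROM articles ORDER BY id";
      $sql .= " LIMIT $start, " . NUMBER_PER_PAGE;
      ```

      So a bad value in $start (or an undefined NUMBER_PER_PAGE) shows up directly in the generated LIMIT clause.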
  12. You'll want to have a look at PayPal's SDK for PHP. https://github.com/paypal/PayPal-PHP-SDK
  13. You need to re-read Ch0cu3r's post. It answers all of the issues you're having. You're calling new Game twice in manage-games.php, and you need to change $db into $this->db.
  14. Do you have control of the sending side, wherever $url is pointing to? Can you give us the output you see from echo $results;?
  15. You need to create a unique constraint on all 3 columns at the same time. That would prevent you from adding a row such as userid=1, day=2, meal=1 twice; it would throw a constraint-type error if you tried to. http://stackoverflow.com/a/635943

      mysql> ALTER TABLE bradba ADD UNIQUE `uq_userdatemeal_ids`(`userid`, `dateid`, `mealid`);
      Query OK, 0 rows affected (0.25 sec)
      Records: 0  Duplicates: 0  Warnings: 0

      mysql> INSERT INTO bradba (userid, dateid, mealid) VALUES (1, 2, 1);
      Query OK, 1 row affected (0.03 sec)

      mysql> INSERT INTO bradba (userid, dateid, mealid) VALUES (1, 2, 1);
      ERROR 1062 (23000): Duplicate entry '1-2-1' for key 'uq_userdatemeal_ids'

      mysql> INSERT INTO bradba (userid, dateid, mealid) VALUES (1, 2, 2);
      Query OK, 1 row affected (0.05 sec)
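      On the PHP side, the duplicate shows up as MySQL error 1062, which PDO raises as an exception. A sketch of reacting to it (the $pdo connection is assumed to exist; table and column names follow the example above):

      ```php
      <?php
      try {
          $stmt = $pdo->prepare(
              'INSERT INTO bradba (userid, dateid, mealid) VALUES (?, ?, ?)'
          );
          $stmt->execute([1, 2, 1]);
      } catch (PDOException $e) {
          // errorInfo[1] holds the driver-specific code; 1062 = duplicate key.
          if ($e->errorInfo[1] === 1062) {
              // This (userid, dateid, mealid) combination was already saved.
          } else {
              throw $e; // some other database problem: let it bubble up
          }
      }
      ```

      Letting the database enforce uniqueness avoids the race condition of a SELECT-then-INSERT check.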