iarp

Members · 326 posts
Everything posted by iarp

  1. I've given up on PharData. It hasn't been a good experience, so I'm back to using exec with tar czf and checking $results !== 0 for errors. The issues so far were the 100-character filename limitation, the tmp folder piling up during script execution, and other annoyances I don't care to deal with anymore.

     # Compression
     exec("tar czf $tgz_file -C $source .", $output, $results);

     # Decompression
     exec("tar xzf $tgz_file -C $output_folder", $output, $results);

     Thanks for your help.
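     A minimal sketch of the $results check mentioned above; $tgz_file and $source are assumed to already hold valid paths from earlier in the script.

     exec("tar czf " . escapeshellarg($tgz_file) . " -C " . escapeshellarg($source) . " .", $output, $results);
     if ($results !== 0) {
         # tar exits non-zero on failure; $output holds whatever it printed.
         error_log("tar failed with exit code $results: " . implode("\n", $output));
     }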
  2. It would have to store it somewhere, and memory isn't a good location when it could be a very large archive by the time it's done. I think I'm going to have to find another way without PharData. It's not the compression, it's PharData itself: if I remove the compress call, it's still creating and holding onto the file in tmp. Originally I started by calling exec with 'tar czvf $tgz_file $source' but I couldn't properly get the status or any error messages back. One error I'm constantly fighting is file paths that are too long. Some of these archives are failing because some file paths are 100+ characters, which PharData craps out at beyond 100. I've temporarily modified my script so that it exec's back onto itself; through argv it receives the folder that it needs to work on and does its thing, and since each of those is a standalone php call, its tmp files are cleared when it finishes.
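     A rough sketch of that exec-back-onto-itself workaround; backup.php and the argv layout are hypothetical names for illustration.

     # Parent: one fresh PHP process per folder, so its /tmp files are freed on exit.
     foreach ($folders as $folder) {
         exec("php backup.php " . escapeshellarg($folder), $output, $results);
     }

     # Child (backup.php): receives the folder to archive via $argv.
     $source = $argv[1];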
  3. That's what I'm trying to figure out: unlinkArchive only gets rid of the first .tar that it creates, and both unset($a) and $a = null; do nothing.
  4. I have the following code that creates a tar file and then compresses it into a .tgz file.

     $a = new PharData($tar_file);
     $a->buildFromDirectory($source);
     $a->compress(Phar::GZ, 'tgz');
     @unlink($tar_file);

     The script backs up a number of different folders depending on time, day, etc. I can understand why PharData is creating the temporary files in /tmp/. They all have names like ./php8VMsXV and ./phpaZsDDn and are slightly larger than the uncompressed .tar file, so I'm guessing PharData uses each one as a temporary unnamed tar file, copies the contents into a properly named tar file, and then compresses the .tar into the .tgz as I've specified above. The issue is that PharData doesn't delete those /tmp/ files as it goes; it leaves them all in /tmp/ and only deletes them at the end of script execution. Sometimes I have 35 directories that need to be archived and then uploaded to Amazon. This is killing hard drive space on the cloud machines we have: we may have 20gb free, but 35 tmp files take up the entire hard drive and then all the scripts start failing. I've tried unset($a) just after the compress and a couple of other things that I've forgotten now. I can only find one thing online that seems to mention a potential issue with it, https://bugs.php.net/bug.php?id=70417 Any ideas?
  5. Nowhere in there are you checking whether anything was actually submitted. Typically you'd want to wrap the form-processing code in something like if (isset($_POST['submit'])), which checks whether the submit button was clicked, at which point you know it's time to process the form. Otherwise it means someone landed on the page without submitting anything, so there's no reason to run the form processing.
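     A minimal sketch of that wrapper, assuming the submit button's name attribute is 'submit' and process_form() is a hypothetical function holding your existing code:

     if (isset($_POST['submit'])) {
         process_form($_POST);   // the form was submitted, so process it
     } else {
         // plain page load; just display the form, nothing to process
     }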
  6. All you need to do is modify the time given within the setCookie( call. You see it currently says 24*60*60*1000.
  7. Your issue is within function checkTarget(e). If you really want it to pop up that often, remove the if statement wrapping the popup code. You could also get away with just commenting out the setCookie line.
  8. What exactly are you trying to accomplish that requires processing after the redirection? As long as no other headers or output have been sent yet, that header will send the user on their way; just call exit right after it so the rest of the script doesn't keep running.
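     A minimal redirect sketch; /somewhere-else.php is a placeholder target:

     # The header must be sent before any other output, and the exit stops
     # the rest of the script from running for this request.
     header('Location: /somewhere-else.php');
     exit;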
  9. You might want to have a look at what the following code outputs; I highly doubt it's doing what you think it's doing.

     echo stripslashes($_GET['source_file']);
 10. Have you had a look at the browser's console to see if any javascript errors are occurring? In Chrome, press F12 and go to the Console tab.
 11. $sql .= " LIMIT $start, " . NUMBER_PER_PAGE;

     Where does $start come from and what does it contain? Where does NUMBER_PER_PAGE come from and what does it contain? The first value for LIMIT is the offset into the results; the second is the limit of how many rows to return.
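     A common pagination sketch, assuming $_GET['page'] carries the page number and NUMBER_PER_PAGE is a defined constant:

     $page  = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
     $start = ($page - 1) * NUMBER_PER_PAGE;   // offset: page 1 starts at row 0
     $sql  .= " LIMIT $start, " . NUMBER_PER_PAGE;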
  12. You'll want to have a look at PayPal's SDK for PHP. https://github.com/paypal/PayPal-PHP-SDK
 13. You need to re-read Ch0cu3r's post. It answers all of the issues you're having: you're calling new Game twice in manage-games.php, and you need to change $db to $this->db.
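     A minimal sketch of the $this->db fix; the Game class layout here is an assumption based on the thread, not the poster's actual code.

     class Game {
         private $db;

         public function __construct(PDO $db) {
             $this->db = $db;   // keep the connection on the instance
         }

         public function fetchAll() {
             // Use $this->db inside methods, not an undefined local $db.
             return $this->db->query("SELECT * FROM games")->fetchAll(PDO::FETCH_ASSOC);
         }
     }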
 14. Do you have control of the sending side, wherever $url is pointing to? Can you give us the output you see from echo $results;
 15. You need to create a unique constraint on all 3 columns at the same time. That would prevent you from adding a row such as userid=1, day=2, meal=1 twice; it would throw a constraint-type error if you tried to. http://stackoverflow.com/a/635943

     mysql> ALTER TABLE bradba ADD UNIQUE `uq_userdatemeal_ids`(`userid`, `dateid`, `mealid`);
     Query OK, 0 rows affected (0.25 sec)
     Records: 0  Duplicates: 0  Warnings: 0

     mysql> INSERT INTO bradba (userid, dateid, mealid) VALUES (1, 2, 1);
     Query OK, 1 row affected (0.03 sec)

     mysql> INSERT INTO bradba (userid, dateid, mealid) VALUES (1, 2, 1);
     ERROR 1062 (23000): Duplicate entry '1-2-1' for key 'uq_userdatemeal_ids'

     mysql> INSERT INTO bradba (userid, dateid, mealid) VALUES (1, 2, 2);
     Query OK, 1 row affected (0.05 sec)
 16. Perhaps this?

     if (!function_exists('is_json')) {
         function is_json($json) {
             // Decode as an associative array so JSON objects also pass is_array().
             $json = json_decode($json, true);
             return is_array($json);
         }
     }
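     A quick usage check of the helper above; note that scalars like '123' are valid JSON but return false here, since only arrays and objects pass.

     var_dump(is_json('{"name": "value"}'));   // bool(true)
     var_dump(is_json('not json at all'));     // bool(false)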
 17. (Topic: "Help Please") You'll need to post the php script that you're filling out so we can see if they have coded in any restrictions.
 18. What version of PHP are you running? json_last_error() was added in 5.3.0. If you create a test.php with the contents below and view the page, it'll tell you what version you're running.

     <?php phpinfo();
 19. I decided to try running SQL Server Profiler to figure out if maybe SQL was taking forever to issue the query. So I'm sitting here, Profiler on one screen and my test script on another. I refresh my test script and see the connection open within Profiler. It sits there for 50 seconds exactly, then I see the query get issued to SQL and executed; 3 seconds later it returns all the data.
 20. Hi All, I am at a loss. I have a web server that used to run beautifully (IIS/PHP) and something has just caused it to tank. I can run queries directly against the SQL Server database and get 700+ row results in 3 seconds. However, when I connect via PDO, ->prepare the query and then ->execute, the execution can take 50+ seconds to run. Even just a ->query takes the same length of time. The query I am issuing has no parameters; the values are hardcoded and controlled by me. I'm at a loss as to what to look into next; nothing I do changes the time factor. I've tried turning off emulated prepares, but prepares happen instantly. I've been tracking how long it takes and this is what my log looks like:

     START|2015-09-23 21:48:50
     PREPARED|2015-09-23 21:48:50
     EXECUTED|2015-09-23 21:49:43
     FETCHED|2015-09-23 21:49:44
     SENT|2015-09-23 21:49:44
 21. Looking at the code you provided, what you linked to, and this page http://docs.whmcs.com/Hooks I think I know what you're wanting. You need to move the code you have within {php}{/php} into a function and then, using add_hook, define when that function should be called so that you may access its information in the template file.

     function hook_template_variables_example($vars) {
         if( $qrystr = strpos( $_SERVER['REQUEST_URI'], '?' ) )
             $url = substr( $_SERVER['REQUEST_URI'], 0, $qrystr );
         elseif( $qrystr = strpos( $_SERVER['REQUEST_URI'], '#' ) )
             $url = substr( $_SERVER['REQUEST_URI'], 0, $qrystr );
         else
             $url = $_SERVER['REQUEST_URI'];

         $prefix = "Title";
         $default = "$prefix - blah blah blah";

         $extraTemplateVariables = array(
             'title_url' => $default,
             'title_root' => $default,
             'title_index' => $default,
             'title_affiliates' => $prefix . " - Affiliates",
         );

         return $extraTemplateVariables;
     }
     add_hook('ClientAreaPageViewTicket', 1, 'hook_template_variables_example');

     That is what I came up with using the code you provided. It attaches itself to the hook point ClientAreaPageViewTicket which, according to that page, runs when the View Ticket page is displayed. From within the template file itself, because the hook is called and it returned that array of variables, I can now access the data by calling {$title_url}, {$title_root}, {$title_index} or {$title_affiliates}. None of this is tested; I don't have whmcs and I'm just going on what the documentation leads me to believe.
 22. The function you linked to via gravitywiz seems to be a function used to validate one form's field against another, not a lookup system using a CSV. If anything, I would expect you to have to use a CSV import utility to put the RSVP numbers into the backend and then use the function to compare two forms. I don't know gravity forms at all; I'm just going based off what you posted above and the page you linked to.
  23. I believe you're getting that because the request is either too large or taking too long for the receiving server and it is timing out before your script finishes sending data. I would try breaking it into more than one request.
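     A rough sketch of splitting one large request into several smaller ones; $items, the endpoint URL, and the chunk size of 100 are all placeholders:

     foreach (array_chunk($items, 100) as $chunk) {
         $ch = curl_init('https://example.com/endpoint');
         curl_setopt($ch, CURLOPT_POST, true);
         curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('data' => $chunk)));
         curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
         $response = curl_exec($ch);
         curl_close($ch);
     }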
 24. Ah ha, you could make good use of transactions for this issue. I have taken mac_gyver's suggestion of using $errors along with PDO's transaction system to keep track of errors and choose whether to commit to the database or rollBack. Making use of $errors allows you to collect all possible errors that occurred with the form and display them to the user, so they could potentially fix every problem with a single form submission. Your original script would make them continuously resubmit the form, checking every field individually, which is very annoying if you run into it in real life.

     <?php
     if (isset($_POST['submit'])) {
         $record_name = trim($_POST['record_name']);
         $record_details = trim($_POST['record_details']);
         $record_type = trim($_POST['record_type']);
         $date = date('Y-m-d H:i:s');

         $errors = array();
         $allowed = array('jpg', 'jpeg', 'png', 'gif');   # extensions you permit

         if (empty($record_name)) {
             $errors[] = 'record name is required!';
         }
         if (empty($record_details)) {
             $errors[] = 'record details are required!';
         }
         if (empty($record_type)) {
             $errors[] = 'Type is required!';
         }

         $db->beginTransaction();

         $insert_record = $db->prepare("INSERT INTO records(record_name, record_type, record_details, date) VALUES(:record_name, :record_type, :record_details, :date)");
         $insert_record->bindParam(':record_name', $record_name);
         $insert_record->bindParam(':record_type', $record_type);
         $insert_record->bindParam(':record_details', $record_details);
         $insert_record->bindParam(':date', $date);
         if (!$insert_record->execute()) {
             $errors[] = 'record not added!';
         }
         $record_id = $db->lastInsertId();

         if (!empty($_FILES['files'])) {
             $userdir = $_SERVER['DOCUMENT_ROOT'] . '/comp/images/';
             if (!is_dir($userdir)) {
                 mkdir($userdir, 0775, true);
             }

             foreach ($_FILES['files']['tmp_name'] as $key => $tmp_name) {
                 # Build the destination from the original filename, not the tmp path.
                 $file_path = $userdir . basename($_FILES['files']['name'][$key]);
                 $type = strtolower(pathinfo($_FILES['files']['name'][$key], PATHINFO_EXTENSION));

                 if (is_uploaded_file($tmp_name)) {
                     if (in_array($type, $allowed)) {
                         try {
                             $insert_image = $db->prepare("INSERT INTO images(record_id, image_path, date) VALUES(:record_id, :image_path, :date)");
                             $insert_image->bindParam(':record_id', $record_id);
                             $insert_image->bindParam(':image_path', $file_path);
                             $insert_image->bindParam(':date', $date);
                             if (!$insert_image->execute()) {
                                 $errors[] = 'There was a problem!';
                             } else {
                                 move_uploaded_file($tmp_name, $file_path);
                                 echo 'Your image has been saved.';
                             }
                         } catch (Exception $e) {
                             $errors[] = $e->getMessage();
                         }
                     } else {
                         $errors[] = 'You have uploaded a forbidden extension!';
                     }
                 } else {
                     $errors[] = 'File is empty!';
                 }
             }
         } else {
             $errors[] = 'No files supplied';
         }

         # If the errors array is empty then commit all of the changes we've made above to the database.
         # Otherwise roll back all changes/inserts and undo everything we just did.
         if (empty($errors)) {
             $db->commit();
         } else {
             $db->rollBack();
         }
     }
 25. Just to clarify something: you're inserting 1 record and multiple images, correct? If so, you would insert into the records table first to obtain the record id and then insert the multiple images using the for loop. If my assumption is correct, then your $insert_record section should be moved to before the "if(isset($_FILES['files'])){" part. $record_id would come from the id of the record that was inserted, via $db->lastInsertId() after the $insert_record->execute() statement.