
joeshanley

New Members
  • Posts

    2
  • Joined

  • Last visited

    Never

Everything posted by joeshanley

  1. I've just created this. I think this will be quicker for storing and then reading the CSV data.

     ```php
     function load_csv($file, $identifier, $commission, $purchase_value)
     {
         $transaction_data = array();
         $lines = file($file) or die('Could not open file');
         foreach ($lines as $line) {
             $row_data = explode(',', $line);
             $new_row[0] = $row_data[$identifier];
             $new_row[1] = $row_data[$commission];
             $new_row[2] = $row_data[$purchase_value];
             $transaction_data[] = $new_row;
         }
         return $transaction_data;
     }
     ```

     Is there any better way than this? Or can this code be easily improved? Help appreciated. Thanks.
  2. Hi, I am updating a few parts of my current website. The website updates user accounts by uploading CSV data reports. I currently use fgetcsv to deal with each row, and it is taking forever. This is the code that runs for each row:

     ```php
     while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
         $grabinfo = "SELECT * FROM transactions WHERE id = '$data[$i]'";
         $grabinforesult = mysql_query($grabinfo) or die(mysql_error());
         while ($row = mysql_fetch_array($grabinforesult)) {
             $merchant = $row['merchant'];
             $username = $row['username'];
             $datetime = $row['datetime'];
             $letthem  = $row['letthem'];
             $validate = "SELECT * FROM transactions WHERE merchant = '$merchant' "
                       . "AND username = '$username' "
                       . "AND (status = 'pending' OR status = 'success' OR status = 'confirmed') "
                       . "AND datetime = '$datetime'";
             $validator = mysql_query($validate) or die(mysql_error());
             if (mysql_num_rows($validator) >= $letthem) {
                 $import = "UPDATE transactions SET amount = '0.00' "
                         . "WHERE id = '$data[$i]' AND status = 'awaiting'";
                 mysql_query($import) or die(mysql_error());
             }
         }
     }
     fclose($handle);
     ```

     (I have removed a stray duplicate `mysql_query($import)` call that sat after the inner while loop; it would re-run the last UPDATE, or fail when `$import` was never set.) This performs lots of checks on the CSV data, validating each row against the system. Is there a quicker way of reading the CSV data? There are lots of checks and while loops within the fgetcsv while loop, and it can take up to half an hour to process a file with 1000 CSV rows. Hope someone can offer some advice. Thanks.
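On the `load_csv` function in the first post: a minimal sketch of an alternative, assuming the same positional column indexes. `str_getcsv` handles quoted fields and embedded commas that a plain `explode(",")` would split incorrectly, and the `FILE_IGNORE_NEW_LINES` flag avoids trailing newlines ending up in the last column.

```php
<?php
// Sketch only: same signature as the original load_csv, but parsing each
// line with str_getcsv so quoted fields ("a,b") survive intact.
function load_csv($file, $identifier, $commission, $purchase_value)
{
    $transaction_data = array();
    $lines = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    if ($lines === false) {
        die('Could not open file');
    }
    foreach ($lines as $line) {
        $row_data = str_getcsv($line);
        $transaction_data[] = array(
            $row_data[$identifier],
            $row_data[$commission],
            $row_data[$purchase_value],
        );
    }
    return $transaction_data;
}
```

Behaviour is otherwise the same: it returns one three-element array per CSV row, indexed 0/1/2 as identifier, commission, and purchase value.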
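On the slow `fgetcsv` loop in the second post: the main cost is the several SQL round-trips made per CSV row (roughly 2000+ queries for a 1000-row file). One common fix is to read the whole CSV first, then fetch every matching transaction in a single `IN (...)` query and do the per-row validation against an in-memory lookup. A minimal sketch, assuming the transaction id is in column 0 and the table/column names from the post:

```php
<?php
// Sketch only: gather all ids up front so the per-row SELECTs collapse
// into one query. Column 0 is assumed to hold the transaction id.
function collect_csv_ids($file)
{
    $ids = array();
    $handle = fopen($file, 'r') or die('Could not open file');
    while (($data = fgetcsv($handle, 1000, ',')) !== false) {
        $ids[] = (int) $data[0];
    }
    fclose($handle);
    return $ids;
}

// Build one lookup query for every row in the file, e.g.
//   SELECT ... FROM transactions WHERE id IN (1,2,3)
// (Caller should skip the query entirely when $ids is empty.)
function build_lookup_sql(array $ids)
{
    return 'SELECT id, merchant, username, datetime, letthem '
         . 'FROM transactions WHERE id IN (' . implode(',', $ids) . ')';
}
```

The result set can then be indexed by id in a PHP array for O(1) lookups, the duplicate check done against data fetched the same way, and the final UPDATEs batched with one `UPDATE ... WHERE id IN (...)`. Casting the ids to int also avoids interpolating raw CSV values into SQL, which the original code does; with the newer mysqli/PDO APIs, prepared statements would be the safer route.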
