slamMan Posted October 23, 2009

I have a problem with a piece of code I wrote to import records from a CSV file into MySQL. The file has about 300 records in it. The code worked perfectly while testing on 3 or 4 records, but when I tried to import the entire list I ran into gaps of records that had no data. I tested some of the records that were being missed, thinking that a character in the record was making SQL choke, but when I ran blocks of 5 or 10 of the "bad" records on their own, they worked.

This is the error message:

```
Warning: array_combine() [function.array-combine]: Both parameters should have an equal number of elements in
```

At some point the insert starts inserting records again, and then it hits another block of errors. I am thinking this has to do with some kind of memory limit, but the CSV file is only 66 KB. Any ideas?

Here is the code:

```php
$handle = fopen($_FILES['filename']['tmp_name'], "r");
if ($handle) {
    set_time_limit(0);

    // the top line is the field names
    $fields = fgetcsv($handle, 4096, ',');

    // loop through one row at a time
    while (($data = fgetcsv($handle, 4096, ',')) !== FALSE) {
        $data = array_combine($fields, $data);

        // create a Joomla password; keep the plain-text and encrypted+salt values
        $password = generatePassword($length = 9, $strength = 4);
        $data['password'] = $password;

        require_once 'libraries/joomla/user/helper.php';

        // use the Joomla helpers to salt and encrypt the password
        $salt  = JUserHelper::genRandomPassword(32);
        $crypt = JUserHelper::getCryptedPassword($password, $salt);
        $md5password = $crypt . ':' . $salt;
        $data['md5password'] = $md5password;

        // insert customer info into the jos_users table
        $jos_users = sprintf(
            "INSERT INTO jos_users_test (name, username, email, password, usertype, sendEmail, gid, registerDate)
             VALUES ('%s', '%s', '%s', '%s', '%s', '%d', '%d', NOW())",
            sql_prep($data['Contact'], $db),
            sql_prep($data['Customer'], $db),
            $data['Email'],
            $data['md5password'],
            'Registered', '1', '18');
        mysql_query($jos_users) or die(mysql_error());
```

Here is the output of the array. You can tell [15] doesn't have any of the data that [14] and all the rest have. Then it will pick up at some point, a bunch will have data, and then it will choke again.

```
[14] => Array
    (
        [Customer] => XXXXXXXXX
        [Contact] => 
        [Phone] => XXXXXXXXXXXX
        [Fax] => 
        [Alt. Phone] => 
        [Alt. Contact] => 
        [Email] => tXXXXXX@yahoo.com
        [bill to 1] => XXXXX
        [bill to 2] => XXXXXXXXX
        [bill to 3] => XXXXXXXX
        [bill to 4] => 
        [bill to 5] => 
        [ship to 1] => XXXXXXX
        [ship to 2] => XXXXXXXXX
        [ship to 3] => XXXXXXXX
        [ship to 4] => 
        [ship to 5] => 
        [password] => password
        [md5password] => 461543387f04a4848d38a6d6fd7376cc:0GHs3PRXbdsqbeghoiZnKxJeudwAYtv3
    )

[15] => Array
    (
        [password] => password
        [md5password] => d2db7721b7df8f151ef342a3a72aa32d:bwGLFCJnCdDTjDZH2b5kqfIL5QJ3z42h
    )
```
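The failure described above, where rows silently lose their data when the combine step fails, can be reproduced outside PHP. Below is a minimal Python analogue of the import loop (the sample data is made up for illustration), with an explicit guard at the point where PHP's array_combine() would emit its warning:

```python
import csv
import io

# Hypothetical sample mirroring the thread: a header row plus data rows,
# one of which contains an unquoted comma that adds an extra field.
raw = (
    "Customer,Contact,Email\n"
    "Acme,Alice,alice@example.com\n"
    "Bolt Co,Bob,Smithville, TX,bob@example.com\n"  # unquoted "city, st" splits
)

reader = csv.reader(io.StringIO(raw))
fields = next(reader)  # header row, like the first fgetcsv() call

rows, bad = [], []
for lineno, data in enumerate(reader, start=2):
    if len(data) != len(fields):          # where array_combine() would warn
        bad.append(lineno)
        continue
    rows.append(dict(zip(fields, data)))  # the array_combine() equivalent

print(rows)  # only the well-formed row survives
print(bad)   # [3]
```

In PHP the loop keeps running after the warning, so the insert still fires with a broken `$data` array, which is why the rows show up in the table holding nothing but the two manually assigned password values.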
cags Posted October 24, 2009

How did you create that output? Presumably, the way your CSV is laid out, the first row contains field names and all remaining rows contain data? By the looks of it, your first line must contain...

```
Customer,Contact,Phone,Fax,Alt. Phone,Bill to 3,Bill to 4,Bill to 5,Ship to 1,Ship to 2,Ship to 3,Ship to 4,Ship to 5,password,md5password
```

That's 15 fields in total. You say you are getting the error when there is a gap in the fields; can you give an example of what a line with gaps looks like?
slamMan (Author) Posted October 24, 2009

It's not that the lines of the CSV have gaps. Although some lines are missing a couple of fields, the lines in question are really no different from the others; in fact, I have shuffled the lines, and the problem occurs at the same spot in the insert regardless of which order the lines are in. The gaps I speak of are in the SQL insert, and they show up in the $data variable after the file is read into it.

For example: my CSV file has 300 records, all of which appear to be exactly the same. At about record 16 it starts inserting blank records where full records should be, empty except for the two values I am inserting manually, $data['password'] and $data['md5password'], which show up in both the SQL insert and the print_r output of $data. Then at about record 30 or so it starts inserting full records again, having skipped all the records in between. It repeats this pattern several times further into the import. All in all I lose around 100 records.

It seems like some rows are being dropped while the file is looping, and then I get the error regarding array_combine() due to the missing rows. Could this be due to memory on such a small file? I am doing this on a XAMPP instance on my local machine for testing.

And yes, the first line does contain what you have there. Every row contains info in most fields, but not all. No row contains info in more fields.
cags Posted October 24, 2009

I highly doubt it could be a memory issue with such a small file. The error message you are getting will happen whenever the array returned by fgetcsv() does not contain the same number of columns/items as $fields does. To my knowledge, the only thing likely to cause that is a different number of unescaped commas on the line in question.
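That diagnosis can be checked directly: a comma inside an unquoted field shifts the column count, while the same comma inside a quoted field does not. A quick illustration using Python's csv module (the address value is invented), since the CSV format rules are the same ones fgetcsv() follows:

```python
import csv
import io

# The same address, written with and without CSV quoting.
unquoted = "Acme,Smithville, TX 75965"
quoted   = 'Acme,"Smithville, TX 75965"'

row_bad  = next(csv.reader(io.StringIO(unquoted)))
row_good = next(csv.reader(io.StringIO(quoted)))

print(len(row_bad))   # 3 fields: the comma split the address
print(len(row_good))  # 2 fields: the quotes protected it
```

A row parsed this way no longer lines up with the header, which is exactly the condition array_combine() complains about.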
slamMan (Author) Posted October 24, 2009

I will look back through the rows to see about commas. The data is being run through this function:

```php
function sql_prep($var) {
    return mysql_real_escape_string($var);
}
```

which I thought would handle that problem.
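It's worth separating the two kinds of escaping here: mysql_real_escape_string() escapes characters that are special to SQL, and a comma is not one of them. Comma handling is the job of whatever wrote the CSV file, before PHP ever reads it. A sketch of the distinction in Python (the simplified quote-escaping line is only a stand-in for SQL escaping, not a real implementation of it):

```python
import csv
import io

value = "Smithville, TX 75965"

# SQL-style escaping leaves the comma untouched; it is not a SQL metacharacter.
sql_escaped = value.replace("'", "\\'")
print(sql_escaped)  # Smithville, TX 75965

# CSV escaping is the writer's job: a conforming writer quotes any field
# that contains the delimiter.
buf = io.StringIO()
csv.writer(buf).writerow(["Acme", value])
print(buf.getvalue().strip())  # Acme,"Smithville, TX 75965"
```

So sql_prep() protects the INSERT statement, but it runs far too late to protect the CSV parse.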
cags Posted October 24, 2009

The error you posted is being thrown well before that code ever runs...

```php
// fetch the column names
$fields = fgetcsv($handle, 4096, ',');

// fetch an array for each line in turn
while (($data = fgetcsv($handle, 4096, ',')) !== FALSE) {
    // merge the two arrays, using $fields as keys and $data as values
    $data = array_combine($fields, $data);
```

So if $data holds a different number of elements to $fields, it will error.
DavidAM Posted October 24, 2009

Is it possible that one of the lines is more than 4096 characters long, so you only get part of it, and then for a while you are getting data from two partial rows? Is it possible that one of the lines does not have matching opening and closing quotes in one or more fields?

Try adding this just before the array_combine() and see what you get:

```php
if (count($data) != count($fields)) {
    echo 'Incorrect column count in record: ';
    print_r($data);
    exit;
}
```

What output do you get?
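That check exits at the first bad record. Since the thread describes several separate blocks of bad rows, it can help to scan the whole file first and report every mismatched line in one pass. A Python sketch of that idea (the file contents are invented for illustration):

```python
import csv
import io

# Hypothetical stand-in for the uploaded file: a header plus three rows,
# the middle one carrying an unquoted comma.
raw = (
    "Customer,Contact,Email\n"
    "Acme,Alice,alice@example.com\n"
    "Bolt Co,Bob,Smithville, TX\n"
    "Cogs Inc,Carol,carol@example.com\n"
)

reader = csv.reader(io.StringIO(raw))
fields = next(reader)

# Collect (line number, column count) for every row that doesn't match the header.
mismatches = [
    (lineno, len(row))
    for lineno, row in enumerate(reader, start=2)
    if len(row) != len(fields)
]
print(mismatches)  # [(3, 4)]
```

With the full list of offending line numbers in hand, it is much easier to open the file and inspect those exact lines for stray commas or unbalanced quotes.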
slamMan (Author) Posted October 25, 2009

As I take a closer look at the data, it may be that extra commas in the data are messing things up. I am going to try it with a tab-delimited file.
slamMan (Author) Posted October 25, 2009

OK, that didn't work. The tab-delimited file didn't import very well at all. Back to the drawing board.

I took out the "bill to" and "ship to" fields, which had data in the format "city, st zip", with the quotes and comma. Without those fields the data imported without error, so the problem is obviously there. The trouble is I don't know how to handle this.
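For what it's worth, fgetcsv() (like any conforming CSV reader) already treats "city, st zip" as a single field when the surrounding quotes survive intact, so the symptoms in this thread suggest the quotes were lost or mangled somewhere before import. One defensive option, offered as an assumption rather than something established in the thread, is to pad short rows with blanks and set aside over-long ones for manual repair instead of letting the combine step fail:

```python
import csv
import io

fields = ["Customer", "bill to 1", "Email"]

# Invented sample: one row lost its quotes (one field too many),
# one row is missing its trailing fields.
raw = (
    "Acme,Smithville, TX,alice@example.com\n"
    "Bolt Co,Elm St\n"
)

clean, needs_repair = [], []
for row in csv.reader(io.StringIO(raw)):
    if len(row) < len(fields):
        row += [""] * (len(fields) - len(row))  # pad short rows with blanks
    if len(row) > len(fields):
        needs_repair.append(row)                # too many fields: flag for a human
        continue
    clean.append(dict(zip(fields, row)))

print(len(clean), len(needs_repair))  # 1 1
```

The rows in needs_repair can then be fixed by hand (or the export redone with proper quoting), while everything else imports normally.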