Search the Community
Showing results for tags 'csv'.
-
I am importing a text/CSV file and, instead of importing all of the columns (there are 15 columns on each line), I would like to import only columns 1, 4 and 15. The following code imports the whole file into an array, which works fine, but I don't want the whole file in the array because I also need to do something like array_unique() to filter out any duplicates once I have my three columns imported. My questions are:

1. How do I import the file so it only reads in the columns I want?
2. Once I have all the data in an array, how do I get only the unique rows (so in theory the array size could be halved)?

Any help greatly appreciated.

    # Open the file.
    if (($handle = fopen($files, "r")) !== FALSE) {
        # Set the parent multidimensional array key to 0.
        $nn = 0;
        while (($data = fgetcsv($handle, 1000, "|")) !== FALSE) {
            # Count the total keys in the row.
            $c = count($data);
            # Populate the multidimensional array.
            for ($x = 0; $x < $c; $x++) {
                $csvarray[$nn][$x] = $data[$x];
            }
            $nn++;
        }
        # Close the file.
        fclose($handle);
        //rename('imports/' . $files, 'archived/' . $files);
    }
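A sketch of one way to do both in a single pass, assuming the wanted columns are indexes 0, 3 and 14 and that a "duplicate" means all three kept values match. Sample data goes through an in-memory stream here so the sketch is self-contained; swap in fopen($files, "r") for the real file.

```php
<?php
// Build three-column rows while reading, then collapse duplicates.
$sample = "a|x|x|b|x|x|x|x|x|x|x|x|x|x|c\n"
        . "a|x|x|b|x|x|x|x|x|x|x|x|x|x|c\n"   // duplicate of row 1
        . "d|x|x|e|x|x|x|x|x|x|x|x|x|x|f\n";
$handle = fopen('php://temp', 'r+');
fwrite($handle, $sample);
rewind($handle);

$rows = array();
while (($data = fgetcsv($handle, 1000, "|")) !== FALSE) {
    // Keep only columns 1, 4 and 15 (indexes 0, 3, 14).
    $rows[] = array($data[0], $data[3], $data[14]);
}
fclose($handle);

// SORT_REGULAR makes array_unique() compare the sub-arrays as wholes,
// so identical three-column rows collapse to one.
$csvarray = array_values(array_unique($rows, SORT_REGULAR));
```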
-
I have a dynamic PHP page with a series of select boxes at the top. When the user changes what is in the select boxes, a div at the bottom of the page gets populated with a table via an AJAX call to a PHP script. I am in need of a button on that page that a user can click to save the displayed table as a .csv file. I have been looking all over for a solution that will work in both IE and FF, and am not coming up with anything. Any help would be great. Thanks!
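One common browser-agnostic approach (it works in both IE and FF because it is just a plain HTTP download, no JavaScript tricks): have the button submit the current select-box values to a second script that re-runs the same query the AJAX call uses and streams the rows back as an attachment. rows_for_filters() below is a hypothetical stand-in for your existing query code.

```php
<?php
// Turn an array of table rows into CSV text.
function csv_from_rows(array $rows) {
    $fp = fopen('php://temp', 'r+');
    foreach ($rows as $row) {
        fputcsv($fp, $row);          // handles quoting/escaping per field
    }
    rewind($fp);
    $csv = stream_get_contents($fp);
    fclose($fp);
    return $csv;
}

// In export.php you would then send it as a download:
// header('Content-Type: text/csv');
// header('Content-Disposition: attachment; filename="export.csv"');
// echo csv_from_rows(rows_for_filters($_POST)); // rows_for_filters(): your query
```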
-
I have this import code and it works if I take the header out of the CSV file, but I would like to keep the header in the file so I can keep a template for my co-workers to use, so there are no issues with which columns are which. Here is my code:

    <?php
    /* Format the errors and die */
    function get_last_error() {
        $retErrors = sqlsrv_errors(SQLSRV_ERR_ALL);
        $errorMessage = 'No errors found';
        if ($retErrors != null) {
            $errorMessage = '';
            foreach ($retErrors as $arrError) {
                $errorMessage .= "SQLState: ".$arrError['SQLSTATE']."<br>\n";
                $errorMessage .= "Error Code: ".$arrError['code']."<br>\n";
                $errorMessage .= "Message: ".$arrError['message']."<br>\n";
            }
        }
        die ($errorMessage);
    }

    /* connect */
    function connect() {
        if (!function_exists('sqlsrv_num_rows')) { // Ensure sqlsrv_1.1 is loaded.
            die ('sqlsrv_1.1 is not available');
        }
        /* Log all errors */
        sqlsrv_configure("WarningsReturnAsErrors", TRUE); // BE SURE TO NOT ERROR ON A WARNING
        sqlsrv_configure("LogSubsystems", SQLSRV_LOG_SYSTEM_ALL);
        sqlsrv_configure("LogSeverity", SQLSRV_LOG_SEVERITY_ALL);
        include '/includes/db_connect.php';
        if ($connection === FALSE) {
            get_last_error();
        }
        return $connection;
    }

    function query($connection, $query) {
        $result = sqlsrv_query($connection, $query);
        if ($result === FALSE) {
            get_last_error();
        }
        return $result;
    }

    /* Prepare a reusable query (prepare/execute) */
    function prepare($connection, $query, $params) {
        $result = sqlsrv_prepare($connection, $query, $params);
        if ($result === FALSE) {
            get_last_error();
        }
        return $result;
    }

    /* Do the deed. Once prepared, execute can be called multiple times,
       getting different values from the variable references. */
    function execute($stmt) {
        $result = sqlsrv_execute($stmt);
        if ($result === FALSE) {
            get_last_error();
        }
        return $result;
    }

    function fetch_array($query) {
        $result = sqlsrv_fetch_array($query, SQLSRV_FETCH_ASSOC);
        if ($result === FALSE) {
            get_last_error();
        }
        return $result;
    }

    $connection = connect();

    /* prepare the statement */
    $query = "INSERT Records values ( ? , ? , ? )";
    $param1 = null; // this will hold col1 from the CSV
    $param2 = null; // this will hold col2 from the CSV
    $param3 = null; // this will hold col3 from the CSV
    $params = array( &$param1, &$param2, &$param3 );
    $prep = prepare ( $connection, $query, $params );
    // $result = execute ( $prep );

    //get the csv file
    $file = $_FILES["importrecords"]["tmp_name"];

    /* Here is where you read in and parse your CSV file into an array.
       That may get too large, so you would have to read smaller chunks of rows. */
    $csv_array = file($file);
    $_SESSION['records_row_count'] = 0;
    foreach ($csv_array as $row_num => $row) {
        $_SESSION['records_row_count']++;
        $row = trim ($row);
        $column = explode ( ',' , $row );
        $param1 = $column[0];
        $param2 = $column[1];
        $param3 = $column[2];
        // insert the row
        $result = execute ( $prep );
    }

    /* Free statement and connection resources. */
    sqlsrv_close($connection);
    header( "Location: import.php?successrecords=1" );
    ?>

How would I accomplish leaving the header in the CSV file but skipping it on the import? I tried adding a count, but that didn't work. Thanks for the help in advance!
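Since the existing foreach already exposes $row_num (the 0-based line index from file()), the header can stay in the file and be skipped with one continue. A minimal sketch with a plain array standing in for the file() result:

```php
<?php
// First element plays the role of the header line in the CSV.
$csv_array = array("Name,Qty,Price", "widget,2,9.99", "gadget,1,4.50");
$imported = array();
foreach ($csv_array as $row_num => $row) {
    if ($row_num === 0) {
        continue; // header row: keep it in the file, skip it on import
    }
    $imported[] = explode(',', trim($row));
}
```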
-
I have found some code that I am trying to work with to import some data from a CSV file into a table of my database. It works and imports the data, but always inserts a NULL row for each column. Here is the code:

    <?php
    $self = $_SERVER['PHP_SELF'];
    $request = $_SERVER['REQUEST_METHOD'];
    if (!isset($_GET['success'])) {
        $get_success = "";
    } else {
        $get_success = $_GET['success'];
    }
    if (!empty($_FILES)) {
        /* Format the errors and die */
        function get_last_error() {
            $retErrors = sqlsrv_errors(SQLSRV_ERR_ALL);
            $errorMessage = 'No errors found';
            if ($retErrors != null) {
                $errorMessage = '';
                foreach ($retErrors as $arrError) {
                    $errorMessage .= "SQLState: ".$arrError['SQLSTATE']."<br>\n";
                    $errorMessage .= "Error Code: ".$arrError['code']."<br>\n";
                    $errorMessage .= "Message: ".$arrError['message']."<br>\n";
                }
            }
            die ($errorMessage);
        }

        /* connect */
        function connect() {
            if (!function_exists('sqlsrv_num_rows')) { // Ensure sqlsrv_1.1 is loaded.
                die ('sqlsrv_1.1 is not available');
            }
            /* Log all errors */
            sqlsrv_configure("WarningsReturnAsErrors", TRUE); // BE SURE TO NOT ERROR ON A WARNING
            sqlsrv_configure("LogSubsystems", SQLSRV_LOG_SYSTEM_ALL);
            sqlsrv_configure("LogSeverity", SQLSRV_LOG_SEVERITY_ALL);
            $conn = sqlsrv_connect('cslogs', array (
                'UID' => 'mailreport',
                'PWD' => '123456',
                'Database' => 'Mail',
                'CharacterSet' => 'UTF-8',
                'MultipleActiveResultSets' => true,
                'ConnectionPooling' => true,
                'ReturnDatesAsStrings' => true,
            ));
            if ($conn === FALSE) {
                get_last_error();
            }
            return $conn;
        }

        function query($conn, $query) {
            $result = sqlsrv_query($conn, $query);
            if ($result === FALSE) {
                get_last_error();
            }
            return $result;
        }

        /* Prepare a reusable query (prepare/execute) */
        function prepare($conn, $query, $params) {
            $result = sqlsrv_prepare($conn, $query, $params);
            if ($result === FALSE) {
                get_last_error();
            }
            return $result;
        }

        /* Do the deed. Once prepared, execute can be called multiple times,
           getting different values from the variable references. */
        function execute($stmt) {
            $result = sqlsrv_execute($stmt);
            if ($result === FALSE) {
                get_last_error();
            }
            return $result;
        }

        function fetch_array($query) {
            $result = sqlsrv_fetch_array($query, SQLSRV_FETCH_ASSOC);
            if ($result === FALSE) {
                get_last_error();
            }
            return $result;
        }

        $conn = connect();

        /* prepare the statement */
        $query = "INSERT Records values ( ? , ? , ? )";
        $param1 = null; // this will hold col1 from the CSV
        $param2 = null; // this will hold col2 from the CSV
        $param3 = null; // this will hold col3 from the CSV
        $params = array( $param1, $param2, $param3 );
        $prep = prepare ( $conn, $query, $params );
        $result = execute ( $prep );

        //get the csv file
        $file = $_FILES["csv"]["tmp_name"];

        /* Here is where you read in and parse your CSV file into an array.
           That may get too large, so you would have to read smaller chunks of rows. */
        $csv_array = file($file);
        foreach ($csv_array as $row_num => $row) {
            $row = trim ($row);
            $column = explode ( ',' , $row );
            $param1 = $column[0];
            $param2 = $column[1];
            $param3 = $column[2];
            // insert the row
            $result = execute ( $prep );
        }

        /* Free statement and connection resources. */
        sqlsrv_close($conn);
        header( "Location: test.php?success=1" );
    }
    ?>
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head>
    <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
    <title>Import a CSV File with PHP & MS SQL Server</title>
    </head>
    <body>
    <?php
    if (!empty($get_success)) {
        echo "<b>Your file has been imported.</b><br><br>"; // generic success notice
    }
    ?>
    <form action="" method="post" enctype="multipart/form-data" name="form1" id="form1">
    Choose your file: <br />
    <input name="csv" type="file" id="csv" />
    <input type="submit" name="Submit" value="Submit" />
    </form>
    </body>
    </html>

Now I know it is at this part with $param1, $param2 and $param3 that it is inserting the NULL values for each column:

    /* prepare the statement */
    $query = "INSERT Records values ( ? , ? , ? )";
    $param1 = null; // this will hold col1 from the CSV
    $param2 = null; // this will hold col2 from the CSV
    $param3 = null; // this will hold col3 from the CSV
    $params = array( $param1, $param2, $param3 );
    $prep = prepare ( $conn, $query, $params );
    $result = execute ( $prep );

But if I take those three out, the PHP errors out and doesn't import the data. Is there a way, with what I have, to ignore the first row on import? Or am I going about this all wrong?
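Two things in the snippet explain the NULL rows. First, $params is built by value, so the prepared statement keeps seeing the original nulls no matter what the loop assigns later; sqlsrv_prepare() needs references to pick up new values on each sqlsrv_execute(), i.e. $params = array(&$param1, &$param2, &$param3). Second, the lone $result = execute($prep); before the loop runs the INSERT once before any CSV data has been read, producing one all-NULL row; it should be deleted. The reference behaviour itself is plain PHP and can be seen without a database:

```php
<?php
$param1 = null;
$byValue = array($param1);   // copies the current value (null)
$byRef   = array(&$param1);  // stores a reference to the variable
$param1  = 'col1';           // later assignment, like inside the CSV loop
// $byValue still holds null; $byRef sees the new value.
```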
-
I'm having a heck of a time trying to figure this out. All I want to do is save a SELECT result set (MySQL) to a CSV file. Here's what I'm trying:

    $result = "SELECT product.reference AS sku,
        product.price - specific_price.reduction AS price,
        stock_available.quantity,
        marketplace_product_option.asin1 AS 'product-id',
        'ASIN' AS 'product-id-type',
        product.condition AS 'condition-type'
    FROM product, specific_price, stock_available, marketplace_product_option
    WHERE product.id_product = specific_price.id_product
        AND product.id_product = stock_available.id_product
        AND product.id_product = marketplace_product_option.id_product;";

    $fp = fopen('/home/aonewebs/public_html/amazon-export.csv', 'w+');
    if ($fp && $result) {
        while ($row = mysql_fetch_row($result)) {
            fputcsv($fp, array_values($row));
        }
    }

This creates an empty file, even though there are 3000 lines in that result. I'm sure I'm missing something obvious. Any and all help greatly appreciated. Ron
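The likely culprit in the snippet above: $result holds the SQL string itself and is never sent to the server, so mysql_fetch_row() receives a string instead of a result resource and the while loop writes nothing (the if passes because a non-empty string is truthy). The query has to be executed first, e.g. $result = mysql_query($sql); (matching the post's mysql_* style, though mysqli or PDO is the modern choice). The fputcsv() half of the logic can be checked with a plain array of rows standing in for the result set:

```php
<?php
// Rows shaped as mysql_fetch_row() would return them, one array per record.
$rows = array(array('SKU1', '9.99', '3'), array('SKU2', '4.50', '0'));
$fp = fopen('php://temp', 'r+');
foreach ($rows as $row) {
    fputcsv($fp, array_values($row));
}
rewind($fp);
$csv = stream_get_contents($fp);
fclose($fp);
```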
-
Hi everyone, I'm having trouble uploading a CSV file in PHP. Can't seem to figure out what's wrong; it always leads me to "nicetry" even though I'm sure I'm uploading a CSV file.

upload_page:

    <?php
    echo "<form action='upload_process.php' id='upload_form' method='post' enctype='multipart/form-data'>";
    echo "<table id='class1' align='center'>";
    echo "<tr>";
    echo "<th colspan=2>Upload</th>";
    echo "</tr>";
    echo "<tr>";
    echo "<td colspan=2><input type='file' name='file' id='file' size='50'/></td>";
    echo "</tr>";
    echo "<tr class='alt'>";
    echo "<td colspan=2><input type='submit' name='upload' value='upload' /></td>";
    echo "</tr>";
    echo "</table>";
    echo "</form>";
    ?>

upload_process:

    $mimes = array('application/vnd.ms-excel','text/plain','text/csv','text/tsv');
    if (in_array($_FILES['file']['type'], $mimes)) {
        echo "csv file!";
    } else {
        echo "nicetry";
    }

Any input would be very much appreciated. Thanks!
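$_FILES['file']['type'] is whatever MIME type the browser chooses to report, and browsers disagree wildly about CSVs (Windows with Excel installed often sends application/vnd.ms-excel, others send application/octet-stream), which is why a valid upload can still land on "nicetry". A more robust sketch is to trust the file extension instead and, if needed, confirm the first line actually parses as CSV:

```php
<?php
// Check the file name's extension rather than the browser-reported type.
function looks_like_csv($name) {
    return strtolower(pathinfo($name, PATHINFO_EXTENSION)) === 'csv';
}
// In upload_process: looks_like_csv($_FILES['file']['name'])
```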
-
Hello - First of all, I have low-to-medium knowledge of PHP, but I haven't used it for almost 10 years - and I'm all stuck here! I hope someone can help me; this is driving me crazy. I have a CSV file with a structure like this:

    street, status, project
    Road Street 2, Possible, HyperGalaxy
    Street Road 4, Ready, Galaxy
    Road Street 2, Unknown, Unknown

I've tried importing it into Excel and making a macro, but after 3-5 hours of processing my computer crashes and I have to start all over. My goal is that "Road Street 2, Unknown, Unknown" is deleted because of its status - Possible is "better" than Unknown, in this case. I have around 300,000 rows and I know there will be approx. 10,000 duplicated values (and therefore 5,000 unique). Is there any way to delete duplicates by a "condition"?
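Yes - one pass in PHP with an associative array keyed on the street handles 300,000 rows easily. The ranking below is an assumption (Ready > Possible > Unknown); adjust it, and the key (street alone vs. street plus project), to your actual rules. The sample rows from the post stand in for reading the file with fgetcsv():

```php
<?php
// Higher number = "better" status; this ordering is an assumption.
$rank = array('Ready' => 3, 'Possible' => 2, 'Unknown' => 1);
$rows = array(
    array('Road Street 2', 'Possible', 'HyperGalaxy'),
    array('Street Road 4', 'Ready', 'Galaxy'),
    array('Road Street 2', 'Unknown', 'Unknown'),
);
$best = array();
foreach ($rows as $row) {
    $street = $row[0];
    $status = $row[1];
    // Keep the row only if it's the first for this street,
    // or its status outranks the one already kept.
    if (!isset($best[$street]) || $rank[$status] > $rank[$best[$street][1]]) {
        $best[$street] = $row;
    }
}
$deduped = array_values($best);
```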
-
Is there a PHP class to update a CSV file? I'm not very well versed with fputcsv, but looking at the documentation it doesn't seem to give me what I want. I see that I pass it the file handle and an array of data to write to the CSV. I want to update one column of data at a time, not an entire row, and I'm stuck on where to go from here.

In one CSV I have a product SKU located in column 9, and in the other CSV it's in column 13. These columns are the identifying info I need to know I am looking at the same product in both CSVs. The "in" CSV is the origin, which contains the inventory quantity. I need to transfer that quantity to the "out" CSV. I figure the way to do this is by reading the origin line by line and placing the SKU and the inventory in an array. I then loop through that array, read the "out" CSV and search for a match. Once I find a match I need to update that product's inventory column from the origin CSV. Makes sense? Any help would be helpful. This is the code I have so far, probably far off from what I need.

    <?php
    error_reporting(E_ALL);
    ini_set('display_errors', '1');
    ini_set('auto_detect_line_endings', true);

    if (empty($_GET['in'])) die("Please pass file name to check inventory.");
    if (empty($_GET['out'])) die("Please pass file name to update inventory.");

    $in = $_GET['in'] . ".csv";
    $out = $_GET['out'] . ".csv";

    $h_in = fopen($in, "r");
    if (!$h_in) die("$in can't be read.");
    $h_out = fopen($out, "r+");
    if (!$h_out) die("$out can't be opened to read and write");

    $rsr_inv = array();
    $productskus = array();
    $products = array();
    $flag = true;

    while (false !== ($row = fgetcsv($h_out))) {
        if ($flag) {
            $flag = false; // skip the header row
            continue;
        }
        $productskus[] = $row[13];
        $products[] = $row;
    }

    while (false !== ($row = fgetcsv($h_in))) {
        $key = array_search($row[0], $productskus);
        if ($key) {
            $rsr_inv[] = array($productskus[$key], $row[8]);
        }
    }
    fclose($h_in);

    foreach ($products as $key => $prod) {
        echo $inv[0] . ":" . $products[$key][13] . "<br>";
        //if($inv[0] == $products[$key][13]){}
    }
    fclose($h_out);
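A CSV file can't be patched column-by-column in place, so the usual pattern is: build a sku => quantity map from the "in" file, then rewrite the "out" file row by row, replacing the quantity cell whenever the SKU matches. A sketch with plain arrays standing in for the fgetcsv() loops; the "out" quantity column index (14) is an assumption here, since the post doesn't say which column holds it:

```php
<?php
// "In" rows: SKU at index 0, quantity at index 8 (as in the post's code).
$in_rows = array(array('SKU-1', 'x', 'x', 'x', 'x', 'x', 'x', 'x', '7'));
$qtyBySku = array();
foreach ($in_rows as $row) {
    $qtyBySku[$row[0]] = $row[8];
}

// "Out" rows: SKU at index 13; quantity column index 14 is assumed.
$out_rows = array(array_pad(array(), 15, 'y'));
$out_rows[0][13] = 'SKU-1';
foreach ($out_rows as $i => $row) {
    if (isset($qtyBySku[$row[13]])) {
        $out_rows[$i][14] = $qtyBySku[$row[13]]; // patch the quantity cell
    }
}
// Then fputcsv() each $out_rows row to a new file and rename it over the old one.
```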
-
I am trying to extract CSV attachments from emails that have been piped to a PHP script. The pipe is working, and for the most part the PHP script is working as well - unless someone sends the CSV file from a source such as an Apple iPhone. Then the script begins the data extraction on the wrong line. Can anyone help me fine-tune this to work more cleanly? Or is there simply a better way to extract an attachment from a piped email? Thanks!

    #!/usr/bin/php -q
    <?PHP
    ## sql connection credentials go here.

    $fd = fopen("php://stdin", "r");
    if ($fd) {
        $t = '0';
        while (($fde = fgets($fd, 4096)) !== false) {
            ## Saves each line of the email into an array called $email ##
            $email[$t] = $fde;
            # $message .= $fde;
            $t++;
        }
        fclose($fd);

        for ($h = '0'; $h < $t; $h++) {
            ## Splits each line by comma and checks position [0] for specific wording
            ## from the first line in the csv file. Saves the line position once found. ##
            $output = explode(",", $email[$h]);
            if ($output[0] == "item location") {
                $j = $h + 2; # on line $j because that is where I found the item location
            }
        }

        $l = '0';
        for ($b = $j; $b < $t; $b++) {
            $pos2 = strrpos($email[$b], "_NextPart_");
            if ($pos2 === false) {
                $pos = strrpos($email[$b], "=");
                if ($pos === false) {
                    $themail[$l] = $email[$b];
                } else {
                    # Addresses a problem of = signs showing up. If I see an = sign,
                    # combine the next two rows into one row and remove the equals
                    # sign, AND add 2 to $b instead of 1.
                    $z = $b + 1;
                    $e = array('"', '=');
                    $themail[$l] = "".str_replace($e, '', $email[$b])."".$email[$z]."";
                    $b++;
                }
            } else {
                $b = $t;
            }
            $l++;
        }

        ## starting point is $j
        $i = 0;
        for ($k = 0; $k < $l; $k++) {
            ## Splits each line by comma and saves each item from each line to an array ##
            $output = explode(",", $themail[$k]);
            foreach ($output as $value) {
                $foutput[$i] = $value;
                $i++;
            }
        }

        $r = '0';
        if ($l > 50) {
            ## This is done to limit the number of rows being uploaded at one time to
            ## 50 max. For some reason, the script starts to get out of sync after 55
            ## rows are processed. 50 seems appropriate anyway. ##
            $l = 50;
        }
        for ($e = '0'; $e < $l; $e++) {
            #echo "<br>Set $e <br>";
            $r = $e * 8;
            $s = '0';
            for ($w = $r; $w < $r + 8; $w++) {
                $line[$s] = $foutput[$w];
                ## upload the data here ##
                $s++;
            }
            ## sql SELECT statement looks for existing matching data; if it finds it,
            ## it UPDATEs the line, otherwise it INSERTs the line.
    ?>
-
Hi, I have made the code below that outputs the contents of a CSV file into a table. The code works fine; however, my server host's CGI error log says:

    PHP Warning: Invalid argument supplied for foreach() in "my_php_file_name" on line 83

    do {
        $array = fgetcsv($filehandle, 65, ",");
        echo "<tr>";
        foreach ($array as $tableformat) {
            echo "<td>".$tableformat."</td>";
        }
        echo "</tr>";
        $nextrow++;
    } while ($nextrow <= $lines);

$lines refers to the number of lines in the CSV file and $nextrow starts at the integer value 0. Although it is working, could someone please explain why this message appears, so that I can correct it and learn from this experience? Thank you.
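The warning comes from the final iteration: fgetcsv() returns FALSE at end-of-file, and the do/while calls foreach on that FALSE before the counter check runs. Testing the return value in the loop condition fixes it and removes the need for the line count entirely. A self-contained sketch with an in-memory stream in place of $filehandle:

```php
<?php
$filehandle = fopen('php://temp', 'r+');
fwrite($filehandle, "a,b\nc,d\n");
rewind($filehandle);

$html = '';
// Stop as soon as fgetcsv() signals end-of-file with FALSE.
while (($array = fgetcsv($filehandle, 65, ",")) !== FALSE) {
    $html .= "<tr>";
    foreach ($array as $tableformat) {
        $html .= "<td>" . $tableformat . "</td>";
    }
    $html .= "</tr>";
}
fclose($filehandle);
```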
-
I'm trying to filter out the rows of an uploaded CSV that have the word "lokaal" (classroom) in the third column, but this doesn't work. What does work:

- When I search for just one letter (for example the o), $findneedle is true.
- Putting $haystack = "lokaal 5"; in the code also works (but that is only usable for test purposes).
- echo $haystack results in a correct value, so the CSV is read correctly.

    <?
    $handle = fopen($_FILES['file']['tmp_name'], "r");
    $data = fgetcsv($handle, 1000, ";"); //Remove if CSV file does not have column headings
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
        $haystack = $data[2];
        $needle = 'lokaal';
        $findneedle = strpos($needle, $haystack);
        if ($findneedle !== false) {
            //rest of the code
        }
    }
    ?>

The CSV looks like this:

    2013;Class 1;Zelfstudie;31-12-2013 0:00:00;1-1-1900 9:00:00;1-1-1900 12:00:00;
    2013;Class 2;lokaal 5;27-12-2013 0:00:00;1-1-1900 15:00:00;1-1-1900 17:00:00;

Am I overlooking something? I hope someone can help me.
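The arguments to strpos() are swapped in the snippet: it takes the haystack first and the needle second, so strpos($needle, $haystack) searches "lokaal" for "lokaal 5" and never matches. A single letter "worked" only because short needles happen to occur inside the string "lokaal". With the correct order:

```php
<?php
$haystack = 'lokaal 5';   // e.g. $data[2] from the CSV
$needle = 'lokaal';
$findneedle = strpos($haystack, $needle); // haystack first, needle second
$isClassroom = ($findneedle !== false);   // strict check: match at offset 0 is valid
```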
-
Hi all, I want to create a webpage that allows a user to import a CSV file into a database. Can anyone help me get started with this, or at least refer me to a really good resource that can help me get this working?
-
My brain is currently at its threshold for processing this sort of thing on its own, and I've been circling this challenge for hours with no success. Is there a way to take a JSON file (http://us.battle.net/auction-data/6de1a23eb7316db413a62771c8f16b93/auctions.json) and turn it into something I can use with PHP to reorganize and display the data in a way that makes sense? I.e.: Realm - Player Name - Item ID - Quantity - Bid - Buyout - Time Left. Preferably in a CSV or MySQL DB so I can simply foreach() the entire thing. I apologise if I am not making a whole lot of sense; I am just at my wit's end.
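json_decode($json, true) turns the file into nested PHP arrays that can be flattened into exactly those rows, ready for fputcsv() or MySQL inserts. The field names below (owner, item, quantity, bid, buyout, timeLeft, grouped under faction keys) follow the old battle.net auction dump layout; verify them against your actual file. A small hand-made structure stands in for file_get_contents($url):

```php
<?php
$json = '{"realm":{"name":"TestRealm"},"alliance":{"auctions":[' .
        '{"owner":"PlayerA","item":123,"quantity":2,"bid":100,' .
        '"buyout":150,"timeLeft":"LONG"}]}}';
$data = json_decode($json, true); // true => associative arrays
$rows = array();
foreach ($data['alliance']['auctions'] as $a) {
    $rows[] = array($data['realm']['name'], $a['owner'], $a['item'],
                    $a['quantity'], $a['bid'], $a['buyout'], $a['timeLeft']);
}
// Each $rows entry is now Realm - Player - Item ID - Qty - Bid - Buyout - Time Left,
// ready for fputcsv() or a parameterized INSERT.
```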
-
Hi, hopefully someone can help; I'm having a slight panic on a project I'm working on. I've got 90% of this done but have a few concerns regarding some special characters. Here's the situation: I have a database table I need to export to a CSV (so the client can then import it into their own system). I've got this working a dream - it downloads the file and all that perfectly. Now, a few fields in this table are going to contain special characters (double quotes, single quotes, etc.) - just the kind of thing that will bust a CSV on import (or when opening in Excel or similar). Basically, what I could do with knowing is: should I just clean the strings of the bad characters, possibly losing some of the meaning of the data (for instance, one field is a message, which is likely to contain all sorts of characters), or is there something else I can do that will retain the data? I'm thinking the brute-force clean could be fine, but I would like to know my options. I could also do with a hand cleaning the string; I've had a look at preg_replace, example here:

    $fnameClean = preg_replace("/'/", "", $fnameDirty);

Really hope this is a simple one, guys. Thanks in advance.
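Before stripping anything: if the export is written with fputcsv(), quotes and commas don't need to be removed at all. It wraps any field containing the delimiter, a double quote or a newline in double quotes and doubles embedded quotes, which Excel and most importers understand, so the message text survives intact. A quick check of what it emits:

```php
<?php
$fp = fopen('php://temp', 'r+');
fputcsv($fp, array('he said "hi"', "it's fine", 'a,b'));
rewind($fp);
$line = stream_get_contents($fp);
fclose($fp);
// $line: "he said ""hi""",it's fine,"a,b"  (plus a trailing newline)
```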
-
Hi all, I am using PHP code to write to my CSV file. It writes successfully, but every time I run my code the older data is lost and the CSV file is left with only the data from the current run. I used this:

    <?
    $list = array (
        array('aaa', 'bbb', 'ccc', 'dddd'),
        array('123', '456', '789'),
        array('"aaa"', '"bbb"')
    );

    $fp = fopen('test.txt', 'w');
    foreach ($list as $fields) {
        fputcsv($fp, $fields);
    }
    fclose($fp);
    ?>

Any suggestions? Many thanks.
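fopen(..., 'w') truncates the file on every run, which is why the older data disappears; opening in append mode ('a') keeps the existing rows and writes new ones at the end. A self-contained sketch using a temp file in place of test.txt:

```php
<?php
$file = tempnam(sys_get_temp_dir(), 'csv');
// Two separate "runs", each appending one row.
foreach (array(array('run1'), array('run2')) as $fields) {
    $fp = fopen($file, 'a'); // 'a' appends instead of truncating
    fputcsv($fp, $fields);
    fclose($fp);
}
$lines = file($file, FILE_IGNORE_NEW_LINES);
unlink($file);
```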
-
Hi... A scenario: a client has their own database. I don't know what kind of database just yet; I'm waiting to hear back. But they want to put a CSV file, or some kind of MySQL dump file, outside their firewall every Friday for us to use to populate a MySQL database on a different server where their website is hosted - where we are building their website. Can this be done feasibly? How would we do this?
-
Hi everyone, I'm struggling with something that should be incredibly simple. I need to search through a CSV file for a certain string and then return the rows that contain that string - and only certain columns of those rows. I have the simple part, displaying only the relevant columns, but I can't find any info on how to search and return only the relevant rows. Your assistance would be highly appreciated.

    <?php
    if (($handle = fopen("filename.csv", "r")) !== FALSE) {
        while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
            echo "$data[1] $data[2] $data[3] $data[18] $data[34] <br />\n";
        }
        fclose($handle);
    }
    ?>
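A sketch of the row filter: test each parsed row for the search string and only emit the wanted columns when it matches. Here the whole row is joined and searched case-insensitively with stripos(); use a single $data[$n] as the haystack instead if only one column should be searched. An in-memory stream stands in for filename.csv:

```php
<?php
$handle = fopen('php://temp', 'r+');
fwrite($handle, "1,foo,x\n2,bar,y\n3,foobar,z\n");
rewind($handle);

$search = 'foo';
$matches = array();
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    if (stripos(implode(',', $data), $search) !== false) {
        $matches[] = $data; // echo "$data[1] $data[2] ..." here instead
    }
}
fclose($handle);
```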
-
Hi guys, I am using this code:

    function importCsv($file, $table) {
        $fieldseparator = ",";
        $lineseparator = "\n";

        $fileOpen = fopen($file, "r");
        $fileSize = filesize($file);
        $fileContent = fread($fileOpen, $fileSize);
        fclose($fileOpen);

        $fileLines = 0;
        $fileQuiries = "";
        $fileLineArray = array();

        foreach (explode($lineseparator, $fileContent) as $fileLine) {
            $fileLines++;
            $fileLine = trim($fileLine, " \t");
            $fileLine = str_replace("\r", "", $fileLine);
            $fileLine = str_replace("'", "\'", $fileLine);
            $fileLineArray = explode($fieldseparator, $fileLine);
            $fileLineMySql = implode("','", $fileLineArray);
            $query = "INSERT INTO `$table` VALUES(NULL,'$fileLineMySql')";
            $mysqli->query($query);
        }
    }

My issue is that a field contains a comma (",") and this stuffs up the INSERT. How can I escape this comma so the code does not think it is a different MySQL table column? Thanks in advance.
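You can't escape a data comma for explode() - it has no notion of quoting - but CSV's own answer is to quote the field in the file ("Smith, John") and parse with str_getcsv()/fgetcsv(), which do understand quoting. (The INSERT itself should then use a parameterized query, e.g. mysqli bind_param, rather than string interpolation.) The parsing difference:

```php
<?php
$line = '1,"Smith, John",42';
$broken  = explode(',', $line);  // 4 pieces: splits inside the quotes
$columns = str_getcsv($line);    // 3 fields: respects the quoting
```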