CSV to MySQL import appears slow


KirkLogan


I am importing a CSV file into a MySQL database (about 100,000 rows of data) and it seems oddly slow.

I am running on a local WAMP server, and I have tried optimizing my memory limits and so on, but it does not appear to make much of a difference (and yes, I restarted the server).

I am only seeing about 1,500 rows/minute, give or take, which seems very slow to me.

I would truly appreciate it if someone could take a quick look at my code or throw me a recommendation for speeding this process up.

 

<?php include("_include/functions.html");?>
<?php include("_include/constants.html");?>
<?php include("_include/connection.html");?>
<?php
	if(isset($_FILES['file'])) {
		
		$file = $_FILES['file']['tmp_name'];
		$table = "job";
			
		$handle = fopen($file, "r");
		
		// Read the first record only (the column headers)
		$data = fgetcsv($handle, 1000, ",");
		
		$query = "DROP TABLE IF EXISTS {$table}";
		$result = mysql_query($query, $connection);
		
		$query = "CREATE TABLE {$table} (";
		
		for($i = 0; $i < count($data); $i++) {
			// Replace whitespace with underscores, then strip
			// characters that are not valid in a column name
			$heading = preg_replace('/\s+/', '_', $data[$i]);
			$heading = str_replace(array(',', '.', '/', '\\', '-'), '', $heading);
			$query .= $heading . ' VARCHAR(100), ';
			$table_headings[$i] = $heading;
		}
		
		// Trim the trailing ", " separator
		$query = substr($query, 0, -2);
		$query .= ')';
		
		// Execute query (note: break is only valid inside a loop, so
		// stop with die() on failure instead)
		if (mysql_query($query, $connection)) {
			$message = "Table {$table} created successfully";
		} else {
			die("Error creating table: " . mysql_error());
		}
		
		$query = "INSERT INTO {$table} (";
		for($i=0;$i<count($data); $i++) {
			$query .= $table_headings[$i].", ";
		}
		// Trim the trailing ", " separator
		$query = substr($query, 0, -2);
		$query .= ") VALUES (";
		
		while(($fileop = fgetcsv($handle, 1000, ",")) !== false) {
			
			$appended = "";
			
			// Build the VALUES part of the query, escaping each field
			for($i = 0; $i < count($data); $i++) {
				$appended .= "'" . mysql_real_escape_string($fileop[$i], $connection) . "', ";
			}
			
			// Trim the trailing ", " separator
			$appended = substr($appended, 0, -2);
			$appended .= ")";
			
			$result = mysql_query($query . $appended, $connection);
		}
		
		fclose($handle);

	}
?>
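Most of the slowness here is likely the 100,000 separate round trips to the server, one `mysql_query` call per CSV row. A common fix is to batch many rows into a single multi-row INSERT. Below is a minimal sketch, not a drop-in replacement: it assumes the `$handle`, `$connection` and `$table_headings` variables from the script above, and the `$batch_size` of 500 is just an illustrative starting point to tune.

```php
<?php
// Sketch: one multi-row INSERT per batch instead of one query per row.
// Assumes $handle, $connection and $table_headings already exist.
$insert_prefix = "INSERT INTO job (" . implode(", ", $table_headings) . ") VALUES ";
$batch_size = 500; // illustrative; tune against max_allowed_packet
$rows = array();

while (($fileop = fgetcsv($handle, 1000, ",")) !== false) {
	// Escape each field and wrap it in quotes
	$values = array();
	foreach ($fileop as $field) {
		$values[] = "'" . mysql_real_escape_string($field, $connection) . "'";
	}
	$rows[] = "(" . implode(", ", $values) . ")";

	// One round trip for the whole batch
	if (count($rows) >= $batch_size) {
		mysql_query($insert_prefix . implode(", ", $rows), $connection);
		$rows = array();
	}
}

// Flush any remaining rows
if (count($rows) > 0) {
	mysql_query($insert_prefix . implode(", ", $rows), $connection);
}
?>
```

If you can use it, MySQL's LOAD DATA INFILE is usually faster still, since the server parses the file itself.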

 

 

https://forums.phpfreaks.com/topic/275283-csv-to-mysql-import-appears-slow/

Sorry to unsolve this; I just have one further question.


Some of my .CSV fields contain commas (name fields: Last, First MI).

 

Is there a good way to keep the field from terminating when it hits that comma?

 

Edit: I read the material on ENCLOSED BY. However, the .CSV I'm working with does not enclose every field in "" marks; only the string fields are enclosed in "" marks.

 

Edit: Sorry, I was quick to post. Solved using the OPTIONALLY ENCLOSED BY parameter. Thanks again.
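For anyone landing here later: OPTIONALLY ENCLOSED BY is part of the FIELDS clause of LOAD DATA INFILE, and it handles exactly this case, where strings are quoted but other fields are not. A statement using it would look something like the following (the file path and the `IGNORE 1 LINES` header skip are illustrative assumptions, not taken from the thread):

```sql
LOAD DATA LOCAL INFILE '/path/to/jobs.csv'
INTO TABLE job
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
```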
