paulmo Posted November 16, 2008

I'm trying to load MaxMind GeoCity Lite's (free) large CSV files using a blogger's PHP code (I've x'd out the credentials). I've FTP-uploaded the file to my server and created a link to it from a web page, thinking that would "activate" it somehow (I'm a newbie). In phpMyAdmin/MySQL I'm creating a "csv" table with one field named "blocks", although that's probably not the right thing to do (I'd rather have the code create and populate the table, but I don't know how to do that). Anyhow, the CSV file is freezing/not loading through Import. Using PHP/MySQL on my local machine was a headache previously, so I'm doing all of this via the remote server (FTP/phpMyAdmin/MySQL = GoDaddy). Please help, thanks.

```php
<?php
$databasehost     = "xxx";
$databasename     = "xxx";
$databasetable    = "csv";
$databaseusername = "xxx";
$databasepassword = "xxx";
$fieldseparator   = ",";
$lineseparator    = "\n";
$csvfile          = "GeoLiteCity-Blocks.csv";

/*
Would you like to add an empty field at the beginning of these records?
This is useful if you have a table with the first field being an
auto_increment integer and the csv file does not have such an empty
field before the records. Set 1 for yes and 0 for no.
ATTENTION: don't set to 1 if you are not sure. This can dump data in
the wrong fields if this extra field does not exist in the table.
*/
$addauto = 0;

/*
Would you like to save the mysql queries in a file? If yes set $save to 1.
Permissions on the file should be set to 777. Either upload a sample file
through ftp and change the permissions, or execute at the prompt:
touch output.sql && chmod 777 output.sql
*/
$save = 1;
$outputfile = "output.sql";

if (!file_exists($csvfile)) {
    echo "File not found. Make sure you specified the correct path.\n";
    exit;
}

$file = fopen($csvfile, "r");
if (!$file) {
    echo "Error opening data file.\n";
    exit;
}

$size = filesize($csvfile);
if (!$size) {
    echo "File is empty.\n";
    exit;
}

$csvcontent = fread($file, $size);
fclose($file);

$con = @mysql_connect($databasehost, $databaseusername, $databasepassword) or die(mysql_error());
@mysql_select_db($databasename) or die(mysql_error());

$lines   = 0;
$queries = "";
$linearray = array();

foreach (explode($lineseparator, $csvcontent) as $line) {
    $lines++;
    $line = trim($line, " \t");
    $line = str_replace("\r", "", $line);

    // This line escapes the special character. Remove it if entries
    // are already escaped in the csv file.
    $line = str_replace("'", "\'", $line);

    $linearray = explode($fieldseparator, $line);
    $linemysql = implode("','", $linearray);

    if ($addauto) {
        $query = "insert into $databasetable values('','$linemysql');";
    } else {
        $query = "insert into $databasetable values('$linemysql');";
    }

    $queries .= $query . "\n";
    @mysql_query($query);
}

@mysql_close($con);

if ($save) {
    if (!is_writable($outputfile)) {
        echo "File is not writable, check permissions.\n";
    } else {
        $file2 = fopen($outputfile, "w");
        if (!$file2) {
            echo "Error writing to the output file.\n";
        } else {
            fwrite($file2, $queries);
            fclose($file2);
        }
    }
}

echo "Found a total of $lines records in this csv file.\n";
?>
```
Mchl Posted November 16, 2008

Try the LOAD DATA INFILE syntax.
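A minimal sketch of what that looks like, assuming the one-column table from the first post and that the file has already been placed on the database server (the path and the IGNORE count are assumptions; the GeoLite CSVs typically begin with a copyright line and a header row, so adjust to your file):

```sql
-- Read a server-side CSV straight into the table, skipping the
-- non-data lines at the top of the file.
LOAD DATA INFILE '/path/on/server/GeoLiteCity-Blocks.csv'
INTO TABLE csv
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 2 LINES;
```

This runs entirely inside MySQL, so it is far faster than issuing one INSERT per row from PHP.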
paulmo Posted November 16, 2008 (Author)

Thanks, how do I do that?
Mchl Posted November 16, 2008

The MySQL manual has some info.
paulmo Posted November 16, 2008 Author Share Posted November 16, 2008 ok i found the page: http://dev.mysql.com/doc/refman/5.0/en/load-data.html guessing i do that in the "sql query" box? since csv file is on my computer, how is this loaddata method going to get the file from my computer? there's no mention of an upload method here; it seems to be written as if the file is already in the database. Quote Link to comment https://forums.phpfreaks.com/topic/132936-large-csv-file-to-mysql-using-php/#findComment-691310 Share on other sites More sharing options...
Mchl Posted November 16, 2008 Share Posted November 16, 2008 If you use PhpMyAdmin, you can do this in 'Import' tab, where you can upload file to server. Otherwise you must upload file using ftp. If the file is large, you might have to split it into several smaller, because there probably is a limit set up on your host for both upload file size and script execution time. Quote Link to comment https://forums.phpfreaks.com/topic/132936-large-csv-file-to-mysql-using-php/#findComment-691316 Share on other sites More sharing options...
paulmo Posted November 17, 2008 Author Share Posted November 17, 2008 yes there is a limit; 2,000 kbs in import. the files are 99k and 11k. how to split up csv files. thanks Quote Link to comment https://forums.phpfreaks.com/topic/132936-large-csv-file-to-mysql-using-php/#findComment-691755 Share on other sites More sharing options...
corbin Posted November 17, 2008 Share Posted November 17, 2008 You could just grant an account with your IP or a wild card host access. Then, you could use the mysql shell tool (mysql or mysql.exe) to connect to it. Then you could issue the LOAD DATA command from the shell, and it could use a local file. Quote Link to comment https://forums.phpfreaks.com/topic/132936-large-csv-file-to-mysql-using-php/#findComment-691801 Share on other sites More sharing options...
Mchl Posted November 17, 2008 Share Posted November 17, 2008 Alternatively, if you have access to PhpMyAdmin's config.inc.php file, you can set up an upload directory in $cfg['UploadDir'] Then upload your files to this directory using ftp. You will then be able to select them within your import dialog. Quote Link to comment https://forums.phpfreaks.com/topic/132936-large-csv-file-to-mysql-using-php/#findComment-691895 Share on other sites More sharing options...
paulmo Posted November 17, 2008 Author Share Posted November 17, 2008 thanks mchl and corbin: those sound like good things to learn. i don't know the first thing about granting an ip acc't or wild card access. i'm guessing this method involves queries in the sql window, which i know little about unless i see the code. the second option sounds good to learn too: i'm unsure about the config file or how to find that in phpmyadmin, and the steps to set up a directory. thanks for advice. i'm new at this besides creating one table that gets values (insert into) from a form. Quote Link to comment https://forums.phpfreaks.com/topic/132936-large-csv-file-to-mysql-using-php/#findComment-691974 Share on other sites More sharing options...
corbin Posted November 17, 2008 Share Posted November 17, 2008 You could use PHPMyAdmin to set the permissions. Quote Link to comment https://forums.phpfreaks.com/topic/132936-large-csv-file-to-mysql-using-php/#findComment-692323 Share on other sites More sharing options...