
Hi!

 

I have a problem I need help with.

My script downloads an SQL file from a popular game; the file is an exported table.

Now I need to import this table into my own database to make it easy to read.

I tried reading the SQL file line by line and running each line as a query, but it took quite a long time and I got an error saying I can only run 30 queries per second.

Is there any easy way to import the table into my database?

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/

Usually an exported table is just two queries (a CREATE TABLE IF NOT EXISTS and a huge INSERT INTO).

Read those into a string, then split them, e.g. into $createTable and $tableData, then execute them.

Queries are separated by a semicolon; that should be enough to split them on.

 

Note that a query often spans more than one line!
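
For example, a rough, untested sketch of that approach in PHP (the file path, database name and credentials are placeholders, and it assumes no semicolons appear inside the quoted data):

$sql = file_get_contents('x_world.sql'); // placeholder path to the downloaded dump

// Split the dump on semicolons into individual queries and drop empty pieces
$queries = array_filter(array_map('trim', explode(';', $sql)));

$link = mysql_connect('localhost', 'user', 'pass'); // placeholder credentials
mysql_select_db('mydb', $link);                     // placeholder database name

foreach ($queries as $query) {
    mysql_query($query, $link) or die(mysql_error());
}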

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364207

This is not just two queries; it's about ~5,000 rows of "INSERT INTO":

 

 

INSERT INTO `x_world` VALUES (477901,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',212);

INSERT INTO `x_world` VALUES (477946,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',220);

INSERT INTO `x_world` VALUES (477974,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',288);

INSERT INTO `x_world` VALUES (477971,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',270);

...

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364231

Looking at SQLyog, I think I need to give more info.

The script that downloads the SQL file runs every day, so I need the "SQL table uploader" to run automatically every day as well.

SQLyog looks more like an admin tool where you do everything yourself.

Please explain more about them.

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364236

Sorry, I did not realise you wanted to do it every day; I thought it was a one-off.

I do the same thing every day with a CSV file, and it takes about 15 minutes to do a 30,000-record import and update.

I guess it comes down to how good your server is: a small server with few resources will take longer than a higher-powered, high-resource machine.

 

Maybe it's not the code but the server? You could try putting in a pause when it errors out, or change the php.ini settings to allow persistent connections, or get the limit of 30 queries per second raised.
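
For instance, a rough sketch of the "pause and retry" idea (untested; the error number 1226, which MySQL uses when a per-user resource limit is exceeded, and the one-second sleep are assumptions to adjust for your setup):

// Retry a query with a short pause when the query-per-second limit is hit
function query_with_retry($sql, $maxTries = 5) {
    for ($try = 0; $try < $maxTries; $try++) {
        $result = mysql_query($sql);
        if ($result !== false) {
            return $result;
        }
        if (mysql_errno() != 1226) {    // a different error: give up
            die(mysql_error());
        }
        sleep(1);                       // pause, then try again
    }
    die("Query still failing after $maxTries attempts");
}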

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364244

Surely the fastest way would be to execute the sql file using mysql from the command line.

 

If he's limited to 30 queries a second (odd), I'd assume this is a MySQL limit and not a PHP/driver one. Not sure if the command line would bypass this.

Yep, there is a MySQL limit  :(

If he/she has the mysql client binaries, this job will be done in a few seconds.

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364328

Surely the fastest way would be to execute the sql file using mysql from the command line.

 

What's the easiest way to do it?

You can see what the file looks like by scrolling up: 5,000 insert statements.

 

And no, I have no binary files or anything else, just a .sql file.

 

 

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364356

Surely the fastest way would be to execute the sql file using mysql from the command line.

 

What's the easiest way to do it?

You can see what the file looks like by scrolling up: 5,000 insert statements.

 

And no, I have no binary files or anything else, just a .sql file.

 

From the MySQL manual:

 

However, it is also possible to put your SQL statements in a file and then tell mysql to read its input from that file. To do so, create a text file text_file that contains the statements you wish to execute. Then invoke mysql as shown here:

 

shell> mysql db_name < text_file

 

If you place a USE db_name statement as the first statement in the file, it is unnecessary to specify the database name on the command line:

 

shell> mysql < text_file
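
Since this has to run automatically every day, one option is to call that command from the PHP script itself right after the download finishes. A rough, untested sketch (the mysql binary must be available on the server, and the credentials, database name and file path are placeholders):

// Hand the downloaded dump to the mysql command-line client
$cmd = sprintf(
    'mysql -u%s -p%s %s < %s 2>&1',
    escapeshellarg('dbuser'),              // placeholder user
    escapeshellarg('dbpass'),              // placeholder password
    escapeshellarg('mydb'),                // placeholder database
    escapeshellarg('/path/to/x_world.sql') // placeholder path to the dump
);
exec($cmd, $output, $exitCode);
if ($exitCode !== 0) {
    die('Import failed: ' . implode("\n", $output));
}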

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364363

What binary files? The actual table files stored on the hard drive? Generally, you're better off not touching those.

A few months ago I imported the Bible into a database this way.

What I'm talking about:

 

 

 

Generally, you're better off not touching those... especially in a production server.

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364365

Surely the fastest way would be to execute the sql file using mysql from the command line.

 

What's the easiest way to do it?

You can see what the file looks like by scrolling up: 5,000 insert statements.

 

And no, I have no binary files or anything else, just a .sql file.

 

@Worqy, do you have any access to the database of this game? How did you download the files?

 

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364368

This is not just two queries; it's about ~5,000 rows of "INSERT INTO":

 

 

INSERT INTO `x_world` VALUES (477901,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',212);

INSERT INTO `x_world` VALUES (477946,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',220);

INSERT INTO `x_world` VALUES (477974,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',288);

INSERT INTO `x_world` VALUES (477971,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',270);

...

 

If that is what the entire file looks like, then you can accomplish this with just a couple of lines of code to convert all the individual inserts into a single insert for all the records.

 

$values = file_get_contents($file);
$values = str_replace('INSERT INTO `x_world` VALUES', '', $values); //Remove individual inserts
$values = str_replace(';', ', ', $values); //Convert semi-colons to commas
$values = substr($values, 0 , -1); //Remove last comma
$query = "INSERT INTO `x_world` VALUES " . $values; //Create single insert statement

 

Not tested, but that should work by converting the queries into one insert:

INSERT INTO `x_world` VALUES (477901,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',212),
(477946,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',220),
(477974,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',288),
(477971,104,-196,3,50503,'11',1382,'XXXX',30,'Idle',270)

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364370

Thanks for your help. But there is one problem with your script: the query ends with a ",".

Like this:

), (607138,380,-357,3,48950,'XXXXX',10291,'XXXXXX',306,'idefix',492), (607924,365,-358,3,50822,'XXXX',6283,'XXXX',306,'idle',167),

 

The script must somehow know when it has reached the last line, and then remove the ",".

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364377

Gotta watch out for huge inserts though; depending on the size of each row, inserting over 5k rows at once can leave you with a 'server has gone away' error :(

 

You can take that code and explode by ';' to get an array of the values. Then it's quite easy to make groups of ~1000 and perform a query for each chunk.

 

To get rid of the trailing ',' from his example:

 

$values = substr(rtrim($values), 0, -1);
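
Slotting that into the earlier snippet, the whole thing would become something like this (still untested):

$values = file_get_contents($file);
$values = str_replace('INSERT INTO `x_world` VALUES', '', $values); //Remove individual inserts
$values = str_replace(';', ',', $values);                           //Convert semi-colons to commas
$values = substr(rtrim($values), 0, -1);                            //Trim trailing whitespace, then drop the last comma
$query  = "INSERT INTO `x_world` VALUES " . $values;                //Create single insert statement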

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364378

I provided comments on what each line was supposed to do. Apparently there were other characters after the last comma which prevented it from being removed; you could have figured that out yourself. But no matter, here's an example using explode() as xyph suggested AND breaking it out into chunks of 1,000.

 

Again, not tested; I leave it to you to figure out any minor problems.

$insertCount = 1000;

$stringData = file_get_contents($file);
$stringData = str_replace('INSERT INTO `x_world` VALUES', '', $stringData); //Remove individual inserts

$valuesAry = explode(';', $stringData); //Convert into array using semi-colons
$valuesAry = array_filter(array_map('trim', $valuesAry)); //Drop empty/whitespace entries left after the final semicolon

foreach(array_chunk($valuesAry, $insertCount) as $insertAry)
{
    //Create insert statement for up to $insertCount records
    $query = "INSERT INTO `x_world` VALUES " . implode(', ', $insertAry);
    $result = mysql_query($query) or die(mysql_error());
}

Link to comment
https://forums.phpfreaks.com/topic/266210-import-table/#findComment-1364387
