
Insert data into 100 million rows without 500 error?


casualventures


So when I execute this, it runs for about 130k rows and then gives a 500 error. Any ideas how to avoid this and get it to complete?

 

$i = 10000000;
while ($i <= 99999999) {
    $public = $i;
    $value  = rand(10000000, 99999999);
    mysql_query("INSERT INTO yummy_table (public, value) VALUES ('$public', '$value')") or die(mysql_error());
    $i++;
}
echo "done";


Is there a particular reason you're executing this script via a web server instead of just scripting it behind the scenes?

 

Because it needs to happen when a web app is getting installed via the browser. Is there a different way you would suggest I go about it?

Why wouldn't you just generate each row as you need it? When a public value is first requested, insert a row for it with a generated random value. The result will be the same.
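A minimal sketch of that lazy approach, assuming a mysqli connection in $db and a unique index on yummy_table.public (both assumptions; the thread's code uses the old mysql_* API, but the idea is the same with any client):

```php
<?php
// Hypothetical helper: instead of pre-filling ~90 million rows at
// install time, create each (public, value) pair the first time that
// public key is actually requested.

// Generate a random 8-digit value, same range as the original loop.
function makeValue(): int
{
    return rand(10000000, 99999999);
}

// Assumes $db is a connected mysqli instance.
function getValue(mysqli $db, int $public): int
{
    $res = $db->query("SELECT value FROM yummy_table WHERE public = $public");
    if ($row = $res->fetch_assoc()) {
        return (int)$row['value'];      // already generated earlier
    }
    $value = makeValue();               // generate on first use
    $db->query("INSERT INTO yummy_table (public, value) VALUES ($public, $value)")
        or die($db->error);
    return $value;
}
```

With this, the install step only has to create the (empty) table, and the 500 error goes away because no bulk insert ever runs.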

 

To get your existing code to operate several times faster would require that you use a multi-value insert query -

 

INSERT INTO yummy_table (public, value) VALUES (x,y),(),(),(),...

 

You would build a query string with as many (x,y) value groups as you can (there is a limit, MySQL's max_allowed_packet setting, on the size of each query string.) By putting 10k - 100k rows in each query, you will reduce the number of queries by the same factor.
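A sketch of that batching, again assuming a mysqli connection in $db; the batch size of 10000 is an arbitrary choice that keeps each statement well under the default max_allowed_packet for two small integer columns:

```php
<?php
// Build one multi-value INSERT for a batch of (public, value) pairs.
// $rows is an array of [public, value] integer pairs, so no escaping
// is needed beyond the (int) casts.
function buildBatchInsert(array $rows): string
{
    $tuples = array();
    foreach ($rows as $r) {
        $tuples[] = '(' . (int)$r[0] . ',' . (int)$r[1] . ')';
    }
    return 'INSERT INTO yummy_table (public, value) VALUES ' . implode(',', $tuples);
}

$batch = array();
for ($i = 10000000; $i <= 99999999; $i++) {
    $batch[] = array($i, rand(10000000, 99999999));
    if (count($batch) == 10000) {       // flush a full batch: 1 query per 10k rows
        $db->query(buildBatchInsert($batch)) or die($db->error);
        $batch = array();
    }
}
if (count($batch) > 0) {                // flush the final partial batch
    $db->query(buildBatchInsert($batch)) or die($db->error);
}
echo "done";
```

That cuts the ~90 million single-row queries down to about 9,000 statements, which is where the speedup comes from.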

Archived

This topic is now archived and is closed to further replies.
