mrbuter Posted August 13, 2008 What is the best way to do this? I made a script which creates the tables etc., and it works fine. Adding data to them works well too (INSERT INTO). However, I have one table which needs a _MASSIVE_ amount of data put into it (some 42,000 rows). On my localhost server I get a timeout error (the limit is set to 30 seconds), so what would you recommend? Splitting it up wouldn't be very practical, since it would probably take about 200 steps. PHP has no notion of time, right? So I can't add any "wait" commands every 100 or so INSERT queries?
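(Aside from raising the time limit, one common way to shrink the runtime itself is to batch many rows into a single multi-row INSERT instead of running 42,000 separate queries. A minimal sketch, assuming an existing mysqli connection in `$db` and a hypothetical table `items` with columns `name` and `value` — adjust to your actual schema:)

```php
<?php
// Sketch: collapse thousands of single-row INSERTs into
// multi-row statements of 500 rows each.
$batch = array();
foreach ($rows as $row) {
    // Escape values before building the SQL string.
    $batch[] = sprintf("('%s', %d)",
        $db->real_escape_string($row['name']),
        (int) $row['value']);

    if (count($batch) >= 500) {
        $db->query("INSERT INTO items (name, value) VALUES " . implode(',', $batch));
        $batch = array();
    }
}
// Flush any remaining rows.
if (count($batch) > 0) {
    $db->query("INSERT INTO items (name, value) VALUES " . implode(',', $batch));
}
```

With 42,000 rows this issues ~84 queries instead of 42,000, which is often enough to get back under a 30-second limit on its own.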
genericnumber1 Posted August 13, 2008 Change your php.ini setting max_execution_time to something larger.
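(For example, in php.ini — 300 here is just an illustrative value; pick whatever your import actually needs:)

```ini
; php.ini — raise the per-request limit from the default 30 seconds
max_execution_time = 300
```

You can also change it per-script at runtime with `ini_set('max_execution_time', 300);`, which avoids loosening the limit for every page on the server.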
trq Posted August 13, 2008 Are you talking about making this a web-based install script? You're definitely going to have a hard time if your script is timing out. You could try increasing the limit with set_time_limit().
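(A minimal sketch of that approach — set_time_limit() takes the new limit in seconds, with 0 meaning no limit at all:)

```php
<?php
// Remove the execution time limit for this one script.
// 0 = no limit; note this call has no effect when PHP
// is running in safe mode.
set_time_limit(0);

// ... run the big import here ...
```

This only affects the current request, so it's a reasonable fit for a one-off install script without touching php.ini.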
monkeypaw201 Posted August 13, 2008 Just embed bigdump