tony s Posted March 7, 2006 The case is: I have a large CSV file (about 41,500 rows) and I need to transfer it to a database. The problem is that only about 37,600 rows make it into the db. I'm thinking the main reason is the time limit for executing a PHP file [i](30 sec. or so)[/i]. Is it possible to use some PHP function to extend the time limit?
shocker-z Posted March 7, 2006 I found the following reference, which may help: http://uk.php.net/set_time_limit
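A minimal sketch of how that might look in an import script. The file name (data.csv), table name, and query here are hypothetical placeholders; adjust them to your setup:

```php
<?php
// Lift the 30-second execution cap for this script only.
// An argument of 0 means "no limit".  Note: set_time_limit() has no
// effect when PHP runs in safe mode, and some shared hosts disable it.
set_time_limit(0);

// Hypothetical CSV path; the @ suppresses the warning if it is missing.
$handle = @fopen('data.csv', 'r');

$count = 0;
if ($handle !== false) {
    while (($row = fgetcsv($handle)) !== false) {
        // Your INSERT goes here, e.g.:
        // mysql_query("INSERT INTO my_table VALUES (...)");
        $count++;
    }
    fclose($handle);
}
```

Raising the limit keeps the single-pass approach simple, but the whole import still runs inside one request, so a browser timeout or a host-enforced cap can still cut it short.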
txmedic03 Posted March 7, 2006 Yes, it is possible to extend the max execution time of a script, but alternatively, how about one script that processes your data in blocks? It's just a thought, but it will keep your script from running for a long period of time. For instance, create the script to copy the first $x rows, then: header("Location: ".$_SERVER['PHP_SELF']."?num=".$num); // $num is the beginning of your next block of $x rows. Then you are back on the same page, except now you have $_GET['num'] to tell you where to start, and you copy rows $num through $num + $x to your database. Make it continue to call header() until you run out of entries. Just make sure there is no output to the browser first, or you can't send headers. This is just one possibility that came to mind, since your server may not allow you to extend your execution timeout. Hosts often restrict it specifically because of the obvious effects it has on the server; they may not like it.
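The block-by-block approach above could be sketched like this. The function name import_block, the file name data.csv, and the chunk size of 500 are all hypothetical, and the INSERT is left as a comment for you to fill in:

```php
<?php
// Copy one block of up to $x rows starting at row $num.
// Returns the offset of the next block, or false when the
// file is exhausted (or cannot be opened).
function import_block($file, $num, $x) {
    $handle = @fopen($file, 'r');
    if ($handle === false) {
        return false;
    }
    // Skip the rows already imported on previous passes.
    for ($i = 0; $i < $num; $i++) {
        if (fgets($handle) === false) {
            fclose($handle);
            return false;
        }
    }
    // Import this block.
    $count = 0;
    while ($count < $x && ($row = fgetcsv($handle)) !== false) {
        // mysql_query("INSERT INTO my_table VALUES (...)");  // your insert here
        $count++;
    }
    fclose($handle);
    // Fewer than $x rows read means we hit the end of the file.
    return ($count < $x) ? false : $num + $x;
}

// Per-request driver (in a web context):
// $num  = isset($_GET['num']) ? (int)$_GET['num'] : 0;
// $next = import_block('data.csv', $num, 500);
// if ($next !== false) {
//     header("Location: " . $_SERVER['PHP_SELF'] . "?num=" . $next);
//     exit;   // nothing may be echoed before header() is called
// }
// echo "Import complete";
```

One caveat with this scheme: if any rows contain embedded newlines inside quoted fields, the fgets()-based skip will miscount, so it is safest with simple one-line-per-record CSVs.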