time out problem PLEASE HELP!
Posted 21 July 2006 - 02:03 PM
I've written a program that reads from a file and then inserts that data into a SQL database. I use a two-dimensional array to hold all my data. The script works perfectly with small test sections of the file (about 9,000 lines), but when I execute it with the entire file (almost 200,000 lines), I get a time-out error. I increased the size and time limits, so now instead of a time-out I get an internal server error after about 15 minutes. Any ideas? Is the 2-D array slowing it down, perhaps?
Posted 21 July 2006 - 02:17 PM
Have you tried breaking it into smaller files and passing everything around? Or using functions? Or any number of things to make the script a LOT smaller?
Posted 24 July 2006 - 01:11 PM
I'm trying to reprogram it so it doesn't use 2-D arrays. I've made some of the algorithms a lot simpler too, and I will write some functions in; those are just like procedures, right?
Thanks a lot!!!
Posted 26 July 2006 - 03:58 PM
Thanks for all the help!
Posted 26 July 2006 - 04:52 PM
The best way is to loop through the uploaded data, building one large SQL string inside the loop, and then, when the loop terminates, submit the whole thing to MySQL at once. That will run way faster!
I had a similar problem a while ago. I had a file containing all the zip codes in Holland (over 100,000 entries) and had to enter them into a table. Looping through the file and inserting line by line took several hours.
I didn't know you could do multiple inserts in one query.
INSERT INTO beautiful (name, age) VALUES ('Helen', 24), ('Katrina', 21), ('Samia', 22), ('Hui Ling', 25), ('Yumie', 29);
I'm not that sure it'll be that much faster though.
Another lesson learned.
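As an aside, the speedup from batching can be seen even without MySQL. Here is a minimal sketch in Python using the standard-library sqlite3 module (the thread itself is about PHP and MySQL, so this is purely illustrative); it builds one multi-row INSERT, just like the `beautiful` example above, instead of issuing one query per row:

```python
import sqlite3

# In-memory database purely for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE beautiful (name TEXT, age INTEGER)")

rows = [("Helen", 24), ("Katrina", 21), ("Samia", 22),
        ("Hui Ling", 25), ("Yumie", 29)]

# Build one multi-row INSERT: a single round trip to the database
# instead of one statement per row.
placeholders = ", ".join(["(?, ?)"] * len(rows))
flat = [value for row in rows for value in row]
conn.execute(f"INSERT INTO beautiful (name, age) VALUES {placeholders}", flat)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM beautiful").fetchone()[0]
print(count)  # 5
```

The same idea applies in PHP: concatenate the `(…), (…), …` value tuples in the loop, then call the query function once when the loop finishes. Parameter placeholders are used here rather than string concatenation to avoid quoting problems.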
Posted 26 July 2006 - 04:57 PM
PLEASE READ THE POSTED SOLUTIONS CAREFULLY * 1000000...
Posted 26 July 2006 - 06:40 PM
You could also write the data to a file in a suitable format, then have MySQL read it with LOAD DATA INFILE. It's a very fast process; I've been using this method for a while, and it can read 200k rows in two minutes.
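A sketch of that approach, again in Python for illustration: write the rows as a tab-separated file, then hand MySQL a LOAD DATA statement pointing at it. The table name `beautiful` comes from the earlier example; whether you need the `LOCAL` keyword depends on where the file lives relative to the server, and the server must be configured to allow it.

```python
import csv
import os
import tempfile

# Dump the rows as a tab-separated file that MySQL can bulk-load.
rows = [("Helen", 24), ("Katrina", 21), ("Samia", 22)]
path = os.path.join(tempfile.mkdtemp(), "people.tsv")
with open(path, "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerows(rows)

# The statement you would then run against MySQL (not executed here;
# assumes a table beautiful(name, age) and LOCAL INFILE enabled):
load_sql = (
    f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE beautiful "
    "FIELDS TERMINATED BY '\\t' LINES TERMINATED BY '\\n' (name, age)"
)
print(load_sql)
```

Because the server parses the file directly, this skips per-row statement overhead entirely, which is why it tends to beat even batched INSERTs on very large files.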
It comes a bit late for me, but that is absolutely great. Might come in handy again sometime.
Thanks for that! Very "cool".