fabianCastle Posted March 14, 2009

Hi all, thanks in advance for your help! I have written a script that imports a CSV file into a database; while doing so, it instantiates domain classes and does some other work. As you probably guessed, it expires on me when the amount of data I need to process is excessive. I've thought of a couple of ways to fix this, but I would greatly appreciate your help choosing a solution:

1. Extend the execution timeout (the simplest, yet it shows no activity to the user and I would like to avoid this).
2. Make the page call itself every X CSV lines processed, and show the progress statically on each call.
3. Use AJAX to dynamically show the progress while it processes the CSV. This should allow me to show summary information as well. (I would prefer this one.)

Which one would you suggest? Is there any other way to solve this? Any programming patterns that I should be reading about? Links to similar solutions? Thanks again for your help on this.

Fabian

Link to comment: https://forums.phpfreaks.com/topic/149438-how-to-distribute-work-among-various-pages-to-avoid-long-excecutionexpiration/
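Option 2 can be sketched roughly like this: each request imports one fixed-size chunk of the file and, if the chunk came back full, refreshes itself pointing at the next offset. The file name, chunk size, query-string protocol, and `import_row()` callback below are illustrative assumptions, not the poster's actual code.

```php
<?php
// Read up to $chunkSize rows starting at row $offset, passing each row to
// $handler. Returns the number of rows actually processed.
function process_chunk($csvPath, $offset, $chunkSize, $handler)
{
    $fh = fopen($csvPath, 'r');
    for ($i = 0; $i < $offset && fgetcsv($fh) !== false; $i++) {
        // Skip rows that earlier requests already imported.
    }
    $processed = 0;
    while ($processed < $chunkSize && ($row = fgetcsv($fh)) !== false) {
        $handler($row);  // e.g. instantiate the domain class and save it
        $processed++;
    }
    fclose($fh);
    return $processed;
}

// In the import page, each request would then do roughly:
//   $offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
//   $done   = process_chunk('upload.csv', $offset, 500, 'import_row');
//   if ($done === 500) {  // chunk was full, so more rows may remain
//       header('Refresh: 1; url=import.php?offset=' . ($offset + 500));
//   }
//   echo "Processed rows $offset to " . ($offset + $done);
```

Because each request only handles a few hundred rows, no single request comes near `max_execution_time`, and the user sees the row counter advance on every reload.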
shlumph Posted March 15, 2009

Is the CSV updated frequently, and that's why it's so large? If that's the case, why not write the information directly to the database? Or is the CSV a large file that isn't going to be updated? Then why not put the file on your local machine (assuming it's more powerful than the server) and use Navicat, or any other MySQL GUI, to import the CSV into a database, and then upload the database to your online server?

Hope this helps. Let me know if neither of these is the case...
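For reference, the "import directly" route usually boils down to a single MySQL statement that GUI tools can issue for you; it is dramatically faster than row-by-row INSERTs from a script. The table name, delimiters, and path below are assumptions to adjust for the actual file:

```sql
-- Bulk-load the whole CSV in one statement (hypothetical table/path).
LOAD DATA LOCAL INFILE '/path/to/export.csv'
INTO TABLE daily_operations
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row
```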
fabianCastle Posted March 15, 2009 (Author)

shlumph, thanks a lot for the help. Here are the answers to your questions:

- Yes, the CSV is large and needs to be updated frequently. It is large because it contains daily operation data from many different branches/affiliates. These CSV files are generated by a third-party system we have no control over.
- It is not possible to write directly to the database, because the data needs to be pre-processed and modified in some cases. (Although I'm sure we'll end up integrating everything together, right now I just need to process the files.)
- I cannot upload the information with a GUI; the end user must do it through a web interface.

Although these answers might help you understand the problem further, I would like to focus on the programming issue I have. Since my original post, I have read a little about two AJAX patterns that might facilitate a solution:

- Call Tracking: accommodate busy user behaviour by allocating a new XMLHttpRequest object for each request. (See Richard Schwartz's blog entry.)
- Periodic Refresh: the browser refreshes volatile information by periodically polling the server.

Found here: http://ajaxpatterns.org/Patterns

I will continue reading about them. Thanks to all again,

Fabian
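The server half of the Periodic Refresh pattern mentioned above might look like the sketch below: the import loop records its progress (here in the session), and a tiny endpoint reports it as JSON for the browser to poll. The key names and the `progress.php` split are assumptions for illustration.

```php
<?php
// Called by the import loop after every chunk of rows it finishes
// (hypothetical hook; wire it into your own per-row/per-chunk logic).
function record_progress($done, $total)
{
    $_SESSION['import_progress'] = array('done' => $done, 'total' => $total);
}

// Body of a small progress.php endpoint: current state as JSON.
function progress_json()
{
    $p = isset($_SESSION['import_progress'])
        ? $_SESSION['import_progress']
        : array('done' => 0, 'total' => 0);
    return json_encode($p);
}

// progress.php itself would just do: session_start(); echo progress_json();
```

On the client, a `setTimeout` loop that creates a fresh XMLHttpRequest for each poll (exactly the Call Tracking pattern quoted above) fetches `progress.php` every couple of seconds and writes done/total into a status div, which also gives a natural place for the summary information. One real-world caveat: PHP's default file-based session storage locks per session, so the long-running import script should call `session_write_close()` after each update, or the polling requests will queue behind it.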
fabianCastle Posted March 16, 2009 (Author)

Any ideas?