OM2 Posted December 19, 2013 I'm writing a spec for some coding I need to have done. I need a CSV file read in and parsed. The data totals about 200 KB, with 500-600 rows. Is there any problem with storing the whole thing in memory and processing it there? Some of the data in each row needs to be parsed and cleaned. Or should I just store it into a table and then process it? Thanks, Omar
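For scale, the "everything in memory" approach the question describes might look like the sketch below. The sample data, column names, and the whitespace-stripping "cleaning" step are all hypothetical stand-ins, since the thread doesn't say what the real file contains:

```python
import csv
import io

# Hypothetical sample standing in for the ~200 KB, 500-600 row file.
raw = "name,amount\n  Alice ,100\nBob,  250 \n"

# Read the whole CSV into memory at once; a few hundred KB is tiny
# compared with a typical per-request memory limit.
rows = list(csv.reader(io.StringIO(raw)))
header, data = rows[0], rows[1:]

# Clean each field; stripping stray whitespace is one plausible step.
cleaned = [[field.strip() for field in row] for row in data]

print(cleaned)
```

At this size, loading everything first and then cleaning is the simplest design; streaming only starts to matter when the file no longer fits comfortably in memory.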
KevinM1 Posted December 19, 2013 ...or, you could break it up into chunks. Read a certain number of lines, process, read more, etc.
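The chunked approach KevinM1 suggests could be sketched like this. The chunk size of 4, the stand-in data, and the `process` function are all illustrative assumptions, not anything from the thread:

```python
import csv
import io
from itertools import islice

# Hypothetical stand-in for the real file on disk.
raw = "".join(f"id{i},{i}\n" for i in range(10))

def process(chunk):
    # Placeholder for whatever per-row parsing/cleaning is needed;
    # here it just keeps the first column of each row.
    return [row[0] for row in chunk]

reader = csv.reader(io.StringIO(raw))
results = []
while True:
    chunk = list(islice(reader, 4))  # read up to 4 rows at a time
    if not chunk:
        break
    results.extend(process(chunk))

print(results)
```

Because the reader is consumed lazily, only one chunk of rows is held in memory at a time, which keeps the footprint flat no matter how large the file grows.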
scootstah Posted December 19, 2013 No, 200KB won't really cause any problems. Obviously if n * 200KB (where n = number of concurrent users) is greater than the RAM you have available, you'll start having issues.
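Scootstah's back-of-envelope check is easy to make concrete. The user count below is a made-up example, not a figure from the thread:

```python
# Rough memory estimate: per-request data size times concurrent requests.
per_request_kb = 200      # size of the parsed CSV held in memory
concurrent_users = 500    # hypothetical load

total_mb = per_request_kb * concurrent_users / 1024
print(round(total_mb, 1))  # ~97.7 MB
```

Even at 500 simultaneous requests the total is under 100 MB, so for this workload memory is unlikely to be the bottleneck.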