Network_ninja Posted August 27, 2011

hi everyone... What would be the best approach if I have to extract a large amount of data from a table? Let's say the query will extract 1 million records. What will be the best way to extract it without affecting the performance of other systems? Let's just say that my table has a proper index. Thanks in advance.

https://forums.phpfreaks.com/topic/245795-extract-large-data/
gizmola Posted August 27, 2011

Your question is too vague to provide much of an answer. 1 million rows is a lot of data, and it will take a while. Having an index or not is probably irrelevant unless this is a table with many millions of rows. You need to provide more information about what you mean by extract, and what you will be doing with all that data.
Network_ninja (Author) Posted August 27, 2011

Ok, let's make the numbers small. Let's assume this would be my query:

SELECT fld1, fld2, fld3, fld4, fld5 FROM tablename WHERE datein < CURDATE()

and that the query will return 30,000 rows. This will surely take a while, right? I am wondering if it is possible, in code or in the MySQL query itself, to pause a query and then continue it after 2 seconds.
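The "pause and continue" behavior the poster asks about is usually done client-side rather than inside MySQL: fetch the rows in fixed-size batches and sleep between batches so other queries get a turn. A minimal sketch of the idea, using Python with an in-memory SQLite table as a stand-in for the MySQL table (the table and column names mirror the example query above; batch size and pause length are placeholder values):

```python
import sqlite3
import time

# Stand-in for the real MySQL table from the example query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tablename (fld1 INTEGER, datein TEXT)")
conn.executemany(
    "INSERT INTO tablename VALUES (?, ?)",
    [(i, "2011-01-01") for i in range(100)],
)

BATCH = 25    # rows fetched per round trip
PAUSE = 0.01  # seconds to sleep between batches (2 in the poster's case)

rows = []
offset = 0
while True:
    batch = conn.execute(
        "SELECT fld1 FROM tablename WHERE datein < date('now') "
        "LIMIT ? OFFSET ?",
        (BATCH, offset),
    ).fetchall()
    if not batch:
        break
    rows.extend(batch)
    offset += BATCH
    time.sleep(PAUSE)  # let other queries run between batches

print(len(rows))  # 100
```

One caveat: on a genuinely large table, OFFSET itself gets slow, because the server still has to skip over all the earlier rows; paging by an indexed column (WHERE id > last_seen_id ORDER BY id LIMIT n) scales much better.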
trq Posted August 27, 2011

What exactly are you planning on doing with these 30,000 rows of data? Surely you're not going to try to display them?
The Little Guy Posted August 29, 2011

I have put 100k+ rows into a JavaScript array, so what is wrong with displaying 30k rows? If your table is optimized it should run fast. My page takes longer to render than it does to select the rows. We then use the JavaScript to search the items in the array, which is even faster than searching the MySQL database. The only problem we have is that our template system doesn't work well with it, because the template system is very shitty.
fenway Posted August 29, 2011

Who is going to read through 30K rows?
The Little Guy Posted August 29, 2011

My members complained because they couldn't see their down-line, and there is a range of 1-100k members in someone's down-line, and they all complained because they couldn't see everyone in it. So we gave them the ability. Boom! 100k rows! You can then use JavaScript to filter what you're looking for almost instantly, by searching for either "member name" or "member id number"!
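The filtering The Little Guy describes is just a linear scan over an in-memory array, matching on a name substring or an exact id. His version runs in browser JavaScript; a sketch of the same idea in Python (the member fields here are assumptions based on the post, not his actual data structure):

```python
# Toy stand-in for the down-line array loaded into the page.
members = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
    {"id": 3, "name": "alice cooper"},
]

def filter_members(members, query):
    """Keep members whose name contains the query, or whose id equals it."""
    q = str(query).lower()
    return [
        m for m in members
        if q in m["name"].lower() or q == str(m["id"])
    ]

print(filter_members(members, "alice"))  # both "alice" entries
print(filter_members(members, 2))        # bob, matched by id number
```

A scan like this over 100k small records is fast enough to feel instant, which is why the in-browser search outpaces a round trip to the database.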
trq Posted August 29, 2011

Quote: My members complained because they couldn't see their down-line, and there is a range of 1-100k members in someone's down-line, and they all complained because they couldn't see everyone in it. So we gave them the ability. Boom! 100k rows! You can then use JavaScript to filter what you're looking for almost instantly, by searching for either "member name" or "member id number"!

This sounds like it would perfectly suit an AJAX solution, something like what Facebook and Twitter do, where as you scroll near the end of the page more data is dynamically fetched. Getting everything up front seems a massive waste of resources.
fenway Posted September 1, 2011

You'll take your server down easily with that kind of unnecessary workload.
The Little Guy Posted September 1, 2011

There really isn't that much load. The good news is the page doesn't get accessed much, and most members don't have that many people in their down-line. The page that returns 100k results only runs 20 queries, and each one takes less than a second to return results. It builds the data in about 7 seconds, and then it takes the browser 30-ish seconds to download and render it. My code builds a 17MB JavaScript file for that member.