nuxy Posted November 23, 2007

I'm making a rather big database with some information. I have one table at the moment, and expect around 5 more in the future. I have about 11 million rows in my single table right now, expected to grow to around 100 - 500 million. To keep things optimized, my tables are structured with only two columns; they basically store words/letters and similar things.

My problem with working with these large tables/databases is not that they take a lot of disk space, but the actual load time. At this point it takes over 60 seconds just to query how many rows there are from my application (PHP). In phpMyAdmin it takes about 2 seconds, and it gets a lot more information at the same time. Why is this, and how can I decrease the load time?

Here is what I know so far:

mysql_num_rows(mysql_query("SELECT `column` FROM `table`"));

is slower than

mysql_result(mysql_query("SELECT `column` FROM `table`"),0);

You cannot limit the amount selected; it gives a false reading. Increasing the PHP max memory does not help; it does not affect the actual execution of the script.

Is there anyone who can tell me how phpMyAdmin does this? I have looked at the source code but cannot find it. Any ideas?
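A likely explanation: both snippets above fetch the entire result set from MySQL just to count it, which is what takes the time. phpMyAdmin instead asks the server to count (or estimate) the rows itself, typically via `SELECT COUNT(*)` or the table's metadata. A minimal sketch using the same old `mysql_*` API as the snippets above (the table name and a pre-existing connection are assumed):

```php
<?php
// Sketch only: assumes an open mysql_connect() connection and a
// table named `table`. Let MySQL count server-side instead of
// transferring millions of rows to PHP:
$result   = mysql_query("SELECT COUNT(*) FROM `table`");
$rowCount = mysql_result($result, 0);

// For an approximate count, the table status is even cheaper.
// The Rows value is exact for MyISAM and an estimate for InnoDB:
$status    = mysql_query("SHOW TABLE STATUS LIKE 'table'");
$row       = mysql_fetch_assoc($status);
$approxRows = $row['Rows'];
?>
```

With MyISAM tables, `SELECT COUNT(*)` with no WHERE clause is answered from stored metadata and returns almost instantly, which would account for phpMyAdmin's 2-second response on an 11-million-row table.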
nuxy Posted November 23, 2007 Author

*Bump*