jjk2 Posted May 7, 2009 I have a spider that I have written. Right now it uses a MySQL database to insert the links it finds. This number gets very big, around a million across a few sites, and there is a lot of inserting, updating, and deleting. My question is: is this inefficient? Is memory consumption too high if I do this? Should I be using a flat-file DB instead (if so, can you point me to a resource)?
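For concreteness, here is a minimal sketch of the kind of link store the post describes, using SQLite as a stand-in for MySQL (table and column names are hypothetical, not from the original spider). A UNIQUE constraint on the URL together with `INSERT OR IGNORE` (the MySQL equivalent would be `INSERT IGNORE` or `ON DUPLICATE KEY UPDATE`) avoids a separate SELECT-then-INSERT round trip for every link found, which cuts down a lot of the update churn described above.

```python
import sqlite3

# SQLite standing in for MySQL; schema is a hypothetical illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE links (
        id      INTEGER PRIMARY KEY,
        url     TEXT NOT NULL UNIQUE,
        visited INTEGER NOT NULL DEFAULT 0
    )
""")

# A crawl batch with a duplicate URL in it.
found = ["http://example.com/a", "http://example.com/b", "http://example.com/a"]

# INSERT OR IGNORE silently skips rows that violate the UNIQUE constraint,
# so duplicates never need a lookup or a delete afterwards.
conn.executemany("INSERT OR IGNORE INTO links (url) VALUES (?)",
                 [(u,) for u in found])
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM links").fetchone()[0]
print(count)  # duplicates are dropped, so 2
```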
fenway Posted May 8, 2009 There's a board here that discusses flat-file databases.
JonnoTheDev Posted May 8, 2009 Not really. It all depends on what you need to do with the results. Do you need to keep all results from all sites for reporting? If not, after your spider has completed one site, why not export and compress the database, then truncate it for the next?
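The export-then-truncate idea above can be sketched as follows, again with SQLite standing in for MySQL (in real use you would typically run `mysqldump`, gzip the dump file, then `TRUNCATE TABLE`; the table name here is hypothetical):

```python
import sqlite3

# SQLite stand-in for the MySQL link table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE links (url TEXT NOT NULL UNIQUE)")
conn.executemany("INSERT INTO links (url) VALUES (?)",
                 [("http://site-one.example/%d" % i,) for i in range(5)])
conn.commit()

# "Export": pull the finished site's rows out before clearing the table.
# In real use these rows would go to a compressed dump file on disk.
exported = [row[0] for row in conn.execute("SELECT url FROM links ORDER BY url")]

# "Truncate": SQLite has no TRUNCATE statement, so a DELETE with no
# WHERE clause clears the table; MySQL would use TRUNCATE TABLE links.
conn.execute("DELETE FROM links")
conn.commit()

print(len(exported),
      conn.execute("SELECT COUNT(*) FROM links").fetchone()[0])
```

This keeps the working table small for each site, so index maintenance during the heavy insert/update phase stays cheap, while the compressed dumps preserve the full history if it is ever needed.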