
I have a spider that I have written. Right now it uses a MySQL database to store the links it finds. The number of links grows very large, reaching around a million after only a few sites, and there is a lot of inserting, updating, and deleting.

 

My question is: is this approach inefficient? Is the memory consumption too high if I do this?

 

Should I be using a flat-file DB instead? If so, can you point me to a resource?
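As a point of comparison, here is a hedged sketch of the kind of batched insert that keeps per-row overhead down at this scale. It uses Python's built-in sqlite3 (a single-file database, so it doubles as an example of the flat-file route) purely for illustration; the table name, column, and sample links are placeholders, and the same `executemany`-in-one-transaction pattern applies to a MySQL client library as well:

```python
import sqlite3

# Illustrative sketch: SQLite keeps the whole database in one file while
# preserving SQL semantics. Names below are placeholders, not from the thread.
conn = sqlite3.connect("links.db")
conn.execute("CREATE TABLE IF NOT EXISTS links (url TEXT PRIMARY KEY)")

# Links gathered from one crawled page (placeholder data).
links = ["http://example.com/a", "http://example.com/b"]

# One transaction for the whole batch: far cheaper than committing per row
# when the table grows toward a million entries.
with conn:
    conn.executemany(
        "INSERT OR IGNORE INTO links (url) VALUES (?)",  # skip duplicates
        [(u,) for u in links],
    )

print(conn.execute("SELECT COUNT(*) FROM links").fetchone()[0])  # → 2
```

Whether MySQL or a file-backed store wins depends mostly on how the spider batches its writes, not on the storage engine alone.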

https://forums.phpfreaks.com/topic/157293-mysql-or-flat-file-db/

Not really. It all depends on what you need to do with the results. Do you need to keep the results from all sites for reporting? If not, then after your spider has completed one site, why not export and compress the database, then truncate it for the next?
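That per-site cycle could look something like the sketch below. The database, table, and user names are placeholders, and it assumes `mysqldump`, `mysql`, and `gzip` are on the PATH; adjust for your own schema:

```shell
#!/bin/sh
# Sketch of the export -> compress -> truncate cycle described above.
# "spiderdb", "links", and "spider" are hypothetical names.
archive_and_truncate() {
    site="$1"                                     # e.g. "example.com"
    mysqldump -u spider -p spiderdb links > "links-$site.sql"
    gzip "links-$site.sql"                        # keep a compressed snapshot
    mysql -u spider -p spiderdb -e "TRUNCATE TABLE links;"  # reset for next site
}

# usage, after finishing one site's crawl:
# archive_and_truncate "example.com"
```

TRUNCATE is much faster than DELETE here because it drops and recreates the table instead of removing rows one by one.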
