

How to handle large volume of Inserts between two DB's

mysql database design


#1 n1concepts

  • Members
  • Advanced Member
  • 183 posts

Posted 14 February 2013 - 01:52 PM


I have a requirement where I need to define a MySQL database setup (possibly using two databases, if that is best — which is the reason for this question).
Here's the objective:

1st (primary task): set up a MySQL database that allows multiple websites (email capture forms) to log or save emails to that database with minimum latency.
Note: form validation on each website will ensure the emails are properly formatted, so this (1st) database only has to perform the INSERT — no other processing in this regard.

2nd (secondary task): that 1st MySQL database will be under constant load, receiving emails from various sites (the hourly volume could average between 10,000 and 25,000, which is why I think this task should be offloaded to a dedicated database). However, as new emails are logged to the 1st database, I then need to batch those saved records over to a 2nd database, which will perform a series of functions — processing the leads along with number crunching (data mining, etc.).

The setup is illustrated below: once a record is logged in DB1 (which is required to house ALL data long-term; DB2 only houses a short interval of incoming records, to reduce the processing load on new entries), it must be passed along to DB2.

<Objective: get records over to DB2 without hampering DB1's incoming INSERTs from the websites>
DB1 ===> DB2

(Objective: DB1 needs to pass records over to DB2 while still capturing incoming emails with the lowest possible latency for the end user.)

Q: Should I set up transactions between DB1 and DB2 to pass the records into DB2? I think that would induce latency back on DB1, because it would pause taking new emails while waiting for DB2 to accept or ignore the new entries (FYI: DB2 will 'ignore' duplicate email entries).


Is there a better way to batch the records from DB1 over to DB2 so that DB1 doesn't get bogged down by that process and can continue to receive new emails — which will be constant and high-volume?
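One pattern that avoids blocking the insert path is a flag-plus-batch copy run from a cron job: DB1 only ever does single-row INSERTs, and a separate scheduled job moves unprocessed rows over in chunks. The sketch below is only an illustration of that idea, not the poster's actual schema — every table and column name (`leads`, `email`, `captured_at`, `transferred`) is invented, and it assumes both databases live on the same MySQL server so cross-database statements work:

```sql
-- DB1: capture table stays lean -- the websites do a single indexed
-- INSERT per email and nothing else.
CREATE TABLE db1.leads (
    id          INT UNSIGNED NOT NULL AUTO_INCREMENT,
    email       VARCHAR(255) NOT NULL,
    captured_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    transferred TINYINT(1) NOT NULL DEFAULT 0,
    PRIMARY KEY (id),
    KEY idx_transferred (transferred)
) ENGINE=InnoDB;

-- DB2: a unique index on email lets INSERT IGNORE silently drop the
-- duplicate entries the poster wants ignored.
CREATE TABLE db2.leads (
    id          INT UNSIGNED NOT NULL,
    email       VARCHAR(255) NOT NULL,
    captured_at TIMESTAMP NOT NULL,
    PRIMARY KEY (id),
    UNIQUE KEY uniq_email (email)
) ENGINE=InnoDB;

-- Batch mover, run from cron every minute or so. Pinning a cutoff id
-- first keeps the copy and the flag update aligned even while new rows
-- keep arriving in db1.leads.
START TRANSACTION;
SELECT COALESCE(MAX(id), 0) INTO @cutoff
  FROM (SELECT id FROM db1.leads
        WHERE transferred = 0
        ORDER BY id
        LIMIT 5000) AS batch;
INSERT IGNORE INTO db2.leads (id, email, captured_at)
  SELECT id, email, captured_at
  FROM db1.leads
  WHERE transferred = 0 AND id <= @cutoff;
UPDATE db1.leads
  SET transferred = 1
  WHERE transferred = 0 AND id <= @cutoff;
COMMIT;
```

If the two databases end up on different servers, this single-transaction approach no longer applies as written — you'd be looking at MySQL replication or an application-level mover instead.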

Comments are welcome, and I will elaborate if/when required to help clarify the objective.

#2 n1concepts

  • Members
  • Advanced Member
  • 183 posts

Posted 14 February 2013 - 01:59 PM

FYI: I just had a thought — comments and/or suggestions are still welcome — but I think I see a way to make this less complicated:

Instead of having two separate databases — the 1st capturing incoming emails ONLY, then passing them to the 2nd, which is the actual workhorse —
I'm thinking of having just one database (period) and creating a specific table that simply logs all incoming entries. I can then query that table to process those records, marking each with a flag as the server processes it.

This will cut out a lot of 'transit' latency and reduce CPU processing right off the bat!
Let me know your thoughts and whether there is a better solution.
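For what it's worth, that single-table, flag-based idea might look something like the sketch below. All names (`incoming_emails`, `status`) are invented for illustration, and it assumes a single worker process plus InnoDB (so the worker's updates take row locks rather than blocking the whole table against new INSERTs):

```sql
-- One capture table: websites INSERT, the worker flags and processes.
CREATE TABLE incoming_emails (
    id        INT UNSIGNED NOT NULL AUTO_INCREMENT,
    email     VARCHAR(255) NOT NULL,
    logged_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    status    ENUM('new', 'processing', 'done') NOT NULL DEFAULT 'new',
    PRIMARY KEY (id),
    KEY idx_status (status)
) ENGINE=InnoDB;

-- Worker loop: claim a batch of unprocessed rows...
UPDATE incoming_emails
  SET status = 'processing'
  WHERE status = 'new'
  ORDER BY id
  LIMIT 1000;

-- ...read the claimed batch for lead processing / number crunching...
SELECT id, email
  FROM incoming_emails
  WHERE status = 'processing';

-- ...then mark the batch finished once the application is done with it.
UPDATE incoming_emails
  SET status = 'done'
  WHERE status = 'processing';
```

With more than one worker, the plain `status = 'processing'` claim isn't safe — two workers could grab overlapping batches — so you'd add a worker/batch identifier column or use locking reads at that point.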

