Best practices for daily MySQL replication (partial and filtered)

Posted 2019-05-28 19:44

Question:

I have a reasonably large database with > 40 tables. I only need to replicate a few of them (+/- 5), and each of those tables also needs to be filtered.

I'm looking for best practices for replicating this data (daily is enough), where I can select just a few tables and include a WHERE clause for each one.

I'm thinking of running mysqldump for each table (with a WHERE clause) and writing each table to a separate .sql file. I can then truncate all tables on the destination database (all data is overwritten daily) and import each file separately with mysql.

Example:

   # dump each table 
   mysqldump -u repl_user my_database my_table -w 'id between 1000 and 1005' > my_table.sql
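
Extending that idea, a minimal sketch of the complete daily cycle (dump, truncate, reload) could look like the script below. The host names, credentials, table names, and filters are all placeholders, not values from the question, and passwords are assumed to come from ~/.my.cnf:

    #!/bin/bash
    # Hypothetical daily sync: dump filtered rows from the source,
    # then truncate and reload each table on the destination.
    set -euo pipefail

    SRC="-u repl_user -h source-host my_database"
    DST="-u repl_user -h dest-host my_database"

    declare -A FILTERS=(   # table => WHERE clause
      [my_table]="id between 1000 and 1005"
      [my_other_table]="created_at >= curdate() - interval 1 day"
    )

    for tbl in "${!FILTERS[@]}"; do
      # --no-create-info dumps data only, so the TRUNCATE below
      # is what actually clears yesterday's rows
      mysqldump --no-create-info -w "${FILTERS[$tbl]}" $SRC "$tbl" > "$tbl.sql"
      mysql $DST -e "TRUNCATE TABLE $tbl"
      mysql $DST < "$tbl.sql"
    done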

I'm aware that I could replicate the full database and use the BLACKHOLE storage engine for the unwanted tables. But since 35 of the tables are not needed, that seems like overkill. Besides, some tables are only needed in a filtered version, and that I can't solve with BLACKHOLE.

Any better solutions?

Answer 1:

MySQL natively supports replication filters, but only at the database or table level. This doesn't meet your requirement to filter a subset of rows from these tables.
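
For reference, those native filters are replica-side options such as the following in my.cnf (the database and table names here are illustrative); note that none of them accepts a row-level predicate:

    [mysqld]
    # replicate whole tables only -- no WHERE support
    replicate-do-table = my_database.my_table
    replicate-wild-do-table = my_database.orders%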

FlexViews is a tool that reads the binary log and replays only the changes relevant to keeping a materialized view up to date. You could define your materialized views in such a way as to implement your row filtering.
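
As a rough illustration, FlexViews defines a filtered materialized view through its stored procedures, along these lines. This is a sketch from memory of the FlexViews API; the exact procedure arguments may differ by version, and the schema, table, and filter values are placeholders:

    # sketch: a filtered, incrementally-maintained copy of my_table
    mysql -u repl_user my_database <<'SQL'
    CALL flexviews.create('my_database', 'my_table_mv', 'INCREMENTAL');
    SET @mvid = flexviews.get_id('my_database', 'my_table_mv');
    CALL flexviews.add_table(@mvid, 'my_database', 'my_table', 'mt', NULL);
    CALL flexviews.add_expr(@mvid, 'COLUMN', 'mt.id', 'id');
    CALL flexviews.add_expr(@mvid, 'WHERE', 'mt.id between 1000 and 1005', 'filt');
    CALL flexviews.enable(@mvid);
    SQL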