I have a reasonably large database with more than 40 tables. I only need to replicate a few of them (roughly 5), and each of those tables also needs to be filtered.
I'm looking for best practices for replicating this data (daily is enough), where I can select just a few tables and include a WHERE clause for each one.
I'm thinking of running mysqldump for each table (with a WHERE clause), producing a separate .sql file per table. On the destination database I can then truncate all the tables (all data is overwritten daily) and run mysql to import each file separately.
Example:
# dump each table
mysqldump -u repl_user my_database my_table -w 'id between 1000 and 1005' > my_table.sql
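To show what the full daily job could look like, here is a sketch of the dump/truncate/reload cycle. Assumptions on my part (not from the question): credentials are read from `~/.my.cnf`, the destination schema already exists, the destination database name `my_database_replica` and the second table/filter are illustrative.

```shell
#!/bin/sh
# Daily partial refresh: dump filtered tables, truncate, reload.
set -eu

DUMP_DIR=${DUMP_DIR:-/tmp/partial-replica}
SRC_DB=my_database
DST_DB=my_database_replica      # assumed destination database name

# One "table|where-clause" pair per line (second pair is a made-up example).
TABLES='my_table|id between 1000 and 1005
orders|created_at >= curdate() - interval 1 day'

sync_tables() {
    mkdir -p "$DUMP_DIR"
    echo "$TABLES" | while IFS='|' read -r table where; do
        [ -n "$table" ] || continue
        if [ "${DRY_RUN:-1}" = 1 ]; then
            # Default: only print the plan. Set DRY_RUN=0 to execute.
            echo "mysqldump -u repl_user $SRC_DB $table --no-create-info --where=\"$where\" > $DUMP_DIR/$table.sql"
            echo "mysql -u repl_user $DST_DB -e 'truncate table $table'"
            echo "mysql -u repl_user $DST_DB < $DUMP_DIR/$table.sql"
        else
            # --no-create-info: we truncate rather than drop, so keep the schema.
            mysqldump -u repl_user "$SRC_DB" "$table" \
                --no-create-info --where="$where" > "$DUMP_DIR/$table.sql"
            mysql -u repl_user "$DST_DB" -e "truncate table \`$table\`"
            mysql -u repl_user "$DST_DB" < "$DUMP_DIR/$table.sql"
        fi
    done
}

sync_tables
```

Scheduled from cron (e.g. `DRY_RUN=0 /path/to/sync.sh` once per night), this gives the daily refresh without touching the other 35 tables.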
I'm aware of replicating the full database and using the BLACKHOLE storage engine for the unwanted tables. But since 35 of the tables are not needed, that seems like overkill. Besides, some tables are only needed in a filtered version, and I can't solve that with BLACKHOLE.
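For what it's worth, if only table selection (without the row-level WHERE filtering) were needed, replica-side replication filters would avoid the BLACKHOLE workaround entirely; a my.cnf sketch (the option names are standard MySQL, the table names are from my example):

```ini
# Replica-side my.cnf: replicate only the listed tables,
# ignore the other ~35. No row filtering possible here.
[mysqld]
replicate-do-table      = my_database.my_table
replicate-wild-do-table = my_database.orders_%
```

These filters cannot express a WHERE clause, which is exactly the limitation that pushes me toward the per-table mysqldump approach.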
Any better solutions?