A faster method to move Redis data to MySQL

Published 2019-03-17 01:50

We have a big shopping and product-dealing system. We faced a lot of problems with MySQL, so after some R&D we decided to use Redis and started integrating it into our system. Instead of hitting the database directly as before, we have moved the following into Redis:

  1. User shopping cart details
  2. Affiliates clicks tracking records
  3. Product dealing user data.
  4. Other site stats.

I am not only storing the data in Redis; I have also written cron jobs that move the Redis data into MySQL at time intervals. This is the main point where I am facing issues. I am looking for solutions to the points below:

  1. Is there any other way to dump big data from Redis to MySQL?
  2. If Redis fails, our data is stored in a file, so is it possible to store that data directly in the MySQL database?
  3. Does Redis have any trigger system I can use to avoid cron jobs, like a queue system?

1 answer
闹够了就滚
#2 · 2019-03-17 02:34

Is there any other way to dump big data from Redis to MySQL?

Redis can generate a dump of its data in a non-blocking and consistent way (using bgsave).

https://github.com/sripathikrishnan/redis-rdb-tools

You could use Sripathi Krishnan's well-known package to parse a Redis dump file (RDB) in Python and populate the MySQL instance offline. Or you can convert the Redis dump to JSON and write scripts in whatever language you prefer to populate MySQL.
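As a sketch of the second approach: once the RDB file has been converted to JSON (the tool's CLI supports something like `rdb --command json dump.rdb > dump.json`), a short Python script can turn cart entries into rows ready for MySQL. The JSON shape, the `user:<id>:cart` key pattern, and the sample values below are assumptions for illustration:

```python
import json

def rows_from_dump(dump_json):
    """Extract shopping-cart rows from a JSON dump of a Redis database.

    Assumes the dump is a JSON array with one object per Redis database,
    and that cart keys follow the 'user:<id>:cart' pattern from the
    question, with Redis lists rendered as JSON arrays.
    """
    rows = []
    for db in json.loads(dump_json):
        for key, value in db.items():
            if key.startswith("user:") and key.endswith(":cart"):
                user_id = key.split(":")[1]
                for item in value:  # one row per cart item
                    rows.append((user_id, item))
    return rows

# Illustrative sample dump (not real output):
sample = '[{"user:42:cart": ["sku-1001", "sku-1002"], "visits": "17"}]'
print(rows_from_dump(sample))  # [('42', 'sku-1001'), ('42', 'sku-1002')]
```

Each `(user_id, item)` tuple can then be fed to a parameterized `INSERT` with whatever MySQL driver you use.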

This solution is only interesting if you want to copy the complete data of the Redis instance into MySQL.

Does Redis have any trigger system I can use to avoid cron jobs, like a queue system?

Redis has no trigger concept, but nothing prevents you from pushing events into Redis queues each time something must be copied to MySQL. For instance, instead of:

# Add an item to a user shopping cart
RPUSH user:<id>:cart <item>

you could execute:

# Add an item to a user shopping cart
MULTI
RPUSH user:<id>:cart <item>
RPUSH cart_to_mysql <id>:<item>
EXEC

The MULTI/EXEC block makes it atomic and consistent. Then you just have to write a small daemon waiting on items of the cart_to_mysql queue (using BLPOP commands). For each dequeued item, the daemon fetches the relevant data from Redis and populates the MySQL instance.
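A minimal sketch of such a daemon's core logic in Python, assuming the `cart_to_mysql` queue and the `<id>:<item>` event format from the example above. The table and column names are hypothetical, and an in-memory queue stands in for the blocking `BLPOP` call so the logic can be shown without a running Redis server:

```python
import collections

def parse_cart_event(raw):
    """Split a 'cart_to_mysql' entry of the form '<id>:<item>' into
    (user_id, item). Assumes the user id itself contains no ':'."""
    user_id, item = raw.split(":", 1)
    return user_id, item

def build_insert(user_id, item):
    """Build a parameterized INSERT for a hypothetical cart_items table
    (table/column names are illustrative, not from the original post)."""
    sql = "INSERT INTO cart_items (user_id, item) VALUES (%s, %s)"
    return sql, (user_id, item)

def drain(queue, execute):
    """Process every pending event. In production, queue.popleft() would
    be replaced by a blocking redis.blpop('cart_to_mysql') call, and
    `execute` by a MySQL cursor's execute()."""
    while queue:
        raw = queue.popleft()
        user_id, item = parse_cart_event(raw)
        execute(*build_insert(user_id, item))

# Simulated run with an in-memory queue standing in for Redis:
events = collections.deque(["42:sku-1001", "42:sku-1002", "7:sku-9"])
executed = []
drain(events, lambda sql, params: executed.append(params))
```

In the real daemon, `drain` would loop forever on `BLPOP`, so items are picked up as soon as the MULTI/EXEC block pushes them, rather than waiting for the next cron run.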

If Redis fails, our data is stored in a file, so is it possible to store that data directly in the MySQL database?

I'm not sure I understand the question here. But if you use the above solution, the latency between Redis updates and MySQL updates will be quite limited, so if Redis fails you will only lose the very last operations (contrary to a solution based on cron jobs). It is of course not possible to guarantee 100% consistency in the propagation of the data, though.
