How to optimize MySQL to insert millions of rows?

Published 2019-01-22 04:49

I need to insert millions of rows into a MySQL database (InnoDB engine). I have a performance problem once the tables grow large: almost all of the run time is spent on the INSERT queries themselves. Does anybody know how to optimize this?

2 answers
放荡不羁爱自由
Answer #2 · 2019-01-22 05:22

To import a large bulk of data into InnoDB:

  1. Set in the MySQL configuration (these settings trade crash safety for load speed, so restore them after the import):

    • innodb_doublewrite = 0
    • innodb_buffer_pool_size = 50%+ system memory
    • innodb_log_file_size = 512M
    • log-bin = 0
    • innodb_support_xa = 0
    • innodb_flush_log_at_trx_commit = 0
  2. Run right after starting the transaction:

    SET FOREIGN_KEY_CHECKS = 0;

    SET UNIQUE_CHECKS = 0;

    SET AUTOCOMMIT = 0;

  3. Run right before committing the transaction:

    SET UNIQUE_CHECKS = 1;

    SET FOREIGN_KEY_CHECKS = 1;
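
Put together, a bulk-load session following the steps above might look like this. It is only an illustrative sketch: the `items` table and its columns are hypothetical, and the batch size should be tuned for your workload.

```sql
-- Disable checks for the duration of the load (hypothetical table `items`).
SET FOREIGN_KEY_CHECKS = 0;
SET UNIQUE_CHECKS = 0;
SET AUTOCOMMIT = 0;

START TRANSACTION;

-- Multi-row INSERTs cut client round trips and parsing overhead
-- compared with one INSERT statement per row.
INSERT INTO items (id, name) VALUES
  (1, 'a'),
  (2, 'b'),
  (3, 'c');
-- ... repeat in batches of a few thousand rows ...

COMMIT;

-- Restore the checks once the load is done.
SET UNIQUE_CHECKS = 1;
SET FOREIGN_KEY_CHECKS = 1;
```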

Melony?
Answer #3 · 2019-01-22 05:25

If you're talking about a large number of INSERT statements, look into transactions. Most (if not all) language bindings for SQL support them, and they speed up anything that writes heavily to the database. An added bonus is that if something goes wrong, you can roll the changes back.
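
The speedup comes from committing once per batch instead of once per row. A minimal sketch of the pattern, using Python's standard-library sqlite3 module as a self-contained stand-in for a MySQL driver (with MySQL you would use a driver such as mysql-connector-python, but the transaction pattern is identical):

```python
import sqlite3  # stand-in for a MySQL driver; the pattern is the same

# Hypothetical schema for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

rows = [(i, f"item-{i}") for i in range(100_000)]

# One transaction for the whole batch instead of one per INSERT:
# the driver issues BEGIN once and COMMIT flushes everything at the end.
# On error, the `with` block rolls the whole batch back.
with conn:
    conn.executemany("INSERT INTO items (id, name) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 100000
```

Without the explicit transaction, an autocommitting connection would pay a durability flush on every single row, which is exactly where the question's time is going.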
