Higher speed options for executing very large (20+ GB) .sql files

Posted 2019-07-18 00:08

My firm was delivered a 20+ GB .sql file in response to a request for data from the government. I don't have many options for getting the data in a different format, so I need options for importing it in a reasonable amount of time. I'm running it on a high-end server (Windows Server 2008 64-bit, MySQL 5.1) using Navicat's batch execution tool. It's been running for 14 hours and shows no signs of being near completion.

Does anyone know of any higher speed options for such a transaction? Or is this what I should expect given the large file size?

Thanks

4 Answers
贼婆χ · 2019-07-18 00:18

I guess you mean it's a file produced by mysqldump as a backup of a database, so it contains mostly CREATE TABLE and INSERT statements.

(But strictly speaking, an SQL script can contain anything, such as definition and execution of long-running stored procedures, queries that result in deadlocks, etc. I'll assume this is not the case.)

Here are some things you can do to speed up restore, given that you have the backup file and can't change the type of file it is:

  1. Disable foreign key checks: SET FOREIGN_KEY_CHECKS=0 (remember to re-enable it afterwards). Disable unique checks too: SET UNIQUE_CHECKS=0. (Tips 1 and 4 are combined in the sketch that follows this list.)

  2. Make sure your key_buffer_size is set as large as possible if you use MyISAM tables. The default is 8MB, and the max is 4GB. I'd try 1GB.

    These first tips come from a post by Baron Schwartz: http://lists.mysql.com/mysql/206866

  3. Make sure your innodb_buffer_pool_size is set as large as possible if you use InnoDB tables. The default is 8MB, and the max is 4GB. I'd try 1GB.

  4. Set innodb_flush_log_at_trx_commit = 2 during the restore if you use InnoDB tables.

  5. @Mark B adds a good suggestion to disable keys during the restore. This is how you do it:

    ALTER TABLE <table-name> DISABLE KEYS;
    ...run your restore...
    ALTER TABLE <table-name> ENABLE KEYS;
    

    But that command affects only one table at a time. You'll have to issue a separate command for each table. That said, it's often the case that one table is much larger than the other tables, so you may need to disable keys only for that one large table.

    Also, if the SQL script containing your restore drops and recreates tables, this would circumvent disabling keys. You'll have to find some way to insert the commands to disable keys after each table is created and before its rows are inserted. You may need to get creative with sed to preprocess the SQL script before feeding it to the mysql client; the excerpt after this list shows where the injected statements should end up.

  6. Use the Percona Server version of mysqldump, with the --innodb-optimize-keys option.
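
Putting tips 1 and 4 together, here is a minimal sketch of what the restore session might look like when run from the mysql command-line client (the dump path is a placeholder, and SET GLOBAL requires the SUPER privilege):

    -- one-off tweaks before sourcing the dump; remember to undo them afterwards
    SET FOREIGN_KEY_CHECKS = 0;
    SET UNIQUE_CHECKS = 0;
    SET GLOBAL innodb_flush_log_at_trx_commit = 2;  -- tip 4, InnoDB only
    SOURCE C:/dumps/delivered.sql;                  -- placeholder path to the 20+ GB file
    SET GLOBAL innodb_flush_log_at_trx_commit = 1;
    SET UNIQUE_CHECKS = 1;
    SET FOREIGN_KEY_CHECKS = 1;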
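
For tip 5, the goal of the preprocessing is just to get the DISABLE/ENABLE statements into the right places in the script. A made-up excerpt of what the relevant part of the dump should look like after preprocessing (the table and columns are hypothetical, and DISABLE KEYS only speeds up non-unique indexes on MyISAM tables):

    DROP TABLE IF EXISTS `big_table`;
    CREATE TABLE `big_table` (id INT, payload VARCHAR(255), KEY (payload)) ENGINE=MyISAM;
    ALTER TABLE `big_table` DISABLE KEYS;            -- injected after the CREATE TABLE
    INSERT INTO `big_table` VALUES (1,'a'),(2,'b');  -- the bulk of the 20+ GB file
    ALTER TABLE `big_table` ENABLE KEYS;             -- injected after the last INSERT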

叛逆 · 2019-07-18 00:20

Use MySQL's bulk-import facility (LOAD DATA INFILE) for this.

Summer. ? 凉城 · 2019-07-18 00:20

Request just the table definitions, and the data in a .csv. Then do a bulk-import.
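
If you do get the data as .csv, LOAD DATA INFILE is generally much faster than replaying millions of INSERT statements. A hedged sketch, assuming a comma-separated file with a header row and Windows line endings (the path, table, and format options are made up; match them to whatever is actually delivered):

    -- create the table from the delivered definitions first, then:
    LOAD DATA INFILE 'C:/data/big_table.csv'        -- placeholder path
    INTO TABLE big_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES;                                  -- skip the header row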

家丑人穷心不美 · 2019-07-18 00:24

There are plenty of tools around, but I would recommend the Navicat GUI for this. In my experience it could run a 48 GB *.sql file in 6 hours on a host with 8 GB of RAM.

Explanation (sort of): right-click the chosen DB, select "Execute SQL File", choose the file, choose "continue on error" if you want to, and finally run it. I know the example shows a MySQL DB, but it works on most widely used DBMSs.

I seriously advise against "opening" a file of such proportions in the SQL query builder; it would lock up the machine, since the RAM would be at full capacity over and over again.

This also works with Navicat running on macOS as the host, and once you are connected to a given DB server you can run the file against whichever server you want; so far it has worked well against RHEL, Ubuntu Server, Debian, and Windows Server.
