Efficient Multiple SQL insertion

Asked 2019-05-02 06:16

What is the best / most time-efficient way to insert 1000 rows into one table (MySQL via JDBC/Connector)? (It is a buffer and needs to be dumped into the database every time it is full.)

1- One auto-generated/concatenated SQL statement?

2- for (int i = 0; i < 1000; i++) { con.prepareStatement(s.get(i)).executeUpdate(); } con.commit();

3- A stored procedure?

4- bulk data insertion via a file?

5- (your solution)

3 Answers
地球回转人心会变
Answered 2019-05-02 06:29

Note that JDBC auto-commits after each statement unless you explicitly tell it not to. Turning auto-commit off and running X inserts before calling commit() should improve your performance quite a bit.

Ridiculous、
Answered 2019-05-02 06:36

The LOAD DATA INFILE statement is probably your best bet for performance (#4 from your list of options above), though it will probably take more code to accomplish your task, since you need to create the intermediate file, get it to the server, and then call LOAD DATA to get it into your db.

The MySQL documentation says:

The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed.
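A sketch of what that call might look like (the file path, table, and column names here are placeholders; this assumes a comma-separated file sent from the client, hence LOCAL):

```sql
-- Load a client-side CSV file into table x.
-- '/tmp/buffer.csv', x, a, and b are placeholder names.
LOAD DATA LOCAL INFILE '/tmp/buffer.csv'
INTO TABLE x
FIELDS TERMINATED BY ','
(a, b);
```

Note that LOCAL requires the server (and, for Connector/J, the client) to allow local-infile loading.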

来,给爷笑一个
Answered 2019-05-02 06:36

You can use JDBC batch inserts.

PreparedStatement ps = cn.prepareStatement("INSERT INTO....");

for (int i = 0; i < 1000; i++) {
    ps.setString(1, "value-" + i); // Bind this row's values
    ps.addBatch();                 // Queue the row in the current batch
}

ps.executeBatch(); // Send the whole batch in one go

However, MySQL's JDBC driver executes batched inserts as individual statements by default, so to get a real performance gain you will need to add rewriteBatchedStatements=true to your JDBC connection URL.
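For example, as a query parameter on the connection URL (host, port, and database name here are placeholders):

```
jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true
```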

There is also a MySQL-specific multi-row INSERT syntax that you could use if you want to build one big SQL string. I would try both and test the performance.

INSERT INTO x (a,b)
     VALUES ('1', 'one'),
            ('2', 'two'),
            ('3', 'three');
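As a sketch of how the "one big SQL string" option could be assembled while still using `?` placeholders (so values are bound rather than concatenated into the string), a hypothetical helper not from the answers above:

```java
import java.util.Collections;

public class BatchSql {
    // Build "INSERT INTO <table> (cols) VALUES (?,?),(?,?),..." for `rows` rows.
    // Hypothetical helper; the table and column names are caller-supplied.
    static String buildInsert(String table, String[] columns, int rows) {
        String oneRow = "(" + String.join(",", Collections.nCopies(columns.length, "?")) + ")";
        return "INSERT INTO " + table + " (" + String.join(",", columns) + ") VALUES "
                + String.join(",", Collections.nCopies(rows, oneRow));
    }

    public static void main(String[] args) {
        System.out.println(buildInsert("x", new String[] {"a", "b"}, 3));
        // prints: INSERT INTO x (a,b) VALUES (?,?),(?,?),(?,?)
    }
}
```

You would then bind the 2 × 3 = 6 parameters in order on one PreparedStatement and execute it once.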

Be careful with both approaches, as you may exceed the maximum packet size (max_allowed_packet) for a single request. You may need to use a batch size that is different from the number of entries in your buffer; e.g. you may need to split 1000 entries across 10 batches of 100.
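The splitting itself is plain list chunking; a minimal sketch (the class and method names are made up for illustration), where each sub-list would then be sent as one addBatch()/executeBatch() round or one multi-row INSERT:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplitter {
    // Split `items` into consecutive batches of at most `batchSize` elements.
    static <T> List<List<T>> split(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> buffer = new ArrayList<>();
        for (int i = 0; i < 1000; i++) buffer.add(i);

        List<List<Integer>> batches = split(buffer, 100);
        System.out.println(batches.size());        // 10
        System.out.println(batches.get(9).size()); // 100
    }
}
```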
