Bulk copy a DataTable into MySQL (similar to System.Data.SqlClient.SqlBulkCopy)

Posted 2020-01-28 08:26

I am migrating my program from Microsoft SQL Server to MySQL. Everything works well except for one issue: bulk copy.

In the MS SQL solution, the code looks like this:

connection.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
bulkCopy.DestinationTableName = "testTable";
bulkCopy.WriteToServer(rawData);

Now I am trying to do something similar for MySQL. Because I suspect the performance would be poor, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.

Any help would be highly appreciated.

2 Answers
仙女界的扛把子
#2 · 2020-01-28 08:47

"Because I suspect the performance would be poor, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class."

Don't rule out a possible solution based on unfounded assumptions. I just tested the insertion of 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter#Update() inside a Transaction. It consistently took about 30 seconds to run:

using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = conn;
        cmd.Transaction = tran;
        // The SELECT supplies the schema so the CommandBuilder can
        // generate the INSERT statements the adapter needs.
        cmd.CommandText = "SELECT * FROM testtable";
        using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
        {
            da.UpdateBatchSize = 1000;  // send the generated INSERTs in batches
            using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
            {
                da.Update(rawData);  // push the DataTable rows to MySQL
                tran.Commit();
            }
        }
    }
}

(I tried a couple of different values for UpdateBatchSize but they didn't seem to have a significant impact on the elapsed time.)
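If you want to reproduce the comparison yourself, a plain System.Diagnostics.Stopwatch is enough to capture elapsed times like the ones quoted here; a minimal sketch, with loadAction standing in for either insertion approach:

using System;
using System.Diagnostics;

static class Timing
{
    // Runs the supplied bulk-load action and reports the wall-clock time.
    public static void TimeBulkLoad(Action loadAction)
    {
        Stopwatch sw = Stopwatch.StartNew();
        loadAction();
        sw.Stop();
        Console.WriteLine("Elapsed: {0:F1} s", sw.Elapsed.TotalSeconds);
    }
}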

By contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds to run ...

// Dump the DataTable to a temporary CSV file, bulk-load it, then delete the file.
string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
{
    Rfc4180Writer.WriteDataTable(rawData, writer, false);  // false: no header row
}
var msbl = new MySqlBulkLoader(conn);
msbl.TableName = "testtable";
msbl.FileName = tempCsvFileSpec;
msbl.FieldTerminator = ",";
msbl.FieldQuotationCharacter = '"';
// Note: if the CSV file lives on the client rather than the MySQL server,
// setting msbl.Local = true (LOAD DATA LOCAL INFILE) may be required.
msbl.Load();
System.IO.File.Delete(tempCsvFileSpec);

... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (the original answer linked to sample code for this helper; a sketch follows below), bulk-load from that file, and delete the file afterwards.
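The Rfc4180Writer helper behind that link is not reproduced in the answer. As a rough guide only, a minimal RFC 4180-style writer with the same WriteDataTable(table, writer, includeHeaders) shape used above might look like this (a sketch, not the original code):

using System.Data;
using System.IO;
using System.Linq;

public static class Rfc4180Writer
{
    // Writes a DataTable as RFC 4180 CSV: every field is quoted and
    // embedded double quotes are doubled, so commas and line breaks
    // inside values survive the round trip.
    public static void WriteDataTable(DataTable sourceTable, TextWriter writer, bool includeHeaders)
    {
        if (includeHeaders)
        {
            var headers = sourceTable.Columns
                .Cast<DataColumn>()
                .Select(column => QuoteValue(column.ColumnName));
            writer.WriteLine(string.Join(",", headers));
        }

        foreach (DataRow row in sourceTable.Rows)
        {
            var fields = row.ItemArray.Select(item => QuoteValue(item?.ToString() ?? ""));
            writer.WriteLine(string.Join(",", fields));
        }

        writer.Flush();
    }

    private static string QuoteValue(string value)
    {
        return "\"" + value.Replace("\"", "\"\"") + "\"";
    }
}

Quoting every field unconditionally matches the FieldQuotationCharacter = '"' setting passed to MySqlBulkLoader above.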

在下西门庆
#3 · 2020-01-28 08:47

Using a bulk-operation NuGet package such as Z.BulkOperations, you can get this done easily.

Here is an example using the package from https://www.nuget.org/packages/Z.BulkOperations/2.14.3/

using Z.BulkOperations;

......

// DbConnection.OpenConnection() is the author's helper that returns an open MySqlConnection.
MySqlConnection conn = DbConnection.OpenConnection();
DataTable dt = new DataTable("testtable");
MySqlDataAdapter da = new MySqlDataAdapter("SELECT * FROM testtable", conn);
MySqlCommandBuilder cb = new MySqlCommandBuilder(da);
da.Fill(dt);  // fill dt (in a real migration, dt would be your source DataTable)

Then, instead of

......
da.UpdateBatchSize = 1000;
......
da.Update(dt);

you need just the following two lines:

var bulk = new BulkOperation(conn);
bulk.BulkInsert(dt);

They copy the whole DataTable into MySQL in only about 5 seconds, without first dumping the 100,000 rows to a temporary CSV file.
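For completeness, here is how the pieces of this answer might fit together end to end. This is only a sketch: DbConnection.OpenConnection() is the author's own helper from the snippet above, and it assumes the destination table is resolved from the DataTable's TableName, matching the new DataTable("testtable") call earlier.

using System.Data;
using MySql.Data.MySqlClient;
using Z.BulkOperations;

static class BulkCopyExample
{
    // Copies an in-memory DataTable into MySQL via Z.BulkOperations.
    public static void CopyToMySql(DataTable rawData)
    {
        // The author's helper; substitute however you open a MySqlConnection.
        MySqlConnection conn = DbConnection.OpenConnection();

        // Assumption: the destination table is identified by TableName.
        rawData.TableName = "testtable";

        var bulk = new BulkOperation(conn);
        bulk.BulkInsert(rawData);

        conn.Close();
    }
}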
