I am migrating my program from Microsoft SQL Server to MySQL. Everything works well except one issue with bulk copy.
In the solution with MS SQL the code looks like this:
connection.Open();
using (var bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "testTable";
    bulkCopy.WriteToServer(rawData); // rawData is a DataTable
}
Now I am trying to do something similar for MySQL. Because I suspect it would perform poorly, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.
Any help would be highly appreciated.
Don't rule out a possible solution based on unfounded assumptions. I just tested the insertion of 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter.Update() inside a Transaction. It consistently took about 30 seconds to run. (I tried a couple of different values for UpdateBatchSize, but they didn't seem to have a significant impact on the elapsed time.)
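For reference, here is a minimal sketch of that adapter-based test, assuming an open MySqlConnection named connection and a DataTable named rawData whose rows are all in the Added state; the table name and batch size are placeholders, not values from the test:

using System.Data;
using MySql.Data.MySqlClient;

using (MySqlTransaction transaction = connection.BeginTransaction())
using (var adapter = new MySqlDataAdapter("SELECT * FROM testTable", connection))
using (var builder = new MySqlCommandBuilder(adapter))  // generates the INSERT command
{
    adapter.SelectCommand.Transaction = transaction;
    adapter.UpdateBatchSize = 500;  // placeholder; several sizes made little difference
    adapter.Update(rawData);        // inserts every row marked as Added
    transaction.Commit();
}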
By contrast, the same test using MySqlBulkLoader took only 5 or 6 seconds to run, including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to this), bulk-load from that file, and delete the file afterwards.
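A minimal sketch of that approach, assuming an open MySqlConnection named connection, a DataTable named rawData, and a target table testTable (all placeholder names); note that recent connector versions also require AllowLoadLocalInfile=true in the connection string when Local is set:

using System;
using System.Data;
using System.IO;
using System.Linq;
using MySql.Data.MySqlClient;

string tempCsv = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N") + ".csv");
try
{
    // Dump every row of the DataTable to a temporary CSV file,
    // quoting each field and doubling embedded quotes.
    using (var writer = new StreamWriter(tempCsv))
    {
        foreach (DataRow row in rawData.Rows)
        {
            var fields = row.ItemArray.Select(
                field => "\"" + Convert.ToString(field).Replace("\"", "\"\"") + "\"");
            writer.WriteLine(string.Join(",", fields));
        }
    }

    // Bulk-load the CSV into the destination table.
    var loader = new MySqlBulkLoader(connection)
    {
        TableName = "testTable",
        FileName = tempCsv,
        FieldTerminator = ",",
        FieldQuotationCharacter = '"',
        LineTerminator = Environment.NewLine,
        NumberOfLinesToSkip = 0,
        Local = true  // issues LOAD DATA LOCAL INFILE
    };
    loader.Load();
}
finally
{
    File.Delete(tempCsv);  // delete the temporary file afterwards
}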
Using the Z.BulkOperations NuGet package, you can easily get this done.
Here is an example using the package from https://www.nuget.org/packages/Z.BulkOperations/2.14.3/
Instead of pushing the rows through a MySqlDataAdapter with batched updates, just two lines using the BulkOperation class (sketched below) will take only 5 seconds to copy the whole DataTable into MySQL, without first dumping the 100,000 rows from the DataTable to a temporary CSV file.
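Sketched in context, assuming an open MySqlConnection named conn and a populated DataTable named dt (the DestinationTableName property and the BulkInsert overload are taken from the Z.BulkOperations documentation, so verify them against the package version you install):

using System.Data;
using MySql.Data.MySqlClient;
using Z.BulkOperations;

// conn: an open MySqlConnection; dt: the DataTable to copy (both assumed).
var bulk = new BulkOperation(conn)
{
    DestinationTableName = "testTable"  // assumed target table
};
bulk.BulkInsert(dt);  // copies the whole DataTable in one bulk operation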