I am reading from a CSV file and sending the data as a table variable to a stored procedure. From what I have tested so far, I am able to process 300k records in 3 minutes 30 seconds. The file may contain up to millions of records as we go. I wanted to know if it's a good idea to send all these records to the stored procedure in one go, or should I send them in batches of, say, 500k? I have set the command timeout to 1800.
An example of using IEnumerable<SqlDataRecord> to pass the rows as a table-valued parameter.
It works kind of like a reverse DataReader: you yield rows to SQL Server instead of reading rows from it.
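A minimal sketch of the pattern, assuming a hypothetical CsvRow type, a user-defined table type dbo.RowTableType, and a stored procedure dbo.usp_LoadRows; adjust the column metadata to match your own table type:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using Microsoft.SqlServer.Server;

// Hypothetical shape of one parsed CSV row.
public class CsvRow
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class TvpLoader
{
    // Streams rows one at a time; the whole file is never buffered in a DataTable.
    public static IEnumerable<SqlDataRecord> ToSqlDataRecords(IEnumerable<CsvRow> rows)
    {
        var metaData = new[]
        {
            new SqlMetaData("Id", SqlDbType.Int),
            new SqlMetaData("Name", SqlDbType.NVarChar, 100)
        };

        // Sort by the clustered index key before inserting (this buffers the
        // sequence, but unsorted inserts fragment the index and slow the load).
        foreach (var row in rows.OrderBy(r => r.Id))
        {
            var record = new SqlDataRecord(metaData);
            record.SetInt32(0, row.Id);
            record.SetString(1, row.Name);
            yield return record; // consumed by the provider as the command executes
        }
    }

    public static void Send(string connectionString, IEnumerable<CsvRow> rows)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.usp_LoadRows", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.CommandTimeout = 1800;

            var p = cmd.Parameters.AddWithValue("@Rows", ToSqlDataRecords(rows));
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.RowTableType"; // must match the table type the proc expects

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```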
Notice that I sort, and the sort is by the clustered index. Fragmentation of the indexes will absolutely kill load speed. The first implementation used INSERT ... VALUES (unsorted), and over a 12-hour run this version is literally 100x faster. I also disable all indexes other than the PK and reindex at the end of the load; a sketch of that follows. In a long run I am getting about 500 rows/second. Your sample is 1,400/second, so that's great. If you start to see degradation, these are the things to look at.
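A hedged sketch of the disable/rebuild step; the table and index names are placeholders, and ALTER INDEX ALL ... REBUILD both re-enables the disabled index and defragments the rest:

```csharp
using System.Data.SqlClient;

public static class IndexMaintenance
{
    // Disable the nonclustered index before the load.
    // "IX_TargetTable_Name" and "dbo.TargetTable" are placeholder names.
    public static void DisableNonclusteredIndexes(SqlConnection conn)
    {
        using (var cmd = new SqlCommand(
            "ALTER INDEX IX_TargetTable_Name ON dbo.TargetTable DISABLE;", conn))
        {
            cmd.ExecuteNonQuery();
        }
    }

    // Rebuild after the load: re-enables the disabled index and removes fragmentation.
    public static void RebuildAllIndexes(SqlConnection conn)
    {
        using (var cmd = new SqlCommand(
            "ALTER INDEX ALL ON dbo.TargetTable REBUILD;", conn))
        {
            cmd.CommandTimeout = 1800; // rebuilds on large tables can take a while
            cmd.ExecuteNonQuery();
        }
    }
}
```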
Other tools to consider are the SqlBulkCopy .NET class and Dapper.
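If the target is a plain table rather than a stored procedure, SqlBulkCopy can load it directly. A minimal sketch, where the destination table name and batch size are assumptions:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class BulkCopyLoader
{
    // Loads a DataTable (an IDataReader also works) straight into a table.
    public static void Load(string connectionString, DataTable rows)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.TargetTable"; // placeholder name
            bulkCopy.BatchSize = 10000;      // rows sent per round trip to the server
            bulkCopy.BulkCopyTimeout = 1800; // matches the command timeout above
            bulkCopy.WriteToServer(rows);
        }
    }
}
```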
The OP asked how to perform the load in batches.
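One way to batch, reusing the hypothetical CsvRow type and Send method sketched above, with the 500k batch size the question suggests:

```csharp
using System.Collections.Generic;

public static class BatchedLoader
{
    // Splits the parsed CSV rows into fixed-size batches and sends each batch
    // to the stored procedure as its own table-valued parameter call.
    public static void SendInBatches(string connectionString,
                                     IEnumerable<CsvRow> rows,
                                     int batchSize = 500000)
    {
        var batch = new List<CsvRow>(batchSize);

        foreach (var row in rows)
        {
            batch.Add(row);
            if (batch.Count == batchSize)
            {
                TvpLoader.Send(connectionString, batch);
                batch.Clear();
            }
        }

        if (batch.Count > 0)
        {
            TvpLoader.Send(connectionString, batch); // final partial batch
        }
    }
}
```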