I am reading from a CSV file and sending the data as a table variable to a stored procedure. From what I have tested so far, I am able to process 300k records in 3 minutes 30 seconds. The file may contain up to millions of records as we go. I wanted to know if it's a good idea to send all these records to the stored procedure in one go, or should I send them in batches of, say, 500k? I have set the command timeout to 1800.
Answer 1:
An example of using IEnumerable<SqlDataRecord>. It works kind of like a reverse DataReader.
Notice that I sort. The sort is by the clustered index, because fragmentation of the indexes will absolutely kill load speed. The first implementation used Insert Values (unsorted), and in a 12-hour run this version is literally 100x faster. I also disable indexes other than the PK and rebuild them at the end of the load. In a long run I am getting about 500 rows/second. Your sample is 1,400 rows/second, so great. If you start to see degradation, those are the things to look at.
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.SqlServer.Server;

public class DocFTSinXsCollection : List<DocFTSinX>, IEnumerable<SqlDataRecord>
{
    // used by the TVP for fast insert
    private int sID;
    private IEnumerable<DocFTSinX> docFTSinXs;

    IEnumerator<SqlDataRecord> IEnumerable<SqlDataRecord>.GetEnumerator()
    {
        // todo: fix the column order to sID, wordID1, wordID2
        var sdr = new SqlDataRecord(
            new SqlMetaData("wordID1", System.Data.SqlDbType.Int),
            new SqlMetaData("wordID2", System.Data.SqlDbType.Int),
            new SqlMetaData("sID", System.Data.SqlDbType.Int),
            new SqlMetaData("Delta", System.Data.SqlDbType.Int));
        // sort by the clustered index (Word1, Word2) to avoid fragmentation during the load
        foreach (DocFTSinX oh in docFTSinXs.OrderBy(x => x.Word1).ThenBy(x => x.Word2))
        {
            sdr.SetInt32(0, oh.Word1);
            sdr.SetInt32(1, oh.Word2);
            sdr.SetInt32(2, sID);
            sdr.SetInt32(3, (Int32)oh.Delta);
            yield return sdr; // the same record is reused; the server consumes it before the next iteration
        }
    }

    public DocFTSinXsCollection(int SID, IEnumerable<DocFTSinX> DocFTSinXs)
    {
        sID = SID;
        docFTSinXs = DocFTSinXs;
        //Debug.WriteLine("DocFTSinXsCollection DocFTSinXs " + DocFTSinXs.Count().ToString());
    }
}
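For completeness, here is a minimal sketch of how a collection like this can be passed to a stored procedure as a table-valued parameter. The procedure name, table type name, and connection string are placeholders I made up, not part of the original answer; the Structured/TypeName pattern itself is standard ADO.NET.

// Requires: using System.Data; using System.Data.SqlClient;
// "dbo.LoadDocFTSinX" (proc) and "dbo.DocFTSinXType" (table type) are hypothetical names.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.LoadDocFTSinX", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandTimeout = 1800; // matches the timeout mentioned in the question

    var rows = cmd.Parameters.AddWithValue("@rows", new DocFTSinXsCollection(sID, docs));
    rows.SqlDbType = SqlDbType.Structured;  // stream the rows as a TVP
    rows.TypeName = "dbo.DocFTSinXType";    // user-defined table type on the server

    conn.Open();
    cmd.ExecuteNonQuery();
}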
Other tools to consider are the SqlBulkCopy .NET class and Dapper.
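For reference, a minimal SqlBulkCopy sketch, assuming the parsed CSV rows are already in a DataTable whose columns match the destination table; the table name is a placeholder:

// "dbo.DocFTSinX" is a hypothetical destination table name.
using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.DocFTSinX";
    bulk.BatchSize = 100000;       // rows sent per batch
    bulk.BulkCopyTimeout = 1800;   // seconds, mirroring the command timeout above
    bulk.WriteToServer(dataTable); // dataTable holds the parsed CSV rows
}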
The OP asked how to perform the load in batches. In outline:
while (true)
{
    // if no more rows, break
    // fill a list or DataTable with the next 100,000 rows
    // send the list or DataTable to the db
}
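Fleshed out, that loop might look like the following. ParseLine and SendToDb are hypothetical helpers (SendToDb would wrap the TVP or SqlBulkCopy call shown earlier); only the batching structure is the point here.

// Requires: using System.Collections.Generic; using System.IO;
const int batchSize = 100000;
var batch = new List<DocFTSinX>(batchSize);

using (var reader = new StreamReader(csvPath))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        batch.Add(ParseLine(line)); // hypothetical: one CSV line -> one row object
        if (batch.Count == batchSize)
        {
            SendToDb(batch);        // hypothetical: sends one batch to the database
            batch.Clear();
        }
    }
    if (batch.Count > 0)
        SendToDb(batch);            // flush the final partial batch
}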