I've got about 100 million rows that I'm moving in SSIS 2008 via a Data Flow Task. It's pretty much a straight table data copy using a Multicast. My question is this:
Using the OLE DB Destination Editor I have two options: Rows per batch and Maximum insert commit size. What are good settings for these? The only guidance I've found is to set Maximum insert commit size to 2147483647 instead of 0, and then tune both settings through testing. I'm curious whether anyone has discovered anything useful in their own tuning of these values.
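For what it's worth, my rough mental model (an assumption on my part, so correct me if it's wrong) is that these two settings correspond to the BATCHSIZE and ROWS_PER_BATCH options on a plain T-SQL bulk load. A sketch, with made-up table and file names:

    -- Rough T-SQL analogue of the two OLE DB Destination settings
    -- (dbo.Target and the file path are hypothetical)
    BULK INSERT dbo.Target
    FROM 'C:\staging\rows.dat'
    WITH (
        BATCHSIZE      = 500000,     -- ~ Maximum insert commit size:
                                     --   rows per committed transaction
        ROWS_PER_BATCH = 100000000,  -- ~ Rows per batch: a hint about the
                                     --   total row count for the optimizer
        TABLOCK                      -- fast load takes a table lock by default
    );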
There is no single best value; it depends greatly on the design of the database, the number of users, the kind of hardware you are operating on, etc. That is why you need to test for yourself on your own system.
I find this useful to guide me: Top 10 SQL Server Integration Services Best Practices
Simply because I don't use SSIS enough. However, as HLGEM said, you'll just have to give it a shot yourself and see what happens...
If you leave the settings at their defaults, all 100 million rows will be inserted before the batch is committed, which can cause your transaction log to grow very large. If you want to keep that in check, pick a number lower than the default.
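If you want to watch the effect while you test different commit sizes, something like this shows the log growing between commits (MyDb is a placeholder database name):

    -- Percent of transaction log in use, per database
    DBCC SQLPERF(LOGSPACE);

    -- Or check the log file size directly
    SELECT name, size * 8 / 1024 AS size_mb
    FROM MyDb.sys.database_files
    WHERE type_desc = 'LOG';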
I found that if the database is using the simple or bulk-logged recovery model, and you have truncated the table or dropped your indexes, and you select the fast load option, then the rows are not logged as individual inserts and the transaction log is barely impacted. In that case these settings can be left as they are, with a single commit at the end, and any index rebuilding only needs to be done once when the load finishes.
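To illustrate, the setup I'm describing looks roughly like this outside of SSIS (object names are made up, and the fast load itself happens in the package):

    -- Prerequisites for minimally logged bulk inserts
    ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;    -- or SIMPLE

    TRUNCATE TABLE dbo.Target;                       -- start from an empty table
    ALTER INDEX IX_Target_Col ON dbo.Target DISABLE; -- or drop nonclustered indexes

    -- ... run the SSIS data flow with fast load + table lock here ...

    ALTER INDEX IX_Target_Col ON dbo.Target REBUILD; -- rebuild once, at the end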