I have several tab-separated files which I would like to use to update a large number of rows. So far I have naively created a temp table, bulk-imported the data into it, and then updated the target table via a join. This generated a large transaction log. I would like to avoid that and do the update in chunks. I have found this:
http://itknowledgeexchange.techtarget.com/itanswers/bulk-update-in-sql-server-2005/
Is this the best way?
Are there even faster methods (SSIS, bcp)?
Any feedback is highly appreciated. Thanks.
Christian
PS:
My naive code looks like this. Unfortunately, i is not a primary key (the table does have one, since Bla is a child of another table).
-- Recreate the staging table for the imported file
IF OBJECT_ID('tempdb..#temp') IS NOT NULL
    DROP TABLE #temp;

CREATE TABLE #temp
(
    i int,
    xml nvarchar(max)
);
BULK INSERT #temp
...
-- Apply the staged values to the target table in one big transaction
UPDATE a
SET a.XML = b.xml
FROM Bla AS a
INNER JOIN #temp AS b ON a.i = b.i;
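
For context, the sort of batched rewrite I have in mind looks roughly like this. It is only a sketch: the batch size of 10000 is arbitrary, and it assumes the database is in SIMPLE or BULK_LOGGED recovery so the log can truncate between batches.

-- Drain #temp in chunks so each UPDATE is a small transaction
-- and the log can truncate in between (assumption: SIMPLE or
-- BULK_LOGGED recovery; 10000 is an arbitrary batch size).
CREATE TABLE #batch (i int, xml nvarchar(max));

WHILE 1 = 1
BEGIN
    -- Move one chunk of staged rows into #batch...
    DELETE TOP (10000) FROM #temp
    OUTPUT deleted.i, deleted.xml INTO #batch (i, xml);

    IF @@ROWCOUNT = 0 BREAK;

    -- ...and apply just that chunk to the target table.
    UPDATE a
    SET a.XML = b.xml
    FROM Bla AS a
    INNER JOIN #batch AS b ON a.i = b.i;

    TRUNCATE TABLE #batch;
END

Not sure whether this is faster overall than the single join, but it should keep each transaction (and the log growth) bounded.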