Update query on millions of rows fills the transaction log

Posted 2019-08-25 12:10

I need to update millions of rows as part of my next release, but doing so fills the transaction log and the update fails. I have a few ideas, but I'm not a SQL expert, so I'm sure there are gotchas I'm not aware of.

Pertinent points:

  1. I need to hand a script over to the operations team, so it must be pure T-SQL with no manual intervention.
  2. Apparently the transaction log gets recycled every 15 minutes. (I've considered a loop with a TRY...CATCH that does WAITFOR DELAY '00:15:00' in the CATCH block, as in the script below.)
  3. (EDIT) I can't modify anything except the data.
  4. (EDIT) It's a simple update changing a foreign key column to a different existing key.

Thanks,

Phil

DECLARE
    @AffectedRows int,
    @RowsToUpdate int  -- total number of rows that need the change

SET @AffectedRows = 0
-- SET @RowsToUpdate = ...  (e.g. a COUNT(*) of the rows to be changed)

WHILE @AffectedRows < @RowsToUpdate
BEGIN
    BEGIN TRY
        BEGIN TRAN
        -- Do some updates
        SET @AffectedRows = @AffectedRows + @@ROWCOUNT
        COMMIT TRAN
    END TRY
    BEGIN CATCH
        -- Roll back the failed batch so an open transaction isn't left behind
        IF @@TRANCOUNT > 0 ROLLBACK TRAN
        PRINT ERROR_MESSAGE()
        WAITFOR DELAY '00:15:00'  -- wait for the log to be recycled
    END CATCH
END

PRINT @AffectedRows

3 Answers
趁早两清
#2 · 2019-08-25 12:23

You're reinventing nibbling deletes/updates :)

Take a look at this approach; you can process bigger blocks than a single row:

http://www.sqlservervideos.com/video/nibbling-deletes/

http://sqladvice.com/blogs/repeatableread/archive/2005/09/20/12795.aspx
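
A minimal sketch of that nibbling pattern, assuming a hypothetical dbo.Orders table whose CustomerId foreign key is being moved from one existing key to another (the names and key values are placeholders, not from the question):

DECLARE @BatchSize int = 10000

WHILE 1 = 1
BEGIN
    -- Each batch commits on its own, so the log space it uses can be
    -- reclaimed between batches (given frequent log backups or SIMPLE recovery)
    UPDATE TOP (@BatchSize) dbo.Orders
    SET CustomerId = 42      -- the new, existing key (placeholder value)
    WHERE CustomerId = 17    -- the old key being replaced (placeholder value)

    IF @@ROWCOUNT < @BatchSize BREAK  -- last batch was partial: we're done
END

Because the WHERE clause only matches rows that still hold the old key, each pass naturally picks up where the last one left off.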

对你真心纯属浪费
#3 · 2019-08-25 12:35

In the end, the example I had already written worked best: a transaction-log-full error gets caught in the CATCH block, and 15 minutes is long enough for the log to be recycled.

DECLARE
    @AffectedRows int,
    @RowsToUpdate int  -- total number of rows that need the change

SET @AffectedRows = 0
-- SET @RowsToUpdate = ...  (e.g. a COUNT(*) of the rows to be changed)

WHILE @AffectedRows < @RowsToUpdate
BEGIN
    BEGIN TRY
        BEGIN TRAN
        -- Do some updates
        SET @AffectedRows = @AffectedRows + @@ROWCOUNT
        COMMIT TRAN
    END TRY
    BEGIN CATCH
        -- Roll back the failed batch so an open transaction isn't left behind
        IF @@TRANCOUNT > 0 ROLLBACK TRAN
        PRINT ERROR_MESSAGE()
        WAITFOR DELAY '00:15:00'  -- wait for the log to be recycled
    END CATCH
END

PRINT @AffectedRows
在下西门庆
#4 · 2019-08-25 12:35

A few points / ideas:

  1. You can expand the transaction log to whatever size you need, so the update doesn't fill it.
  2. If the transaction log grows too much, you can always back up the database and truncate the log.
  3. You can work through the data in batches (say, a million rows at a time).
  4. You can copy the data to a working table and then swap it in with sp_rename once the processing is done (see the sketch below).
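
A rough sketch of the working-table swap in point 4, again with placeholder names and key values (dbo.Orders, CustomerId); SELECT ... INTO is minimally logged under the SIMPLE or BULK_LOGGED recovery model, which is what keeps the transaction log small:

-- Build the corrected copy outside the live table (placeholder names/values)
SELECT Id,
       CASE WHEN CustomerId = 17 THEN 42 ELSE CustomerId END AS CustomerId,
       OrderDate
INTO   dbo.Orders_Staging
FROM   dbo.Orders

-- Recreate indexes, constraints, triggers and permissions on
-- dbo.Orders_Staging, then swap the tables over in one short transaction
BEGIN TRAN
EXEC sp_rename 'dbo.Orders', 'Orders_Old'
EXEC sp_rename 'dbo.Orders_Staging', 'Orders'
COMMIT TRAN

-- DROP TABLE dbo.Orders_Old once the result has been verified

The rename itself is a quick metadata operation, but SELECT ... INTO copies only the data, so indexes, constraints, triggers and permissions have to be recreated on the staging table before the swap.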