I'm writing a process that archives rows from a SQL Server table based on a datetime column. I want to move all the rows with a date before X, but the problem is that there are millions of rows for each date, so doing a BEGIN TRANSACTION...INSERT...DELETE...COMMIT for each date takes too long, and locks up the database for other users.
Is there a way that I can do it in smaller chunks? Maybe using ROWCOUNT or something like that?
I'd originally considered something like this:
SET ROWCOUNT 1000
DECLARE @RowsLeft DATETIME
DECLARE @ArchiveDate DATETIME
SET @RowsLeft = (SELECT TOP 1 dtcol FROM Events WHERE dtcol <= @ArchiveDate)
WHILE @RowsLeft IS NOT NULL
BEGIN
    -- ROWCOUNT caps each statement at 1000 rows
    INSERT INTO EventsBackups
    SELECT TOP 1000 * FROM Events WHERE dtcol <= @ArchiveDate
    DELETE Events WHERE dtcol <= @ArchiveDate
    SET @RowsLeft = (SELECT TOP 1 dtcol FROM Events WHERE dtcol <= @ArchiveDate)
END
But then I realized that I can't guarantee that the rows I'm deleting are the ones I just backed up. Or can I...?
UPDATE: Another option I'd considered was adding a staging step (see the sketch after the list):
1. SELECT TOP 1000 rows that meet my date criteria into a temp table
2. Begin transaction
3. Insert from the temp table into the archive table
4. Delete from the source table, joining to the temp table across every column
5. Commit transaction
6. Repeat steps 1-5 until no rows remain that meet the date criteria
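A minimal sketch of that staging loop, assuming the Events/EventsBackups tables and dtcol column from above (the cutoff value is illustrative, and the delete join shown on dtcol is a placeholder for joining across every column):

DECLARE @ArchiveDate DATETIME
SET @ArchiveDate = '20080101' -- illustrative cutoff

-- Empty staging table with the same shape as Events
-- (if Events has an IDENTITY column, the staging INSERT below
-- would need IDENTITY_INSERT handling)
SELECT TOP 0 * INTO #Batch FROM Events

DECLARE @Rows INT
SET @Rows = 1

WHILE @Rows > 0
BEGIN
    -- Step 1: stage up to 1000 candidate rows
    INSERT INTO #Batch
    SELECT TOP 1000 * FROM Events WHERE dtcol <= @ArchiveDate

    SET @Rows = @@ROWCOUNT

    IF @Rows > 0
    BEGIN
        -- Steps 2-5: archive and delete exactly the staged rows
        BEGIN TRANSACTION
            INSERT INTO EventsBackups
            SELECT * FROM #Batch

            DELETE e
            FROM Events AS e
            INNER JOIN #Batch AS b
                ON e.dtcol = b.dtcol -- placeholder: in reality, join across
                                     -- every column (or a primary key)
        COMMIT TRANSACTION
    END

    TRUNCATE TABLE #Batch
END

DROP TABLE #Batch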
Does anybody have an idea for how the expense of this series might compare to some of the other options discussed below?
DETAIL: I'm using SQL 2005, since somebody asked.
How about:
Here's what I ended up doing:
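In outline, it's a sketch like this (the cutoff value is illustrative): find the dtcol value of roughly the 1000th qualifying row, then move everything up to and including that value.

DECLARE @ArchiveDate DATETIME
SET @ArchiveDate = '20080101' -- illustrative cutoff

DECLARE @BatchDate DATETIME

-- Boundary value: the dtcol of roughly the 1000th qualifying row
SELECT @BatchDate = MAX(dtcol)
FROM (SELECT TOP 1000 dtcol
      FROM Events
      WHERE dtcol <= @ArchiveDate
      ORDER BY dtcol) AS d

WHILE @BatchDate IS NOT NULL
BEGIN
    BEGIN TRANSACTION
        -- Move everything up to and including the boundary; repeats
        -- on dtcol ride along, so each batch is 1000ish
        INSERT INTO EventsBackups
        SELECT * FROM Events WHERE dtcol <= @BatchDate

        DELETE Events WHERE dtcol <= @BatchDate
    COMMIT TRANSACTION

    -- Returns NULL once no qualifying rows remain, ending the loop
    SELECT @BatchDate = MAX(dtcol)
    FROM (SELECT TOP 1000 dtcol
          FROM Events
          WHERE dtcol <= @ArchiveDate
          ORDER BY dtcol) AS d
END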
I'm not pulling exactly 1000, just 1000ish, so it handles repeats in the time column appropriately (something I worried about when I considered using ROWCOUNT). Since there are often repeats in the time column, I see it regularly move 1002 or 1004 rows/iteration, so I know it's getting everything.
I'm submitting this as an answer so it can be judged up against the other solutions people have provided. Let me know if there's something obviously wrong with this method. Thanks for your help, everybody, and I'll accept whichever answer has the most votes in a few days.
Have you got an index on the date field? If you haven't, SQL Server may be forced to escalate to a table lock, which will lock out all your users while your archive statements run.
I think you will need an index for this operation to perform at all well! Put an index on your date field and try your operation again!
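For instance (the index name is illustrative; dtcol is the column from the question):

CREATE NONCLUSTERED INDEX IX_Events_dtcol ON Events (dtcol)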
Could you make a copy of Events, move all rows with dates >= x to that, drop Events and rename the copy Events? Or copy, truncate and then copy back? If you can afford a little downtime this would probably be the quickest approach.
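A sketch of the copy-and-swap variant, assuming a maintenance window (the cutoff value and new table name are illustrative):

DECLARE @ArchiveDate DATETIME
SET @ArchiveDate = '20080101' -- illustrative cutoff

-- Archive the old rows, copy the keepers, then swap
INSERT INTO EventsBackups
SELECT * FROM Events WHERE dtcol <= @ArchiveDate

SELECT * INTO EventsKeep
FROM Events
WHERE dtcol > @ArchiveDate

DROP TABLE Events
EXEC sp_rename 'EventsKeep', 'Events'
-- Note: SELECT INTO doesn't copy indexes, constraints, or triggers;
-- they'd need to be recreated on the renamed table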
Another option would be to add a trigger to the Events table that does nothing but add the same record to the EventsBackups table.
That way EventsBackups is always up to date, and all you do is periodically purge records from your Events table.
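A minimal sketch of such a trigger, assuming EventsBackups has the same column layout as Events:

CREATE TRIGGER trg_Events_ToBackup
ON Events
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON
    -- Mirror every newly inserted Events row into the backup table
    INSERT INTO EventsBackups
    SELECT * FROM inserted
END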
Just INSERT the result of the DELETE:
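For example, using the OUTPUT clause (available in SQL Server 2005; the chunking loop and cutoff value are illustrative, and the target table can't have triggers on it):

DECLARE @ArchiveDate DATETIME
SET @ArchiveDate = '20080101' -- illustrative cutoff

WHILE 1 = 1
BEGIN
    -- Delete a chunk and archive the deleted rows in one statement
    DELETE TOP (1000) FROM Events
    OUTPUT DELETED.* INTO EventsBackups
    WHERE dtcol <= @ArchiveDate

    IF @@ROWCOUNT = 0 BREAK
END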
This is atomic and consistent: the rows that land in EventsBackups are exactly the rows deleted, in a single statement.