Long story short: I am testing a system that purges entries from a table over a network connection, and the functionality is expected to handle up to 2 billion entries. I need to stress test this to be certain.
Here's my test script (at best it generates about 9.8 million rows in ten minutes):
DECLARE @I int = 1;
-- Inserts 2 billion rows one at a time; each INSERT runs as its own
-- transaction, which is why this approach is slow.
WHILE @I <= 2000000000
BEGIN
    INSERT INTO "Table here" VALUES (@I);
    SET @I = @I + 1;
END;
Can anyone suggest anything, or give me an idea what the upper limits of my test environment might be in this situation?
Below is a set-based method that uses CROSS JOIN to generate rows in batches of 10 million. It loaded 2 billion rows in about 6 minutes on my desktop machine.
SET NOCOUNT ON;

DECLARE
      @TargetRowCount int = 2000000000
    , @RowsInserted   int = 0;

WHILE @RowsInserted < @TargetRowCount
BEGIN
    WITH
         t10  AS (SELECT n FROM (VALUES(0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) t(n)) -- 10 rows
        ,t1k  AS (SELECT 0 AS n FROM t10 AS a CROSS JOIN t10 AS b CROSS JOIN t10 AS c) -- 10^3 rows
        ,t10m AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT 0)) AS num
                  FROM t1k AS a CROSS JOIN t1k AS b CROSS JOIN t10 AS c)               -- 10^7 rows
    INSERT INTO dbo."table here" WITH (TABLOCKX) -- exclusive table lock avoids per-row locking overhead
    SELECT num + @RowsInserted -- continue numbering from the previous batch
    FROM t10m;

    SET @RowsInserted += @@ROWCOUNT;
    -- Severity 0 with NOWAIT flushes progress messages immediately.
    RAISERROR('%d of %d rows inserted', 0, 0, @RowsInserted, @TargetRowCount) WITH NOWAIT;
END;
GO
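For reference, the batch loader above assumes a single int column, and 2 billion fits within int's range (max 2,147,483,647). A minimal sketch of such a target table (the name `dbo.PurgeTest` is a placeholder, not from the original post) and a fast post-load sanity check:

```sql
-- Hypothetical target table; a heap with no indexes keeps the bulk insert fast.
CREATE TABLE dbo.PurgeTest (Id int NOT NULL);

-- After loading, read the row count from catalog metadata
-- instead of scanning 2 billion rows with COUNT(*).
SELECT SUM(p.rows) AS TotalRows
FROM sys.partitions AS p
WHERE p.object_id = OBJECT_ID('dbo.PurgeTest')
  AND p.index_id IN (0, 1); -- 0 = heap, 1 = clustered index
```

If you need a clustered index for the purge tests, consider adding it after the load finishes, so the inserts don't pay index-maintenance costs.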