I am creating a script that will be run on a Microsoft SQL Server. The script will run multiple statements and needs to be transactional: if one of the statements fails, the overall execution stops and any changes are rolled back.
I am having trouble building this transactional model when issuing ALTER TABLE statements to add columns to a table and then updating the newly added column. In order to access the newly added column right away, I use a GO command to execute the ALTER TABLE statement before calling my UPDATE statement. The problem I am facing is that I cannot issue a GO command inside an IF statement, and the IF statement is important to my transactional model. Below is a sample of the script I am trying to run. Also note that issuing a GO command discards the @errorCode variable, so it would have to be declared again further down in the script before being used (this is not shown in the code below).
BEGIN TRANSACTION

DECLARE @errorCode INT
SET @errorCode = @@ERROR

-- **********************************
-- * Settings
-- **********************************

IF @errorCode = 0
BEGIN
    BEGIN TRY
        ALTER TABLE Color ADD [CodeID] [uniqueidentifier] NOT NULL DEFAULT ('{00000000-0000-0000-0000-000000000000}')
        GO
    END TRY
    BEGIN CATCH
        SET @errorCode = @@ERROR
    END CATCH
END

IF @errorCode = 0
BEGIN
    BEGIN TRY
        UPDATE Color
        SET CodeID = 'B6D266DC-B305-4153-A7AB-9109962255FC'
        WHERE [Name] = 'Red'
    END TRY
    BEGIN CATCH
        SET @errorCode = @@ERROR
    END CATCH
END

-- **********************************
-- * Check @errorCode to issue a COMMIT or a ROLLBACK
-- **********************************

IF @errorCode = 0
BEGIN
    COMMIT
    PRINT 'Success'
END
ELSE
BEGIN
    ROLLBACK
    PRINT 'Failure'
END
So what I would like to know is how to work around this problem: issuing ALTER TABLE statements to add a column and then updating that column, all within a script that executes as a single transactional unit.
GO is not a T-SQL command. It is a batch delimiter. The client tool (SSMS, sqlcmd, osql, etc.) uses it to effectively cut the file at each GO and send the individual batches to the server. So obviously you cannot use GO inside an IF, nor can you expect variables to span scope across batches.
Also, you cannot catch exceptions without checking XACT_STATE() to ensure the transaction is not doomed. And using GUIDs for IDs is always at least suspicious.
Using NOT NULL constraints and providing a default 'guid' like '{00000000-0000-0000-0000-000000000000}' also cannot be correct.

Updated: use SET XACT_ABORT ON to force an error to interrupt the batch. This is frequently used in maintenance scripts (schema changes). Stored procedures and application logic scripts in general use TRY-CATCH blocks instead, but with proper care: see Exception handling and nested transactions. Example script:
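A sketch of such a script, based on the table and column from the question (the NULL-able column definition and the COLUMNPROPERTY existence check are illustrative; adjust to your schema):

```sql
SET XACT_ABORT ON;
GO
BEGIN TRANSACTION;
GO
-- Add the column only if it does not already exist
IF COLUMNPROPERTY(OBJECT_ID('Color'), 'CodeID', 'ColumnId') IS NULL
BEGIN
    ALTER TABLE Color ADD [CodeID] uniqueidentifier NULL;
END
GO
-- Runs in a new batch, so the newly added column is visible here
UPDATE Color
SET CodeID = 'B6D266DC-B305-4153-A7AB-9109962255FC'
WHERE [Name] = 'Red';
GO
COMMIT;
PRINT 'Success';
GO
```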
Only a successful script will reach the COMMIT. Any error will abort the script and roll back. I used COLUMNPROPERTY to check for column existence; you could use any method you like instead (e.g. look up sys.columns).

Orthogonal to Remus's comments, what you can do is execute the UPDATE through sp_executesql.
We've needed to do this when creating upgrade scripts. Usually we just use GO but it has been necessary to do things conditionally.
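A minimal sketch of that approach, reusing the question's table (the parameter types are assumptions; adjust them to your schema):

```sql
ALTER TABLE Color ADD [CodeID] uniqueidentifier NULL;

-- The dynamic SQL is compiled in its own scope, after the
-- ALTER TABLE above has taken effect, so the new column resolves.
EXEC sp_executesql
    N'UPDATE Color
      SET CodeID = @codeId
      WHERE [Name] = @name',
    N'@codeId uniqueidentifier, @name nvarchar(50)',
    @codeId = 'B6D266DC-B305-4153-A7AB-9109962255FC',
    @name = N'Red';
```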
Another alternative, if you don't want to split the code into separate batches, is to use EXEC to create a nested scope/batch, as here.
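For instance (a sketch using the question's table; the string passed to EXEC is compiled as its own nested batch):

```sql
ALTER TABLE Color ADD [CodeID] uniqueidentifier NULL;

-- The EXEC string is compiled only when it runs, so it can
-- reference the column added above within the same outer batch.
EXEC ('UPDATE Color
       SET CodeID = ''B6D266DC-B305-4153-A7AB-9109962255FC''
       WHERE [Name] = ''Red''');
```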
Have you tried it without the GO?
Normally you should not mix table changes and data changes in the same script.
I think you can use a ";" to terminate and execute each individual command, rather than GO.
Note that GO is not part of Transact-SQL:
http://msdn.microsoft.com/en-us/library/ms188037.aspx
I almost agree with Remus, but you can do this with SET XACT_ABORT ON and XACT_STATE. Basically: set XACT_ABORT ON, wrap the whole script in one transaction, and check XACT_STATE() at the end to decide whether to COMMIT or ROLLBACK. Tools like Red Gate SQL Compare use this technique.
Something like:
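One way that pattern could look, using the question's table and column (a sketch, not a full tool-generated script):

```sql
SET XACT_ABORT ON;
GO
BEGIN TRANSACTION;
GO
ALTER TABLE Color ADD [CodeID] uniqueidentifier NULL;
GO
UPDATE Color
SET CodeID = 'B6D266DC-B305-4153-A7AB-9109962255FC'
WHERE [Name] = 'Red';
GO
-- XACT_STATE() = 1: committable; 0: no transaction (already
-- rolled back by XACT_ABORT); -1: doomed, must roll back.
IF XACT_STATE() = 1
BEGIN
    COMMIT;
    PRINT 'Success';
END
ELSE
BEGIN
    IF XACT_STATE() <> 0 ROLLBACK;
    PRINT 'Failure';
END
GO
```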
I've also removed the default. No value = NULL for GUID values. A GUID is meant to be unique: don't try to set every row to all zeros, because it will end in tears...