I am aware of COLUMNS_UPDATED(), but I need a quick shortcut. (I am already writing one myself, but if anyone has one already and can save me the time, I will appreciate it.)
Basically I need an XML document containing only the updated column values; I need this for replication purposes.
SELECT * FROM inserted gives me every column, but I need only the updated ones.
Something like the following:
CREATE TRIGGER DBCustomers_Insert
ON DBCustomers
AFTER UPDATE
AS
BEGIN
    -- Dynamic SQL runs in its own scope and cannot see the inserted pseudo-table,
    -- so copy it into a temp table first.
    SELECT * INTO #inserted FROM inserted;

    DECLARE @sql AS NVARCHAR(MAX);
    SET @sql = 'SELECT ';

    -- I NEED HELP FOR THE FOLLOWING PART: I can write every column manually,
    -- but I need an automated routine that works regardless of the column specification.
    -- For each column: if it was modified, append ',' + column name to @sql.

    SET @sql = @sql + ' FROM #inserted FOR XML RAW';

    DECLARE @x AS XML;
    -- EXEC() cannot be cast directly; capture the XML through an OUTPUT parameter instead.
    SET @sql = N'SET @result = (' + @sql + N')';
    EXEC sp_executesql @sql, N'@result XML OUTPUT', @result = @x OUTPUT;

    -- ... use @x
END
The code below works for over 64 columns and logs only the updated columns. Follow the instructions in the comments and all should be well.
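A rough sketch of that kind of trigger follows; the log table dbo.ColumnUpdateLog and its columns are assumptions made purely for illustration, while DBCustomers is the table from the question:

CREATE TRIGGER DBCustomers_LogUpdatedColumns
ON DBCustomers
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Log one row per updated column. COLUMNS_UPDATED() returns one byte per
    -- eight columns (least significant bit first), so the bitmask is inspected
    -- byte by byte instead of being cast to an integer, and wide tables do not overflow.
    INSERT INTO dbo.ColumnUpdateLog (TableName, ColumnName, UpdatedAt)
    SELECT 'DBCustomers', c.name, GETDATE()
    FROM sys.columns AS c
    WHERE c.object_id = OBJECT_ID('dbo.DBCustomers')
      AND SUBSTRING(COLUMNS_UPDATED(), (c.column_id - 1) / 8 + 1, 1)
          & POWER(2, (c.column_id - 1) % 8) <> 0;
END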
Inside the trigger, you can use COLUMNS_UPDATED() to find the updated columns. But that kind of snippet fails when you have a table with more than 62 columns: arithmetic overflow.
Here is the final version, which handles more than 62 columns but gives only the numbers of the updated columns. It's easy to join with syscolumns to get the names.
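In outline, run inside the trigger, it amounts to something like this sketch (master.dbo.spt_values is used here only as a convenient numbers table, and dbo.DBCustomers comes from the question):

-- Sketch: enumerate the bit positions of COLUMNS_UPDATED() and return the
-- ordinal numbers of the columns whose bits are set; the bitmask is read byte
-- by byte, so there is no arithmetic overflow however wide the table is.
SELECT n.number AS colid
FROM master.dbo.spt_values AS n
WHERE n.type = 'P'
  AND n.number BETWEEN 1 AND DATALENGTH(COLUMNS_UPDATED()) * 8
  AND SUBSTRING(COLUMNS_UPDATED(), (n.number - 1) / 8 + 1, 1)
      & POWER(2, (n.number - 1) % 8) <> 0;
-- Join the result to syscolumns (ON colid) to translate the numbers into names.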
The sample code provided by Rick lacks handling for multi-row updates, so let me enhance Rick's version as below:
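The exact code aside, the essential point is that inserted and deleted can contain many rows, so the trigger must join them on the primary key instead of assuming a single row. Roughly, with CustomerID standing in for the real key:

-- Sketch: drive all per-row work (logging, XML building, column comparison)
-- from a join of inserted and deleted, so every updated row is covered.
SELECT i.CustomerID
FROM inserted AS i
INNER JOIN deleted AS d
    ON d.CustomerID = i.CustomerID;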
The only way that occurs to me to accomplish this without hard-coding column names would be to dump the contents of the deleted table into a temp table, then build a query based on the table definition that compares the temp table with the actual table, and return a delimited column list based on whether or not they match. Admittedly, the approach below is elaborate.
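A sketch of that approach, assuming the trigger sits on dbo.DBCustomers with primary key CustomerID (both names are placeholders here):

-- Sketch: copy the old values into a temp table, then build a comparison query
-- from the table definition and return a comma-delimited list of changed columns.
SELECT * INTO #old FROM deleted;

DECLARE @cmp NVARCHAR(MAX) = N'';

-- One CASE per column: emit the column name whenever the old and new values differ
-- (SELECT ... EXCEPT SELECT also treats NULLs correctly). Columns with
-- non-comparable types such as xml would need to be filtered out.
SELECT @cmp = @cmp
    + N' + CASE WHEN EXISTS (SELECT o.' + QUOTENAME(c.name)
    + N' EXCEPT SELECT n.' + QUOTENAME(c.name) + N')'
    + N' THEN ''' + c.name + N','' ELSE '''' END'
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID('dbo.DBCustomers');

DECLARE @sql NVARCHAR(MAX) =
    N'SELECT o.CustomerID, ChangedColumns = '''' ' + @cmp +
    N' FROM #old AS o JOIN dbo.DBCustomers AS n ON n.CustomerID = o.CustomerID;';

EXEC sp_executesql @sql;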
I've another, completely different solution that doesn't use COLUMNS_UPDATED at all, nor does it rely on building dynamic SQL at runtime. (You might want to use dynamic SQL at design time, but that's another story.)
Basically you start with the inserted and deleted tables, unpivot each of them so you are just left with the unique key, field value and field name columns for each. Then you join the two and filter for anything that's changed.
Here is a full working example, including some test calls to show what is logged.
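In outline it looks like the sketch below; CustomerID, Name, Email and the audit table dbo.ChangeLog are placeholders for illustration, and the real example would list the actual columns and include the test calls:

-- Sketch: unpivot inserted and deleted to (key, field name, field value) rows,
-- join them on key + field name, and keep only the values that actually changed.
INSERT INTO dbo.ChangeLog (CustomerID, FieldName, OldValue, NewValue)
SELECT i.CustomerID, i.FieldName, d.FieldValue, i.FieldValue
FROM
(
    SELECT CustomerID, FieldName, FieldValue
    FROM (SELECT CustomerID,
                 CAST(Name  AS sql_variant) AS Name,
                 CAST(Email AS sql_variant) AS Email
          FROM inserted) AS src
    UNPIVOT (FieldValue FOR FieldName IN (Name, Email)) AS up
) AS i
JOIN
(
    SELECT CustomerID, FieldName, FieldValue
    FROM (SELECT CustomerID,
                 CAST(Name  AS sql_variant) AS Name,
                 CAST(Email AS sql_variant) AS Email
          FROM deleted) AS src
    UNPIVOT (FieldValue FOR FieldName IN (Name, Email)) AS up
) AS d
    ON d.CustomerID = i.CustomerID
   AND d.FieldName  = i.FieldName
WHERE i.FieldValue <> d.FieldValue;
-- Note: UNPIVOT drops NULL values, so NULL-to-value changes need extra handling
-- (for example CROSS APPLY (VALUES ...) instead of UNPIVOT).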
So there is no messing around with bigint bitfields and arithmetic overflow problems. If you know the columns you want to compare at design time, you don't need any dynamic SQL at all.
On the downside, the output is in a different format and all the field values are converted to sql_variant. The first could be fixed by pivoting the output again, and the second by casting back to the required types based on your knowledge of the table's design, but both would require some fairly complex dynamic SQL. Neither may be an issue for your XML output, though. This question does something similar to get the output back in the same format.
Edit: Reviewing the comments below, if you have a natural primary key that could change then you can still use this method. You just need to add a column that is populated by default with a GUID using the NEWID() function. You then use this column in place of the primary key.
You may want to add an index on this column, but since the inserted and deleted tables in a trigger are held in memory, the index might not get used and could even have a negative effect on performance.
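For example (the column and constraint names here are illustrative):

-- Sketch: a surrogate row identifier, populated with a GUID by default, that the
-- trigger can use to match inserted to deleted even if the natural key changes.
ALTER TABLE dbo.DBCustomers
    ADD RowGuid UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT DF_DBCustomers_RowGuid DEFAULT NEWID();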
I've done it as a simple "one-liner", without pivots, loops, or lots of variables that would make it look like procedural programming. SQL should be used to process data sets :-). The solution (sketched after the points below):
It uses COLUMNS_UPDATED and takes care of more than eight columns - it handles as many columns as you want.
It takes care of proper column order, which should be obtained using COLUMNPROPERTY.
It is based on the COLUMNS view, so it can include or exclude specific columns.
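A sketch of that kind of query, run inside the trigger (dbo.DBCustomers is the table from the question):

SELECT c.COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS AS c
-- COLUMNPROPERTY gives the column's ordinal as COLUMNS_UPDATED() sees it.
CROSS APPLY (SELECT COLUMNPROPERTY(OBJECT_ID(c.TABLE_SCHEMA + '.' + c.TABLE_NAME),
                                   c.COLUMN_NAME, 'ColumnId') AS ColumnId) AS p
WHERE c.TABLE_SCHEMA = 'dbo'
  AND c.TABLE_NAME   = 'DBCustomers'
  -- Byte-wise test of the bitmask: works for any number of columns, and the
  -- WHERE clause on the COLUMNS view is the place to include or exclude columns.
  AND SUBSTRING(COLUMNS_UPDATED(), (p.ColumnId - 1) / 8 + 1, 1)
      & POWER(2, (p.ColumnId - 1) % 8) <> 0;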