Need to alter column types in production database

Question:

I need help writing a TSQL script to modify two columns' data type.

We are changing two columns:

  • uniqueidentifier -> varchar(36) (has a primary key constraint)
  • xml -> nvarchar(4000)

My main concern is production deployment of the script...

The table is actively used by a public website that gets thousands of hits per hour. Consequently, we need the script to run quickly, without affecting service on the front end. Also, we need to be able to automatically rollback the transaction if an error occurs.

Fortunately, the table only contains about 25 rows, so I am guessing the update will be quick.

This database is SQL Server 2005.

(FYI - the type changes are required because of a 3rd-party tool which is not compatible with SQL Server's xml and uniqueidentifier types. We've already tested the change in dev and there are no functional issues with the change.)
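
For reference, the rollback-on-error shape I have in mind is SQL Server 2005's TRY/CATCH; the actual type-change statements (the part I need help with) would go inside the transaction:

    -- Rough skeleton of the rollback-on-error handling (SQL Server 2005 TRY/CATCH);
    -- the actual type-change statements would go where the comment is.
    BEGIN TRY
        BEGIN TRANSACTION;

        -- ... column type changes here ...

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;

        DECLARE @msg nvarchar(2048);
        SET @msg = ERROR_MESSAGE();
        RAISERROR (@msg, 16, 1);   -- surface the original error to the caller
    END CATCH;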

Answer 1:

As David said, executing a script against a production database without taking a backup or stopping the site is not the best idea. That said, if you only need to change one table with a small number of rows, you can prepare a script that does the following (sketched below the list):

  • Begin a transaction
  • Create a new table with the final structure you want
  • Copy the data from the original table to the new table
  • Rename the old table to, for example, original_name_old
  • Rename the new table to original_table_name
  • Commit the transaction
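
A minimal sketch of that script, assuming a hypothetical table dbo.Sessions with a uniqueidentifier primary key SessionId and an xml column Payload (adjust names, nullability, and defaults to match your real table):

    SET XACT_ABORT ON;  -- any run-time error rolls back the whole transaction
    BEGIN TRANSACTION;

        -- 1. New table with the target structure
        CREATE TABLE dbo.Sessions_new
        (
            SessionId varchar(36)    NOT NULL CONSTRAINT PK_Sessions_new PRIMARY KEY,
            Payload   nvarchar(4000) NULL
        );

        -- 2. Copy the data, converting the types explicitly
        --    (assumes every Payload value fits in 4000 characters)
        INSERT INTO dbo.Sessions_new (SessionId, Payload)
        SELECT CONVERT(varchar(36), SessionId),
               CONVERT(nvarchar(4000), Payload)
        FROM dbo.Sessions;

        -- 3. Keep the original table under a backup name
        EXEC sp_rename 'dbo.Sessions', 'Sessions_old';

        -- 4. Give the new table the original name
        EXEC sp_rename 'dbo.Sessions_new', 'Sessions';

    COMMIT TRANSACTION;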

This leaves you with a table that has the original name but the new structure, and you keep the original table under a backup name. If you need to roll back the change, a simple script that drops the new table and renames the original one back restores the old state.
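
With the same hypothetical names as above, the rollback script is just:

    -- Put the original table back and discard the converted copy
    BEGIN TRANSACTION;
        DROP TABLE dbo.Sessions;                      -- the new varchar/nvarchar version
        EXEC sp_rename 'dbo.Sessions_old', 'Sessions';
    COMMIT TRANSACTION;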

If the table has foreign keys the script will be a little more complicated, but it is still possible without much work.
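
For example, if a hypothetical dbo.Orders table referenced the primary key, its foreign key would have to be dropped before the swap and re-created afterwards (with the referencing column converted to varchar(36) as well):

    -- Before the swap
    ALTER TABLE dbo.Orders DROP CONSTRAINT FK_Orders_Sessions;

    -- ... perform the copy/rename steps above ...

    -- After the swap (dbo.Orders.SessionId must also be varchar(36) by now)
    ALTER TABLE dbo.Orders
        ADD CONSTRAINT FK_Orders_Sessions
        FOREIGN KEY (SessionId) REFERENCES dbo.Sessions (SessionId);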



Answer 2:

"Consequently, we need the script to run quickly, without affecting service on the front end."

This is just an opinion, but it's based on experience: that's a bad idea. It's better to have a short scheduled downtime (pre-announced if possible) than to take the risk.

The only exception is if you really don't care if the data in these tables gets corrupted, and you can be down for an extended period.

In this situation, based on the types of changes you're making and the testing you've already performed, the risk sounds minimal: you've tested the changes and you SHOULD be able to do it safely, but nothing is guaranteed.

First, you need a fall-back plan in case something goes wrong. The short version of a MINIMAL reasonable plan would include the following (a sketch of the backup and integrity-check steps follows the list):

  • Shut down the website
  • Make a backup of the database
  • Run your script
  • Test the database for integrity
  • Bring the website back online
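
For the backup and integrity-check steps, something along these lines would do (the database name and backup path are placeholders):

    -- Full backup before the change
    BACKUP DATABASE MyDatabase
        TO DISK = N'D:\Backups\MyDatabase_pre_change.bak'
        WITH INIT, CHECKSUM;

    -- After the change script has run
    DBCC CHECKDB (MyDatabase) WITH NO_INFOMSGS;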

It would be very unwise to attempt such an update while the website is live. You run the risk of being down for an extended period if something goes wrong.

A GOOD plan would also have you testing this against a copy of the database and a copy of the website (a test/staging environment) first and then taking the steps outlined above for the live server update. You have already done this. Kudos to you!

There are even better methods for making such an update, but the trade-off of downtime for safety is a no-brainer in most cases.



Answer 3:

And if you absolutely need to do this against the live database, then you might consider this:

  1) Build an offline version of the table with the new data types and copied data.
  2) Build all the required keys and indexes on the offline table.
  3) Swap the tables out in a transaction. Or you could rename the old table to something else as an emergency backup.

sp_help 'sp_rename'
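
The swap in step 3 boils down to two sp_rename calls inside a transaction (table names here are placeholders):

    BEGIN TRANSACTION;
        EXEC sp_rename 'dbo.MyTable', 'MyTable_backup';    -- keep the old table as the emergency backup
        EXEC sp_rename 'dbo.MyTable_offline', 'MyTable';   -- the offline copy becomes the live table
    COMMIT TRANSACTION;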

But TEST all of this FIRST in a prod-like environment. And make sure your backups are up to date. AND do this when you are least busy.