What are the best methods for tracking and/or automating DB schema changes? Our team uses Subversion for version control and we've been able to automate some of our tasks this way (pushing builds up to a staging server, deploying tested code to a production server) but we're still doing database updates manually. I would like to find or create a solution that allows us to work efficiently across servers with different environments while continuing to use Subversion as a backend through which code and DB updates are pushed around to various servers.
Many popular software packages include auto-update scripts which detect DB version and apply the necessary changes. Is this the best way to do this even on a larger scale (across multiple projects and sometimes multiple environments and languages)? If so, is there any existing code out there that simplifies the process or is it best just to roll our own solution? Has anyone implemented something similar before and integrated it into Subversion post-commit hooks, or is this a bad idea?
While a solution that supports multiple platforms would be preferable, we definitely need to support the Linux/Apache/MySQL/PHP stack as the majority of our work is on that platform.
I've used the following database project structure in Visual Studio for several projects and it's worked pretty well:
Database (the root project folder; its sub-folders, including a Change Scripts folder, hold the .sql scripts)
Our build system then updates the database from one version to the next by executing the scripts in a fixed order (one plausible layout and ordering is sketched below).
Each developer checks in their changes for a particular bug/feature by appending their code onto the end of each file. Once a major version is complete and branched in source control, the contents of the .sql files in the Change Scripts folder are deleted.
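For illustration only, a layout along these lines would fit that description; the folder and file names here are my own guesses, not the original poster's structure:

    Database
        Create Scripts          (objects that can be dropped and recreated: views, procedures, functions)
        Change Scripts
            1.SchemaChanges.sql
            2.DataChanges.sql
            3.Permissions.sql

The build would then run the numbered files in Change Scripts in filename order and recreate everything under Create Scripts afterwards.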
Toad for MySQL has a feature called schema compare that allows you to synchronise two databases. It is the best tool I have used so far.
IMHO migrations do have a huge problem:
Upgrading from one version to another works fine, but doing a fresh install of a given version might take forever if you have hundreds of tables and a long history of changes (like we do).
Running the whole history of deltas from the baseline up to the current version (for hundreds of customer databases) might take a very long time.
We use a very simple yet effective solution.
For new installs, we keep a metadata.sql file in the repository that holds the full DB schema; the build process then uses this file to generate the database.
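On the LAMP stack mentioned in the question, that generation step can be as simple as piping the file into the client, e.g. mysql -u someuser -p yourdb < metadata.sql (the user and database names here are placeholders).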
For updates, we hardcode the upgrade statements in the software itself. We keep them hardcoded because we don't like solving problems before they really ARE problems, and so far this kind of thing hasn't proven to be one.
So in our software we have something like this:
RegisterUpgrade(1, 'ALTER TABLE XX ADD XY CHAR(1) NOT NULL;');
This code checks whether the database is at version 1 (the current version is stored in a table that is created automatically); if the database is outdated, the command is executed.
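A minimal sketch of this pattern in Python, assuming a PEP 249 (DB-API) connection; the db_version table and the function names are illustrative, not the poster's actual code:

    # Version-gated upgrades: each upgrade is registered with the version it
    # produces, and only the ones newer than the database's version are run.
    UPGRADES = []  # (version, sql) pairs, registered in application code

    def register_upgrade(version, sql):
        UPGRADES.append((version, sql))

    # DEFAULT added so the statement also works under strict SQL modes.
    register_upgrade(1, "ALTER TABLE xx ADD xy CHAR(1) NOT NULL DEFAULT 'N'")

    def apply_upgrades(conn):
        cur = conn.cursor()
        # The version-tracking table is created automatically on first run.
        cur.execute("CREATE TABLE IF NOT EXISTS db_version (version INT NOT NULL)")
        cur.execute("SELECT MAX(version) FROM db_version")
        row = cur.fetchone()
        current = row[0] if row and row[0] is not None else 0
        for version, sql in sorted(UPGRADES):
            if version > current:  # only run upgrades the database has not seen yet
                cur.execute(sql)
                cur.execute("INSERT INTO db_version (version) VALUES (%d)" % version)
                current = version
        conn.commit()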
To update the metadata.sql in the repository, we run these upgrades locally and then extract the full database metadata.
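With MySQL, that extraction can be a schema-only dump, for example something like mysqldump --no-data yourdb > metadata.sql (the database name is a placeholder); --no-data writes the CREATE statements without any row data.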
The only thing that happens every so often is forgetting to commit metadata.sql, but this isn't a major problem: it's easy to test for in the build process, and the worst that can happen is a new install with a slightly outdated schema that gets upgraded on first use.
Also, we don't support downgrades, but that is by design: if something breaks during an update, we restore the previous version and fix the update before trying again.
If you are still looking for solutions: we offer a tool called neXtep Designer. It is a database development environment with which you can put your whole database under version control. You work on a version-controlled repository where every change can be tracked.
When you need to release an update, you can commit your components and the product will automatically generate the SQL upgrade script from the previous version. Of course, you can generate this SQL between any two versions.
Then you have many options: you can take those scripts and put them in your SVN with your app code so that they are deployed by your existing mechanism. Another option is to use neXtep's delivery mechanism: scripts are exported in something called a "delivery package" (SQL scripts plus an XML descriptor), and an installer can read this package and deploy it to a target server while ensuring structural consistency, running dependency checks, registering the installed version, and so on.
The product is GPL and is based on Eclipse, so it runs on Linux, Mac, and Windows. It currently supports Oracle, MySQL, and PostgreSQL (DB2 support is on the way). Have a look at the wiki for more detailed information: http://www.nextep-softwares.com/wiki
I create folders named after the build versions and put upgrade and downgrade scripts in there. For example, you could have the following folders: 1.0.0, 1.0.1 and 1.0.2. Each one contains the scripts that allow you to upgrade or downgrade your database between versions.
Should a client or customer call you with a problem with version 1.0.1 and you are using 1.0.2, bringing the database back to his version will not be a problem.
In your database, create a table called "schema" where you store the current version of the database. Then writing a program that can upgrade or downgrade your database for you is easy.
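A rough sketch of such a program in Python; the folder layout, file names, and the one-row "schema" table are assumptions for illustration, and conn is any PEP 249 (DB-API) connection:

    # Folder-per-version migration runner: version folders such as 1.0.0/,
    # 1.0.1/, 1.0.2/ each hold upgrade.sql and downgrade.sql, and the "schema"
    # table records the database's current version.
    import os

    def current_version(conn):
        cur = conn.cursor()
        cur.execute("SELECT version FROM `schema`")  # "schema" is reserved in MySQL, hence the backticks
        return cur.fetchone()[0]

    def run_script(cur, path):
        # Naive statement splitting; good enough for simple DDL scripts.
        with open(path) as f:
            for statement in f.read().split(";"):
                if statement.strip():
                    cur.execute(statement)

    def migrate(conn, script_root, versions, target):
        """versions is the ordered list of releases, e.g. ["1.0.0", "1.0.1", "1.0.2"]."""
        i = versions.index(current_version(conn))
        j = versions.index(target)
        cur = conn.cursor()
        if i < j:   # upgrading: apply each newer folder's upgrade.sql in order
            steps = [(versions[k], os.path.join(script_root, versions[k], "upgrade.sql"))
                     for k in range(i + 1, j + 1)]
        else:       # downgrading: apply downgrade.sql scripts in reverse order
            steps = [(versions[k - 1], os.path.join(script_root, versions[k], "downgrade.sql"))
                     for k in range(i, j, -1)]
        for new_version, path in steps:
            run_script(cur, path)
            # Versions come from our own list, not user input, so inlining is safe here.
            cur.execute("UPDATE `schema` SET version = '%s'" % new_version)
        conn.commit()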
Just like Joey said, if you are in a Rails world, use Migrations. :)