I regularly face issues when deploying my Liquibase scripts on Oracle, because I have no easy way to run them against Oracle before I deploy. They always run fine on a fresh H2 database, but when I deploy in the pipeline, I often hit basic problems.
I would like to run some quality checks on my scripts before they are deployed, typically as part of the build. Is there an easy way to do that? For example, something basic like checking that column and table names are not too long for Oracle (30 characters in older versions). Because I don't deploy to Oracle very often, I tend to forget about that limit, and I get a bad surprise at deployment time.
Thanks!
You may be ready to start looking at a commercial solution like Datical DB, which uses Liquibase internally but adds features such as forecasting: simulating the changes against an abstract in-memory database model. We also have a rules engine that can perform very complex checks on your database and your changelog. Full disclosure: I am an employee of Datical.
I found a way to parse the latest Liquibase files I create and perform some basic checks. I guess this can be extended to more advanced checks, but it is already quite useful. Here are four tests, using AssertJ for assertions.
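The original four tests are not reproduced here, but a minimal sketch of the kind of check involved (parsing the changelog XML and flagging identifiers over Oracle's 30-character limit) could look like the following. The `OracleNameCheck` class and its attribute list are illustrative, and plain JDK XML parsing stands in so the snippet has no dependencies:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class OracleNameCheck {
    // Oracle (up to 12.1) limits identifier names to 30 bytes
    static final int MAX_IDENTIFIER_LENGTH = 30;

    // Changelog attributes that carry database identifiers (illustrative list)
    static final String[] NAME_ATTRIBUTES = {
            "tableName", "columnName", "name", "constraintName", "indexName"
    };

    /** Returns every identifier in the changelog XML that exceeds the Oracle limit. */
    static List<String> tooLongNames(String changeLogXml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(changeLogXml.getBytes(StandardCharsets.UTF_8)));
        List<String> offenders = new ArrayList<>();
        NodeList elements = doc.getElementsByTagName("*");
        for (int i = 0; i < elements.getLength(); i++) {
            Element element = (Element) elements.item(i);
            for (String attribute : NAME_ATTRIBUTES) {
                String value = element.getAttribute(attribute); // "" when absent
                if (value.length() > MAX_IDENTIFIER_LENGTH) {
                    offenders.add(value);
                }
            }
        }
        return offenders;
    }
}
```

A test then simply asserts (with AssertJ or anything else) that `tooLongNames(...)` is empty for every changelog in the project; other checks such as reserved words or missing rollback blocks follow the same parse-then-assert pattern.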
You can try the following batch to roll the database back to its state at the date/time right before the update, if any issue happens during the update process. Of course, you can run this batch with Maven during the test scope.
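The batch itself did not survive in the answer above; as a sketch of the same idea wired through Maven, the `liquibase-maven-plugin` can apply the changelog during the test phase (the version, phase, and property-file path below are assumptions):

```xml
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>3.3.2</version>
  <configuration>
    <propertyFile>src/test/resources/liquibase.properties</propertyFile>
  </configuration>
  <executions>
    <execution>
      <!-- apply the changelog while the test resources are prepared -->
      <phase>process-test-resources</phase>
      <goals><goal>update</goal></goals>
    </execution>
  </executions>
</plugin>
```

If the update fails, something along the lines of `mvn liquibase:rollback -Dliquibase.rollbackDate=...` (with a timestamp captured before the update) rolls the schema back; the exact property name and date format depend on the plugin version, so check the plugin documentation.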
I am using liquibase-core-3.3.2.jar and ojdbc5-11.1.0.7.0.jar with the following liquibase.properties file.
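The properties file itself was not included above; a typical `liquibase.properties` for this driver combination might look like the following sketch, where the host, SID, credentials, and paths are all placeholders:

```
# JDBC driver shipped in ojdbc5-11.1.0.7.0.jar
driver: oracle.jdbc.driver.OracleDriver
classpath: lib/ojdbc5-11.1.0.7.0.jar
url: jdbc:oracle:thin:@localhost:1521:XE
username: scott
password: tiger
changeLogFile: src/main/resources/db/changelog-master.xml
```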