We use an Oracle (or PostgreSQL) database and an application server to execute integration tests. To isolate each test from the others, the database schema is dropped and re-created before each test.
As you can imagine, this is time-consuming. The application uses 100+ tables. We are thinking of writing custom SQL to delete unwanted data from each table. Is there a better way to save and restore database state?
(It appears DBUnit could do this; I have not tried it yet.)
A single test involves:
- Create the database schema.
- Start app server.
- Start multiple client applications.
- Execute and verify.
We have 5000-odd tests, taking 700 hours or so in total. (We run them on a grid environment, so they finish overnight.)
Most of the tests use small data sets, say up to 10 MB.
Oracle Flashback allows you to restore a table to a specified point in time with a simple SQL statement. The documentation is available here.
I don't know if PostgreSQL has a similar feature.
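For example (the orders table and the 15-minute window are illustrative; the table needs row movement enabled, and the undo retention must still cover the target time):

```sql
-- One-time per table: allow flashback to physically relocate rows
ALTER TABLE orders ENABLE ROW MOVEMENT;

-- Rewind the table to its state 15 minutes ago
FLASHBACK TABLE orders TO TIMESTAMP SYSTIMESTAMP - INTERVAL '15' MINUTE;
```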
What version of Oracle are you on (Enterprise 10g+ or Standard)? Assuming you're using Enterprise, you can use Flashback Database: you build out your baseline DB once, then flash the whole database back to it after every test run.
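Roughly, the cycle looks like this (run from SQL*Plus as SYSDBA, with Flashback Database enabled; the restore point name is just an example):

```sql
-- One-time: mark the clean baseline state
CREATE RESTORE POINT baseline GUARANTEE FLASHBACK DATABASE;

-- ... run a test, letting it dirty the data ...

-- After the test: rewind the whole database to the baseline
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
FLASHBACK DATABASE TO RESTORE POINT baseline;
ALTER DATABASE OPEN RESETLOGS;
```

A guaranteed restore point survives the flashback, so the same baseline can be reused run after run.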
This should be enough to get you started. If you want additional detail, let me know.
If each test fits in a single transaction, you can simply roll back. Is that an option?
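As a sketch in PostgreSQL syntax (the accounts table is just an illustration; note that in Oracle any DDL inside the test would implicitly commit and defeat the rollback):

```sql
BEGIN;  -- open a transaction around the whole test

-- ... run the test's statements and verify results with SELECTs ...
INSERT INTO accounts (id, balance) VALUES (1, 100);  -- illustrative test write

ROLLBACK;  -- discard every change, restoring the pre-test state
```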
For PostgreSQL, I think using a template database is quicker than re-creating all tables individually.
Just create a new database with a different name than the one you use normally (e.g. my_template_db) but with all the tables that you need. You can put test data in there as well.
When you run the tests, drop the database you want to test against, then re-create it from the template, as sketched below.
There were some optimizations in PostgreSQL 9.0 that speed this up, so that approach may well be faster than re-creating all tables through SQL.
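In psql, that could look like the following (my_test_db is an arbitrary name for the database under test; CREATE DATABASE cannot run inside a transaction, and the template must have no open connections while it is being cloned):

```sql
-- One-time setup: build the template with the full schema and seed data
CREATE DATABASE my_template_db;
-- ... connect to my_template_db, create all 100+ tables, load test data ...

-- Before each test run: throw away the dirty copy and re-clone the template
DROP DATABASE IF EXISTS my_test_db;
CREATE DATABASE my_test_db TEMPLATE my_template_db;
```

The clone is a file-level copy of the template's data directory, which is why it beats replaying the DDL table by table.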
For Oracle, you can use the PL/SQL package snapshot.sql.
We have 500 tables, 30 of which are restored after each test, and it takes ~500 ms on average.
Usage is extremely simple.