What is the recommended way of running an end-to-end integration test for multiple Spring Boot applications in the Maven build's verify phase?
Basically, I have a multi-module Maven project where several modules are separate Spring Boot applications. These applications each have their own configuration for data sources, integration flows with JMS queues, etc. For example, application A will poll a database for an event, and when that occurs, it produces a JSON file of data and puts a message on a JMS queue. Application B is polling the JMS queue, so it picks up the message, reads the file, does some processing using another database, and puts a message on a different queue. Application C will then pick up that message, and so on.
I have set up integration tests for the individual applications; these run under the Maven failsafe plugin. However, I would like to integration-test the whole system, end to end, under Maven. I have set up a separate module in the project dedicated to this task, and would like the verify build phase of this module to do the end-to-end testing using the other dependent modules.
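For context, the per-application integration tests mentioned above can be wired into the verify phase with a minimal failsafe configuration along these lines (a sketch, assuming the plugin's default `*IT` test-class naming convention):

```xml
<!-- Sketch: bind failsafe's integration-test and verify goals so that
     *IT classes run after packaging, in the integration-test/verify phases -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```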
Is there a best-practice way of doing this? I see three potential approaches:
- Load each application's configuration into the same application context. However, because of the multiple data sources etc., this creates conflicts, so these data sources would all have to be manually reconfigured just to enable end-to-end integration testing - this seems wrong to me.
- Launch each application as a separate process - but how then to properly keep track of them and make sure they get shut down if the test module's build stops, crashes, etc.?
- Is there a way to easily load separate Spring Boot applications, each with its own configuration context, in the same process? This would seem to be the most sensible option. Are there any considerations with respect to the Maven build/failsafe plugin?
Very nice question! I would myself be interested in what other people answer; I'll share my opinion.
In my understanding, first of all you should know what exactly you want to test.
Integration tests should work with an application, or at least with a part of it, and ensure that the component you've developed works properly in a semi-real environment. It seems like you've already done that.
Now, regarding system tests (I intentionally differentiate between integration and system tests): these should 'mimic' the QA guys :) So they treat the system as a black box. They can't invoke any internal APIs; they run real flows.
End-to-end tests IMO fall into this category.
In this case you would want to check them against the system deployed as in production, with the same classpath as in production.
So, like you, I don't really believe in option 1.
Regarding option 3, I'm not sure it's a good solution either.
Even if you run your applications with different application contexts (I don't know Spring Boot well enough to comment on it technically), in my understanding they will share the same classpath at runtime, so you risk clashes among your third-party dependencies (although I know that Spring Boot defines a lot of jar versions by itself, you know what I mean), especially when you upgrade only one module and its dependencies change. So you don't really know what exactly runs in memory when you follow this approach.
So, for end-to-end tests, I would go with option 2.
Regarding the synchronization, one option would probably be implementing some logic at the application level, in conjunction with process-state tracking at the operating-system level.
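To sketch the process-tracking side of option 2: a small helper can launch each application as a child process and register a JVM shutdown hook so everything is killed even if the test run is aborted. This is plain JDK code; the `ProcessTracker` class name is just an illustrative invention, not a real library:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;

// Hypothetical helper: tracks launched application processes and ensures
// they are destroyed when the JVM running the tests exits, even on abort.
public class ProcessTracker {
    private final List<Process> processes = new ArrayList<>();

    public ProcessTracker() {
        // Shutdown hook fires on normal exit, Ctrl-C, and most build aborts
        // (though not on a hard kill -9 of the build JVM).
        Runtime.getRuntime().addShutdownHook(new Thread(this::shutdownAll));
    }

    public Process launch(String... command) throws IOException {
        Process p = new ProcessBuilder(command).inheritIO().start();
        processes.add(p);
        return p;
    }

    public void shutdownAll() {
        for (Process p : processes) {
            p.destroy(); // polite termination request first
            try {
                if (!p.waitFor(10, TimeUnit.SECONDS)) {
                    p.destroyForcibly(); // escalate if the app hangs
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                p.destroyForcibly();
            }
        }
        processes.clear();
    }

    public static void main(String[] args) throws Exception {
        ProcessTracker tracker = new ProcessTracker();
        // A short-lived command stands in for a real Spring Boot app here.
        Process p = tracker.launch("java", "-version");
        p.waitFor();
        tracker.shutdownAll();
        System.out.println("all processes terminated: " + !p.isAlive());
    }
}
```

In a real build you would launch each Spring Boot jar this way from a test fixture, or delegate the same job to a Maven plugin bound to the pre/post-integration-test phases (as the follow-up answer below ended up doing).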
One more point I would like to comment on: end-to-end tests are in general still functional tests (they check the functional behavior of the system), so you shouldn't check system crashes in each test - if you checked for a system crash in every flow, these tests would be too slow.
Of course, you can maintain one relatively small test suite to check such corner cases.
Hope this helps.
Just to follow up and say what I ended up doing (which continues to work well):
- I created the following Maven profiles in my testing module: "default" to skip tests by default (we use jgitflow plugin so only want end-to-end tests run when explicitly requested), "standalone-e2e" for end-to-end tests not requiring external resources such as databases (aimed at developers wanting to do a full end-to-end test), and "integrated-e2e" for end-to-end tests using real databases etc (which can get triggered as part of CI). Spring profiles (activated by the corresponding Maven profile) control the configuration of the individual components.
- For standalone-e2e, relevant plugins such as activemq-maven-plugin, hsqldb-maven-plugin etc. launch (and later shut down) resources as part of the end-to-end test, running on ports reserved with build-helper-maven-plugin. The process-exec-maven-plugin is used to launch all the components to be tested in the pre-integration-test phase (as standard Spring Boot apps), and it automatically takes care of shutting them down in the post-integration-test phase. Spring configuration and specific Maven test dependencies take care of other resources such as a fake FTP server. After all resources and components are running, the test code itself then populates the database and file system as required and triggers flows (and waits for corresponding replies etc) using JMS.
- The integrated-e2e profile is almost identical but uses "real" external resources (in our case, Amazon SQS queues, MySQL database, etc) configured in the associated Spring properties.
- All files needed for and generated by the tests (e.g. data files, HSQLDB files, log files, etc) are created under the "target" build directory, so it's easy to inspect this area to see what happened during the test, and also allow "mvn clean" to clear out everything.
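To illustrate the Spring side of the integrated-e2e profile: a profile-specific properties file points the components at the real external resources. The standard `spring.datasource.*` keys are genuine Spring Boot properties; the `app.queue.*` key and all values below are placeholders, since the actual keys depend on your own configuration classes:

```properties
# application-integrated-e2e.properties (sketch; all values are placeholders)
spring.datasource.url=jdbc:mysql://ci-db-host:3306/e2e
spring.datasource.username=e2e_user
spring.datasource.password=${E2E_DB_PASSWORD}

# Illustrative custom key for a real queue endpoint used in CI
app.queue.inbound=inbound-e2e-queue
```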
I hope that's useful - it was certainly refreshing to find that whatever I needed to do, a Maven plugin existed to take care of it!
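For readers wanting a concrete starting point, the profile and plugin wiring described above can be sketched roughly as follows. Plugin coordinates, goal names, and configuration elements should be checked against each plugin's own documentation; the jar path and port name are illustrative:

```xml
<!-- Sketch of the testing module's profiles (abridged to one app/one port) -->
<profiles>
  <profile>
    <id>default</id>
    <activation><activeByDefault>true</activeByDefault></activation>
    <properties>
      <!-- No e2e tests unless a profile is explicitly requested -->
      <skipITs>true</skipITs>
    </properties>
  </profile>
  <profile>
    <id>standalone-e2e</id>
    <build>
      <plugins>
        <!-- Reserve a free port so concurrent builds don't clash -->
        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>build-helper-maven-plugin</artifactId>
          <executions>
            <execution>
              <goals><goal>reserve-network-port</goal></goals>
              <configuration>
                <portNames>
                  <portName>jms.port</portName>
                </portNames>
              </configuration>
            </execution>
          </executions>
        </plugin>
        <!-- Launch each Spring Boot app before the tests, stop it after -->
        <plugin>
          <groupId>com.bazaarvoice.maven.plugins</groupId>
          <artifactId>process-exec-maven-plugin</artifactId>
          <executions>
            <execution>
              <id>app-a</id>
              <phase>pre-integration-test</phase>
              <goals><goal>start</goal></goals>
              <configuration>
                <name>app-a</name>
                <arguments>
                  <argument>java</argument>
                  <argument>-jar</argument>
                  <argument>${project.build.directory}/app-a.jar</argument>
                </arguments>
              </configuration>
            </execution>
            <execution>
              <id>stop-all</id>
              <phase>post-integration-test</phase>
              <goals><goal>stop-all</goal></goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```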