Is there a way to listen for when Spark framework is shutting down, to run some cleanup? For example, I want to close out my ElasticSearch client.
As @Martin Eden explains, one approach is to use

Runtime.getRuntime().addShutdownHook(...);

but this has nothing to do with the Spark server (Jetty) lifecycle. You could stop the server without stopping the application, and a shutdown hook added to the runtime would not run any cleanup in that case. So this only cleans up when you stop the whole application.

Another option is to add a LifeCycle (managed) bean in Jetty and set the server's stop-at-shutdown property to true:
server.setStopAtShutdown(true);
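Spark doesn't expose its Jetty Server instance directly, so here is a minimal sketch of the idea with a plain embedded Jetty server; the port and the closeElasticSearchClient() helper are illustrative assumptions:

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.util.component.AbstractLifeCycle;
import org.eclipse.jetty.util.component.LifeCycle;

public class JettyLifecycleExample {
    public static void main(String[] args) throws Exception {
        Server server = new Server(4567);

        // Register Jetty's own shutdown hook so the server is stopped
        // cleanly when the JVM exits.
        server.setStopAtShutdown(true);

        // Managed lifecycle callback: runs whenever the server stops,
        // whether via the shutdown hook or an explicit server.stop().
        server.addLifeCycleListener(new AbstractLifeCycle.AbstractLifeCycleListener() {
            @Override
            public void lifeCycleStopped(LifeCycle event) {
                closeElasticSearchClient();
            }
        });

        server.start();
        server.join();
    }

    private static void closeElasticSearchClient() {
        // Hypothetical cleanup helper: close your ElasticSearch client here.
    }
}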
If you don't set stop-at-shutdown, you can instead register cleanup to run when the Spark service stops:
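A minimal sketch, assuming Spark 2.8+ (which added Spark.awaitStop()) and the Elasticsearch RestHighLevelClient; both are assumptions, so adapt them to your versions:

import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

import static spark.Spark.awaitStop;
import static spark.Spark.get;

public class CleanupOnStop {
    public static void main(String[] args) {
        RestHighLevelClient esClient = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")));

        get("/ping", (req, res) -> "pong");

        // Watcher thread: awaitStop() blocks until the embedded Jetty
        // server has fully stopped, then the cleanup below runs. This
        // fires on Spark.stop() as well, not only on JVM shutdown.
        new Thread(() -> {
            awaitStop();
            try {
                esClient.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }).start();
    }
}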
One approach is to use

Runtime.getRuntime().addShutdownHook(...)

This is a general Java mechanism for running code when the program exits. Since a Spark Framework web application is just a normal Java application, this will work. See the Java documentation for Runtime.addShutdownHook.
However, this hook will not be run if the VM process is aborted using the SIGKILL signal on Unix or the TerminateProcess call on Microsoft Windows. Note that this applies to pressing the "Stop" button in most IDEs.
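For example, a minimal sketch of a shutdown hook that closes an Elasticsearch client; the RestHighLevelClient and the localhost address are assumptions, so substitute your own client and host:

import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

import static spark.Spark.get;

public class App {
    public static void main(String[] args) {
        RestHighLevelClient esClient = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")));

        get("/ping", (req, res) -> "pong");

        // Runs when the JVM exits normally (e.g. SIGTERM or System.exit),
        // but not on SIGKILL or a hard process termination.
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            try {
                esClient.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }));
    }
}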