
Spark Framework: Listen for server stop

Posted 2019-08-09 12:04

Question:

Is there a way to listen for when Spark framework is shutting down, to run some cleanup? For example, I want to close out my ElasticSearch client.

Answer 1:

One approach is to use Runtime.getRuntime().addShutdownHook().

This is a general Java mechanism for running code when the JVM exits. Since a Spark Framework application is an ordinary Java program, it works here as well. See the Java documentation for Runtime.addShutdownHook().

However, the hook will not run if the VM process is aborted, for example with the SIGKILL signal on Unix or the TerminateProcess call on Microsoft Windows. Note that the "Stop" button in most IDEs kills the process this way, so the hook will not run then either.
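
For the Elasticsearch use case from the question, a minimal sketch could look like the following. The openElasticsearchClient() helper and the /ping route are hypothetical placeholders; any Closeable client (such as an Elasticsearch REST client) can be closed the same way.

import java.io.Closeable;
import java.io.IOException;

import static spark.Spark.get;

public class App {

    public static void main(String[] args) {
        // Placeholder for however you build your real Elasticsearch client.
        final Closeable esClient = openElasticsearchClient();

        // Runs on normal JVM exit and on SIGINT/SIGTERM, but not on SIGKILL
        // (or the "Stop" button of most IDEs, as noted above).
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            try {
                esClient.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }));

        get("/ping", (req, res) -> "pong");
    }

    // Hypothetical helper standing in for your actual client setup.
    private static Closeable openElasticsearchClient() {
        return () -> System.out.println("Elasticsearch client closed");
    }
}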



Answer 2:

As @Martin Eden explains, one approach is to use Runtime.getRuntime().addShutdownHook(...), but that hook is tied to the JVM lifecycle, not to the Spark server's (Jetty's) lifecycle. You could stop the embedded server without exiting the application, and in that case none of the cleanup registered with the runtime would run. The hook only performs cleanup when the whole application (JVM) shuts down.

Another option is to add a managed (lifecycle) bean to Jetty and tell the server to stop at JVM shutdown via server.setStopAtShutdown(true):

EmbeddedServers.add(EmbeddedServers.Identifiers.JETTY, new MyJettyFactory());

static class MyJettyFactory implements EmbeddedServerFactory {

    public EmbeddedServer create(Routes routeMatcher, StaticFilesConfiguration staticFilesConfiguration, boolean hasMultipleHandler) {
        MatcherFilter matcherFilter = new MatcherFilter(routeMatcher, staticFilesConfiguration, false, hasMultipleHandler);
        matcherFilter.init(null);

        final Handler handler = new JettyHandler(matcherFilter);

        return new EmbeddedJettyServer((maxThreads, minThreads, threadTimeoutMillis) -> {
            final Server server = new Server();
            // Stop the server when the JVM shuts down, waiting up to 30 seconds for a graceful stop.
            server.setStopAtShutdown(true);
            server.setStopTimeout(Duration.of(30, ChronoUnit.SECONDS).toMillis());
            // The bean is managed by Jetty, so doStop() runs when the server stops.
            server.addBean(new ManagedObjects(new Managed() {
                @Override
                public void doStart() {
                    // Nothing to set up.
                }

                @Override
                public void doStop() {
                    // Put cleanup here, e.g. close the ElasticSearch client.
                    System.out.println("Good bye!");
                }
            }));
            return server;
        }, handler);
    }
}

If you don't set stop-at-shutdown, you can instead register a JVM shutdown hook that stops the Spark service:

final Service service = Service.ignite();
Runtime.getRuntime().addShutdownHook(new Thread(service::stop));
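
As a self-contained sketch of that approach (the port and the /ping route are arbitrary placeholders):

import spark.Service;

public class Main {

    public static void main(String[] args) {
        final Service service = Service.ignite();
        service.port(4567);
        service.get("/ping", (req, res) -> "pong");

        // addShutdownHook expects a Thread, so wrap the method reference;
        // service.stop() shuts down the embedded Jetty server.
        Runtime.getRuntime().addShutdownHook(new Thread(service::stop));
    }
}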