
How can a native Servlet Filter be used when using Spark Framework?

Published 2020-07-11 07:53

Question:

I'm playing around with Spark (the Java web framework, not Apache Spark).

I find it really nice and easy to define routes and filters; however, I'm looking to apply a native servlet filter to my routes and can't seem to find a way to do that.
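
For context, this is the sort of thing Spark makes easy; a minimal sketch (the route path and header name below are made up for illustration):

import static spark.Spark.before;
import static spark.Spark.get;

public class HelloApp {
    public static void main(String[] args) {
        // A Spark "before" filter and a route, each defined in a single line.
        before((request, response) -> response.header("X-Example", "true"));
        get("/hello", (request, response) -> "Hello World");
    }
}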

More specifically, I would like to use Jetty's DoSFilter, which is a servlet filter (as opposed to Spark's own Filter interface). Since Spark uses embedded Jetty, I don't have a web.xml in which to register the DoSFilter. However, Spark doesn't expose the server instance, so I can't find an elegant way of registering the filter programmatically either.

Is there a way to apply a native servlet filter to my routes?

I thought of wrapping the DoSFilter in my own Spark Filter, but it seemed like a weird idea.
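
Roughly, that wrapper might look like the sketch below (the class name WrappedDoSFilter is invented for illustration, and the DoSFilter would still need a proper init(FilterConfig) call), which is part of why it feels awkward:

import org.eclipse.jetty.servlets.DoSFilter;
import spark.Filter;
import spark.Request;
import spark.Response;

public class WrappedDoSFilter implements Filter {

    // Would still need dosFilter.init(filterConfig) before first use.
    private final DoSFilter dosFilter = new DoSFilter();

    @Override
    public void handle(Request request, Response response) throws Exception {
        // There is no real servlet FilterChain to continue here, so a no-op
        // chain is passed; Spark just carries on with its own routing after
        // this filter returns.
        dosFilter.doFilter(request.raw(), response.raw(), (req, res) -> { });
    }
}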

Answer 1:

You can do it like this: instead of letting Spark start its own embedded Jetty, start Jetty yourself and run Spark through its SparkFilter, so the native servlet filters can be registered on the same ServletContextHandler:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.eclipse.jetty.server.Connector;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.server.handler.gzip.GzipHandler;
import org.eclipse.jetty.servlet.FilterHolder;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.util.component.AbstractLifeCycle;
import org.eclipse.jetty.util.component.LifeCycle;
import org.eclipse.jetty.util.thread.ExecutorThreadPool;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.scheduling.concurrent.CustomizableThreadFactory;
import spark.servlet.SparkApplication;
import spark.servlet.SparkFilter;

public class App {

    private static final Logger LOG = LoggerFactory.getLogger(App.class);

    public static void main(String[] args) throws Exception {

        ServletContextHandler mainHandler = new ServletContextHandler();
        mainHandler.setContextPath("/base/path");

        // MyServletFilter stands for any native javax.servlet.Filter you want to apply.
        FilterHolder servletFilter = new FilterHolder(new MyServletFilter());

        // The SparkFilter runs the Spark application defined in SparkApp below.
        FilterHolder sparkFilter = new FilterHolder(new SparkFilter());
        sparkFilter.setInitParameter("applicationClass", SparkApp.class.getName());

        // Map both filters to all paths; null dispatcher types default to REQUEST.
        Stream.of(servletFilter, sparkFilter)
              .forEach(h -> mainHandler.addFilter(h, "/*", null));

        GzipHandler compression = new GzipHandler();
        compression.setIncludedMethods("GET");
        compression.setMinGzipSize(512);
        compression.setHandler(mainHandler);

        Server server = new Server(new ExecutorThreadPool(
                new ThreadPoolExecutor(10, 200, 60000, TimeUnit.MILLISECONDS,
                                       new ArrayBlockingQueue<>(200),
                                       new CustomizableThreadFactory("jetty-pool-"))));

        final ServerConnector serverConnector = new ServerConnector(server);
        serverConnector.setPort(9290);
        server.setConnectors(new Connector[] { serverConnector });

        server.setHandler(compression);
        server.start();

        hookToShutdownEvents(server);

        server.join();
    }

    private static void hookToShutdownEvents(final Server server) {
        LOG.debug("Hooking to JVM shutdown events");

        server.addLifeCycleListener(new AbstractLifeCycle.AbstractLifeCycleListener() {

            @Override
            public void lifeCycleStopped(LifeCycle event) {
                LOG.info("Jetty Server has been stopped");
                super.lifeCycleStopped(event);
            }

        });

        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                LOG.info("About to stop Jetty Server due to JVM shutdown");
                try {
                    server.stop();
                } catch (Exception e) {
                    LOG.error("Could not stop Jetty Server properly", e);
                }
            }
        });
    }

    /**
     * @implNote {@link SparkFilter} needs to access a public class
     */
    @SuppressWarnings("WeakerAccess")
    public static class SparkApp implements SparkApplication {

        @Override
        public void init() {
            // SparkApplication.init() is where Spark routes get registered; the Spring
            // bootstrap below (ApplicationProfile, ModocContext) is specific to this application.
            System.setProperty("spring.profiles.active",
                    ApplicationProfile.readProfilesOrDefault("dev").stream().collect(Collectors.joining()));
            AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(ModocContext.class);
            ctx.registerShutdownHook();
        }
    }
}
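
To tie this back to the DoSFilter from the question: a hedged sketch of building a FilterHolder for Jetty's DoSFilter (it lives in the jetty-servlets artifact) that could be registered on mainHandler alongside the SparkFilter above. The class name DoSFilterSetup and the parameter values are illustrative, not recommendations:

import org.eclipse.jetty.servlet.FilterHolder;
import org.eclipse.jetty.servlets.DoSFilter;

public class DoSFilterSetup {

    // Builds a holder that can be passed to mainHandler.addFilter(holder, "/*", null)
    // just like the other FilterHolders in the answer above.
    public static FilterHolder dosFilterHolder() {
        FilterHolder holder = new FilterHolder(new DoSFilter());
        holder.setInitParameter("maxRequestsPerSec", "30"); // throttle threshold per connection
        holder.setInitParameter("delayMs", "100");          // delay applied to throttled requests
        return holder;
    }
}

If this holder is added to the filter chain ahead of the SparkFilter, the throttling happens at the servlet layer before any Spark route runs.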