Spark application throws javax.servlet.FilterRegistration signer mismatch

Posted 2019-02-04 05:59

I'm using Scala to create and run a Spark application locally.

My build.sbt:

name := "SparkDemo"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"    exclude("org.apache.hadoop", "hadoop-client")
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.2.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"  excludeAll(
ExclusionRule(organization = "org.eclipse.jetty"))
libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
mainClass in Compile := Some("demo.TruckEvents")

During runtime I get the exception:

Exception in thread "main" java.lang.ExceptionInInitializerError during calling of... Caused by: java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package

The exception is triggered here:

val sc = new SparkContext("local", "HBaseTest")

I am using the IntelliJ Scala/SBT plugin.

I've seen that other people have hit this problem and a suggested solution, but that was for a Maven build. Is my sbt wrong here, or is there any other suggestion for how I can solve this problem?

7 Answers
[account banned]
#2 · 2019-02-04 06:26

If it is happening in IntelliJ IDEA, go to the project settings, find the jar among the modules, and remove it. Then run your code with sbt from the shell; it will fetch the jar files itself. Afterwards go back to IntelliJ and re-run the code there. This somehow works for me and fixes the error. I am not sure what the problem was, since it doesn't show up anymore.

Oh, I also removed the jar file and added "javax.servlet:javax.servlet-api:3.1.0" through Maven by hand, and now the error is gone.
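The manual Maven addition described above would look like this in build.sbt (a sketch, using the 3.1.0 version the answer names):

```scala
// build.sbt: pull in the servlet 3.1 API explicitly so a single, consistent
// javax.servlet package wins on the classpath.
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.1.0"
```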

贪生不怕死
#3 · 2019-02-04 06:33

If you are running inside IntelliJ, please check in the project settings whether you have two active modules (one for the project and another for sbt).

It is probably a problem from importing an existing project.

欢心
#4 · 2019-02-04 06:39

Try running a simple program without the Hadoop and HBase dependencies:

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" excludeAll(ExclusionRule(organization = "org.eclipse.jetty"))
libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"

There is likely a mismatch among the dependencies. Also make sure you have the same versions of the jars at compile time and at run time.

Also, is it possible to reproduce this by running the code in the Spark shell? That would let me help better.
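To check for the version mismatch this answer describes, a small diagnostic can help: the hypothetical helper below (not from the thread) lists every classpath location that provides a given class, so two jars both shipping javax.servlet.FilterRegistration (e.g. servlet-api-2.5 and a 3.x jar) show up directly.

```scala
// Sketch of a duplicate-provider check: resolve a class name to every
// classpath entry that contains its .class file.
import scala.collection.mutable.ListBuffer

object FindClassOrigin {
  def origins(className: String): List[String] = {
    val resource = className.replace('.', '/') + ".class"
    val urls = getClass.getClassLoader.getResources(resource)
    val found = ListBuffer.empty[String]
    while (urls.hasMoreElements) found += urls.nextElement().toString
    found.toList
  }

  def main(args: Array[String]): Unit =
    // Pass "javax.servlet.FilterRegistration" in the real project; a JDK
    // class is used here only so the sketch runs without the servlet jars.
    FindClassOrigin.origins("java.lang.String").foreach(println)
}
```

If more than one URL comes back for the servlet class, the jars listed are the conflicting signers.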

你好瞎i
#5 · 2019-02-04 06:40

When you use SBT, the FilterRegistration class is present in servlet-api 3.0, but if you use Jetty or Java 8, the servlet-api 2.5 jar gets added as a transitive dependency as well.

Fix: the servlet-api-2.5 jar was the mess here. I resolved this issue by adding the servlet-api-3.0 jar to the dependencies.
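One way to keep the 2.5 jar off the classpath entirely is an sbt exclusion. This is a sketch, assuming the HBase artifacts are what pull servlet-api 2.5 in transitively:

```scala
// build.sbt sketch: drop the old servlet-api wherever it comes in, then add
// the 3.x API once, explicitly.
libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2" excludeAll(
  ExclusionRule(organization = "javax.servlet", name = "servlet-api")
)
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"
```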

我欲成王,谁敢阻挡
#6 · 2019-02-04 06:43

The following works for me:

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion.value % "provided",
    "org.apache.spark" %% "spark-sql"  % sparkVersion.value % "provided",
    ....................................................................
).map(_.excludeAll(ExclusionRule(organization = "javax.servlet")))
查看更多
Melony?
#7 · 2019-02-04 06:45

If you are using IntelliJ IDEA, try this:

  1. Right-click the project root folder and choose Open Module Settings
  2. In the new window, choose Modules in the left navigation column
  3. In the rightmost column, select the Dependencies tab and find Maven: javax.servlet:servlet-api:2.5
  4. Finally, move this item to the bottom by pressing ALT+Down

It should solve this problem.

This method came from http://wpcertification.blogspot.ru/2016/01/spark-error-class-javaxservletfilterreg.html
