I'm using Scala to create and run a Spark application locally.
My build.sbt:
name := "SparkDemo"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" exclude("org.apache.hadoop", "hadoop-client")
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.2.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" excludeAll(
ExclusionRule(organization = "org.eclipse.jetty"))
libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
mainClass in Compile := Some("demo.TruckEvents")
At runtime I get the following exception:
Exception in thread "main" java.lang.ExceptionInInitializerError during calling of... Caused by: java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
The exception is triggered here:
val sc = new SparkContext("local", "HBaseTest")
I am using the IntelliJ Scala/SBT plugin.
I've seen that other people have run into this problem and a solution has been suggested, but that was for a Maven build... Is my sbt wrong here? Or does anyone have another suggestion for how I can solve this problem?
See my answer to a similar question here. The class conflict comes about because HBase depends on org.mortbay.jetty, while Spark depends on org.eclipse.jetty. I was able to resolve the issue by excluding the org.mortbay.jetty dependencies from HBase. If you're pulling in hadoop-common, then you may also need to exclude javax.servlet from hadoop-common. I have a working HBase/Spark setup with my sbt dependencies set up as follows:
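A sketch of that setup, reusing the versions from the question (the exclusions follow the reasoning above; the exact list here is illustrative and your dependency tree may need different rules):

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.0",
  "org.apache.spark" % "spark-sql_2.10" % "1.2.0",
  // hadoop-common brings in a signed javax.servlet jar; exclude it so it
  // can't clash with the unsigned servlet classes on Spark's classpath
  "org.apache.hadoop" % "hadoop-common" % "2.6.0"
    excludeAll ExclusionRule(organization = "javax.servlet"),
  "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0",
  // HBase pulls in the old org.mortbay.jetty artifacts; Spark already ships
  // org.eclipse.jetty, so drop the Mortbay ones from each HBase module
  "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
    excludeAll ExclusionRule(organization = "org.mortbay.jetty"),
  "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
    excludeAll ExclusionRule(organization = "org.mortbay.jetty"),
  "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
    excludeAll ExclusionRule(organization = "org.mortbay.jetty")
)

If the SecurityException still shows up after this, some other transitive dependency is probably contributing signed javax.servlet classes; inspect the dependency tree (e.g. with the sbt-dependency-graph plugin) and add a matching exclusion for that organization.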