Running Spark Scala example fails

Published 2019-02-04 05:13

I'm new to both Spark and Scala. I've created an IntelliJ Scala project with SBT and added a few lines to build.sbt.

name := "test-one"

version := "1.0"

scalaVersion := "2.11.2"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"

My version of Scala is 2.10.4, but this problem also occurs with 2.11.2.

Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
    at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
    at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
    at TweeProcessor$.main(TweeProcessor.scala:10)
    at TweeProcessor.main(TweeProcessor.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 23 more

I tried looking this up online; most answers point to a mismatch between API versions and the Scala version, but none are specific to Spark.
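
For reference, the failing line (TweeProcessor.scala:10) is just the usual SparkContext construction, roughly like this (a minimal sketch of the driver, not the exact file):

import org.apache.spark.{SparkConf, SparkContext}

object TweeProcessor {
  def main(args: Array[String]): Unit = {
    // Constructing the SparkContext is what triggers the NoClassDefFoundError above
    val conf = new SparkConf().setAppName("test-one").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // ... rest of the job ...
    sc.stop()
  }
}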

4 Answers
Animai°情兽
#2 · 2019-02-04 05:37

Downgrade the Scala version to 2.10.4:

name := "test-one"

version := "1.0"

//scalaVersion := "2.11.2"
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
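
After changing build.sbt, you may also need to refresh/re-import the SBT project in IntelliJ so the module picks up the 2.10 Scala library and the new dependencies instead of the old 2.11 ones.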
时光不老,我们不散
#3 · 2019-02-04 05:50

This is a version compatibility issue. spark-core_2.10 is built with Scala 2.10, but your sbt file says you are using Scala 2.11. Either downgrade your Scala version to 2.10 or switch to a spark-core_2.11 build of Spark.
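
If you want to stay on Scala 2.11 instead, the build.sbt would look roughly like this (a sketch only; it assumes you also move to a Spark release that publishes a _2.11 artifact, since the early 1.x releases were built for Scala 2.10):

name := "test-one"

version := "1.0"

scalaVersion := "2.11.8"

// The artifact suffix (_2.11) must match scalaVersion; 1.6.3 is just one example of a release that ships a _2.11 build
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.3"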

Juvenile、少年°
#4 · 2019-02-04 05:56

scalaVersion := "2.11.1"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0"
)

This configuration worked for me.

我想做一个坏孩纸
#5 · 2019-02-04 05:57

spark-core_2.10 is built for use with 2.10.x versions of Scala. You should use

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"

which will select the correct _2.10 or _2.11 artifact for your scalaVersion.
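
For example, with the question's Scala setting, the %% form resolves to the explicitly suffixed artifact; the snippet below only illustrates what %% does:

scalaVersion := "2.10.4"

// With the setting above, %% appends "_2.10" to the artifact name, so this:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
// ...is equivalent to:
// libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"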

Also make sure you're compiling against the same versions of Scala and Spark as the ones on the cluster where you're running this.
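
If you're not sure what the cluster is running, a quick check from a spark-shell on the cluster (assuming you have access to one) is:

scala> sc.version                     // Spark version on the cluster
scala> util.Properties.versionString  // Scala version the shell is running on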
