NoSuchMethodError when using Spark and IntelliJ

Posted 2020-07-23 05:03

Question:

I'm new to Scala and Spark. I've been frustrated by how hard it has been to get things working with IntelliJ. Currently, I can't run the code below. I'm sure it's something simple, but I can't get it to work.

I'm trying to run:

import org.apache.spark.{SparkConf, SparkContext}

object TestScala {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setAppName("Datasets Test")
    conf.setMaster("local[2]")
    val sc = new SparkContext(conf)
    println(sc)
  }
}

The error I get is:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1413)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
at TestScala$.main(TestScala.scala:13)
at TestScala.main(TestScala.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

My build.sbt file:

name := "sparkBook"

version := "1.0"

scalaVersion := "2.12.1"

Answer 1:

Change your scalaVersion to 2.11.8 and add the Spark dependency to your build.sbt:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"
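For reference, a minimal build.sbt along those lines might look like the sketch below (using %% so sbt appends the matching _2.11 suffix automatically; the version numbers are simply the ones from this answer):

name := "sparkBook"

version := "1.0"

// must be a 2.11.x release to match the _2.11 Spark artifacts
scalaVersion := "2.11.8"

// %% resolves to spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"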



Answer 2:

Another scenario: IntelliJ's Scala SDK points to 2.12.4 while all of the Maven/sbt dependencies are built for Scala 2.11.8 (i.e. the _2.11 artifacts).

I stepped back from 2.12.4 to 2.11.8 under Global Libraries in the IntelliJ UI, and it started working.

Details:

The Maven pom.xml points to Scala 2.11.8, but the Scala SDK in IntelliJ's Global Libraries is 2.12.4, which causes:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

Stepping back to 2.11.8 in Global Libraries fixes it.

That's it. Problem solved; no more errors when running that program.

Conclusion: fixing the Maven dependencies alone is not enough. The Scala SDK configured in IntelliJ's Global Libraries must match them as well, because the error occurs at run time when IntelliJ launches the local Spark program.
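If you are unsure which Scala and Spark versions the run configuration actually picks up, a quick check is to print them at run time. A minimal sketch (the object name VersionCheck is just illustrative; scala.util.Properties and org.apache.spark.SPARK_VERSION are standard):

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Scala version the program is actually running on (from the SDK/classpath)
    println(s"Scala: ${scala.util.Properties.versionString}")
    // Spark version of the spark-core jar on the classpath
    println(s"Spark: ${org.apache.spark.SPARK_VERSION}")
  }
}

If the Scala line does not report a 2.11.x version while your Spark dependencies are _2.11 artifacts, the mismatch described above is the likely cause.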



Answer 3:

If you use Spark 2.4.3, you need to use Scala 2.11, even though the Spark website says to use Scala 2.12: https://spark.apache.org/docs/latest/

This avoids the NoSuchMethodError on scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
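A build.sbt sketch for that combination (Scala 2.11.12 is an assumption here; any 2.11.x release matches the _2.11 Spark 2.4.3 artifacts):

// Spark 2.4.3 is published for Scala 2.11
scalaVersion := "2.11.12"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"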