Getting exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror

Posted 2019-06-20 07:41

Question:

I am receiving a "java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)" error while using DataFrames in a Scala app run with Spark. However, if I work only with RDDs and not DataFrames, no such error comes up with the same pom and settings. While going through other posts with the same error, I saw it mentioned that the Scala version has to be 2.10 because Spark is not compatible with Scala 2.11, and I am using Scala 2.10 with Spark 2.0.0.
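For context, here is a minimal sketch of the kind of DataFrame code that hits this path (the case class and method body are assumptions; only the Compare class name comes from the stack trace below). Building a DataFrame from a case-class RDD derives an encoder through Scala runtime reflection, which is why DataFrame code exercises scala-reflect while plain RDD code does not:

import org.apache.spark.sql.SparkSession

// Hypothetical reproduction; the real Compare.scala is not shown in the post.
case class Record(id: Int, name: String)

object Compare {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("Compare").getOrCreate()
    // The implicits below derive case-class encoders via scala-reflect TypeTags.
    import spark.implicits._

    val rdd = spark.sparkContext.parallelize(Seq(Record(1, "a"), Record(2, "b")))
    val df = rdd.toDF() // a call like this is where the NoSuchMethodError surfaces
    df.show()
    spark.stop()
  }
}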

Below is the relevant snippet from the pom:

<properties>
      <spark-assembly>/usr/lib/spark/lib/spark-assembly.jar</spark-assembly>
      <encoding>UTF-8</encoding>
      <hadoop.version>2.7.1</hadoop.version>
      <hbase.version>1.1.1</hbase.version>
      <scala.version>2.10.5</scala.version>
      <scala.tools.version>2.10</scala.tools.version>
      <spark.version>2.0.0</spark.version>
      <phoenix.version>4.7.0-HBase-1.1</phoenix.version>
  </properties>

  <dependencies>
      <dependency>
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-library</artifactId>
          <version>${scala.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-client</artifactId>
          <version>${hadoop.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.hbase</groupId>
          <artifactId>hbase-client</artifactId>
          <version>${hbase.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.hbase</groupId>
          <artifactId>hbase-server</artifactId>
          <version>${hbase.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_${scala.tools.version}</artifactId>
          <version>${spark.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-sql_${scala.tools.version}</artifactId>
          <version>${spark.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-hive_${scala.tools.version}</artifactId>
          <version>${spark.version}</version>
      </dependency>

  </dependencies>

Error:

16/10/19 02:57:26 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
        at com.abc.xyz.Compare$.main(Compare.scala:64)
        at com.abc.xyz.Compare.main(Compare.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
16/10/19 02:57:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;)
16/10/19 02:57:26 INFO spark.SparkContext: Invoking stop() from shutdown hook

Answer 1:

Spark 2.0.0 is built against Scala 2.11 by default, so change the Scala version to match:

<scala.version>2.11.8</scala.version>
<scala.tools.version>2.11</scala.tools.version>

and add the matching scala-reflect dependency:

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>${scala.version}</version>
</dependency>
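To check which Scala runtime the job actually picks up on the cluster (a quick diagnostic, not part of the original answer), you can print the library version at the start of the application:

object VersionCheck {
  def main(args: Array[String]): Unit =
    // Prints e.g. "version 2.11.8"; if this disagrees with scala.tools.version
    // in the pom, a conflicting scala-library is on the classpath.
    println(scala.util.Properties.versionString)
}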


Answer 2:

I also faced this error, and it is purely a version issue.

Either your Scala version is not compatible with Spark, or you are declaring the correct version but IntelliJ's module libraries still hold the old one.

Quick fix:

I was using Spark 2.2.0 with Scala 2.10.4, which I then changed to Scala 2.11.8. After that, do the following:

1) Right-click the module in IntelliJ.
2) Open Module Settings.
3) Go to Libraries and clear all of them.
4) Rebuild the project.

Doing the above resolved the issue for me.
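Outside the IDE, one way to confirm which Scala artifacts Maven actually resolves (an extra check, not part of the original answer) is to print the dependency tree filtered to the Scala group:

mvn dependency:tree -Dincludes=org.scala-lang

Any second scala-library or scala-reflect version in that output is a likely source of this NoSuchMethodError.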