scalac compile yields "object apache is not a member of package org"

Posted 2020-01-31 15:11

Question:

My code is:

import org.apache.spark.SparkContext

It runs in interactive mode, but when I compile it with scalac, I get the following error message:

object apache is not a member of package org

This seems to be a classpath problem, but I do not know exactly how to configure the path.

Answer 1:

You need to specify the path of the libraries used when compiling your Scala code. This is usually not done manually, but with a build tool such as Maven or sbt. You can find a minimal sbt setup at http://spark.apache.org/docs/1.2.0/quick-start.html#self-contained-applications
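
For reference, a minimal build.sbt along the lines of the linked quick-start might look like the sketch below; the project name and the Scala and Spark versions are illustrative and should be adjusted to your environment:

name := "simple-spark-app"
version := "1.0"
scalaVersion := "2.11.12"
// spark-core puts org.apache.spark.SparkContext on the compile classpath
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.2"

If you really do want to invoke scalac by hand, the equivalent is passing the Spark jars via -classpath, but the build-tool route is far less error-prone.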



Answer 2:

I had this issue because I had the wrong scope for my Spark dependency. This is wrong:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>${spark.version}</version>
  <scope>test</scope> <!-- will not be available during compile phase -->
</dependency>

This will work and will not include Spark in your uberjar, which is what you will almost certainly want:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>
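
If you build with sbt instead of Maven, the corresponding way to mark Spark as provided is roughly the following (the version string is illustrative):

// "provided": on the compile classpath, excluded from the assembled uberjar
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.2" % "provided"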


Answer 3:

One easy way (if you're using the Play Framework) is to look up the library dependency in the Maven Repository, choose the version, choose the SBT tab, and then add it to the bottom of your project's build.sbt file, like so:

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.2"

Afterwards, enter reload in the sbt console and then compile. This can feel a little foreign if you're coming from pip or the JavaScript ecosystem, but the Maven Repository is your friend.
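
In the sbt shell, that boils down to something like the following (the prompt varies with your sbt version and project name):

> reload
> compile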



Answer 4:

I was facing this issue in an sbt interactive session.

I resolved it by simply executing reload in the session.

Hope this helps!