Eclipse (set up with Scala environment): object apache

Posted 2019-07-19 06:10

[screenshot: Eclipse showing the error on the Spark import]

As shown in the image, Eclipse gives an error when I import the Spark packages. Please help. When I hover over the import, it shows "object apache is not a member of package org". Searching on this error suggests that the Spark jars have not been imported, so I imported "spark-assembly-1.4.1-hadoop2.2.0.jar" as well, but I still get the same error. Below is what I actually want to run:

import org.apache.spark.{SparkConf, SparkContext}

object ABC {

  def main(args: Array[String]): Unit = {
    // Scala main method

    println("Spark Configuration")

    val conf = new SparkConf()
    conf.setAppName("My First Spark Scala Application")
    conf.setMaster("spark://ip-10-237-224-94:7077")

    println("Creating Spark Context")
    val sc = new SparkContext(conf) // the context the import and the println above refer to
  }
}

4 answers
#2 · 2019-07-19 06:44

Adding the spark-core jar to your classpath should resolve the issue. Also, if you are not already using a build tool such as Maven or Gradle, you should, because spark-core has many dependencies and you would keep hitting this problem for different jars. Use the Eclipse task provided by these tools to set the classpath in your project properly.
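For illustration, the same idea can be expressed with sbt (a Scala build tool comparable to the Maven/Gradle suggestion above). This is only a minimal sketch; the version numbers are assumptions taken from the spark-assembly-1.4.1 jar mentioned in the question, so adjust them to your cluster:

    // build.sbt -- minimal sketch; versions assumed from the question's spark-assembly-1.4.1 jar
    name := "MyFirstSparkApp"
    scalaVersion := "2.10.6"   // Spark 1.4.x artifacts are published for Scala 2.10
    // "%%" appends the Scala binary version, so this resolves to spark-core_2.10
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"

With the sbteclipse plugin, running sbt eclipse then regenerates the Eclipse .classpath from these managed dependencies, which is the kind of task referred to above.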

做个烂人
#3 · 2019-07-19 06:44

If you are doing this in the context of Scala within a Jupyter Notebook, you'll get this error. You have to install the Apache Toree kernel:

https://github.com/apache/incubator-toree

and create your notebooks with that kernel.

You also have to start the Jupyter Notebook with:

pyspark
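For reference, a rough sketch of how the Toree kernel is typically installed (the Spark path below is an assumption; point --spark_home at your own installation):

    pip install toree
    jupyter toree install --spark_home=/usr/local/spark --interpreters=Scala

After that, the Scala kernel should appear when creating a new notebook.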
神经病院院长
#4 · 2019-07-19 07:01

I was also receiving the same error; in my case it was a compatibility issue. Spark 2.2.1 is not compatible with Scala 2.12 (it is compatible with 2.11.8), while my IDE was set up with Scala 2.12.3. I resolved the error by:

1) Importing the jar files from Spark's installation folder. The Spark folder created during installation (on the C drive in my case) contains a jars folder with all the basic jar files. In Eclipse, right-click on the project -> Properties -> Java Build Path. Under the Libraries tab there is an "Add External JARs..." option; select it, import all the jar files from the jars folder, and click Apply.

2) Again go to Properties -> Scala Compiler -> Scala Installation and select "Latest 2.11 bundle (dynamic)". Before selecting this option, check the compatibility of your Spark and Scala versions (a build-file sketch of this pairing is shown below).
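If the project is built with sbt rather than by hand-picking jars in Eclipse, the same compatibility constraint can be expressed in the build file. A minimal sketch, assuming the Spark 2.2.1 / Scala 2.11.8 pairing described in this answer:

    // build.sbt -- version pairing from this answer; adjust to your setup
    scalaVersion := "2.11.8"   // Spark 2.2.1 is published for Scala 2.11, not 2.12
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1"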

我欲成王,谁敢阻挡
#5 · 2019-07-19 07:07

The problem is that Scala is NOT backward compatible, so each Spark module is compiled against a specific Scala library. When we run from Eclipse, there is one Scala version that was used to compile the Spark dependency jar we add to the build path, and a second Scala version provided by the Eclipse runtime environment. The two may conflict.

This is a hard reality, even though we might wish Scala were backward compatible, or at least that a compiled jar could be. Hence the recommendation: use Maven or a similar tool where dependency versions can be managed.
