Unable to Connect to remote Apache-Spark

Posted 2019-08-16 15:04

Question:

I'm new to Apache Spark and I'm running into some issues while trying to connect from my local machine to a remote server that hosts a working Spark instance.

I successfully managed to connect to that server via an SSH tunnel using JSch, but I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:65)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:82)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:220)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:162)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at server.Server$.main(Server.scala:45)
    at server.Server.main(Server.scala)

This happens when trying to connect to Spark.
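For reference, the tunnel was set up with JSch along these lines (a minimal sketch; the user, host, password, and ports below are placeholders, not my actual values):

import com.jcraft.jsch.JSch

// Open an SSH session to the remote machine (placeholder credentials)
val jsch = new JSch()
val session = jsch.getSession("user", "remote-host", 22)
session.setPassword("password")
session.setConfig("StrictHostKeyChecking", "no")
session.connect()

// Forward a local port to the Spark master port on the remote machine
session.setPortForwardingL(7077, "localhost", 7077)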

This is my Scala code:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("Test").setMaster("spark://xx.xxx.xxx.x:7077")
val sc = new SparkContext(conf)
// count() returns a Long, not an RDD
val count = sc.parallelize(Array(1, 2, 3, 4, 5)).count()
println(count)

Line 45, referenced in the error as (Server.scala:45), is the one with new SparkContext(conf).

On both the local and the remote machine I'm using Scala ~2.11.6. In my local pom.xml I declared Scala 2.11.6, plus spark-core_2.10 and spark-sql_2.10, both ~2.1.1. On the server I installed Spark ~2.1.1. On the server I also set the master to the local machine by editing conf/spark-env.sh.
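For reference, the relevant part of that pom.xml would look roughly like this (coordinates reconstructed from the versions above):

<!-- Dependency coordinates reconstructed from the description above -->
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.6</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.1.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>2.1.1</version>
  </dependency>
</dependencies>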

Of course, I tested Spark on the server itself and it works just fine.

What am I doing wrong?

Answer 1:

From the docs of setMaster:

The master URL to connect to, such as "local" to run locally with one thread, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.

If you run it from the Spark cluster itself (as I understand you are doing), you should use local[n].
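For illustration, a minimal sketch of both ways of setting the master (the cluster host below is a placeholder, assuming the default standalone port 7077):

import org.apache.spark.{SparkConf, SparkContext}

// Run everything locally in one JVM, using 4 threads
val conf = new SparkConf().setAppName("Test").setMaster("local[4]")

// Alternatively, point at a standalone master (placeholder host):
// val conf = new SparkConf().setAppName("Test").setMaster("spark://master-host:7077")

val sc = new SparkContext(conf)
println(sc.parallelize(1 to 5).count()) // prints 5
sc.stop()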