spark-submit on standalone cluster complains about missing scala-2.10 jars directory

Posted 2019-07-23 12:32

I'm new to Spark and downloaded the pre-compiled Spark binaries from Apache (Spark-2.1.0-bin-hadoop2.7).

When submitting my Scala (2.11.8) uber jar, the cluster throws an error:

java.lang.IllegalStateException: Library directory '/root/spark/assembly/target/scala-2.10/jars' does not exist; make sure Spark is built

I'm not running Scala 2.10, and as far as I know Spark isn't compiled with Scala 2.10.

Could it be that one of my dependencies is based on Scala 2.10? Any suggestions as to what could be wrong?
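
One way to check that from the project root, assuming an sbt build (the task syntax below is the sbt 0.13 style of that era; the grep is just a filter):

    # List the resolved compile classpath and look for any _2.10 artifacts;
    # no output from grep means nothing on the classpath was built for Scala 2.10
    sbt "show compile:dependencyClasspath" | grep '_2\.10'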

2 Answers
等我变得足够好
#2 · 2019-07-23 13:12

Try setting SPARK_HOME to the location of your Spark installation, either in your system environment or in your IDE.
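
A minimal sketch of what that could look like on the cluster node, assuming the tarball from the question was unpacked under /root; the class name, master URL, and jar name below are placeholders:

    # Assumed unpack location of the downloaded distribution
    export SPARK_HOME=/root/spark-2.1.0-bin-hadoop2.7

    # Use the distribution's own spark-submit so the launcher resolves jars under $SPARK_HOME/jars
    $SPARK_HOME/bin/spark-submit \
      --class com.example.Main \
      --master spark://<master-host>:7077 \
      my-app-assembly.jar

The path in the error (assembly/target/scala-2.10/jars) looks like the fallback the launcher uses when SPARK_HOME points at a source checkout rather than at a binary distribution containing a jars directory, which is why fixing SPARK_HOME can make the message go away.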

Luminary・发光体
#3 · 2019-07-23 13:18

Not sure what is wrong with the pre-built spark-2.1.0, but I've just downloaded Spark 2.2.0 and it is working great.
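
For anyone who wants to try the same workaround, a sketch of fetching the 2.2.0 binaries from the Apache archive (the /opt install path is just an example):

    # Download and unpack the pre-built 2.2.0 distribution, then point SPARK_HOME at it
    wget https://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz
    tar -xzf spark-2.2.0-bin-hadoop2.7.tgz -C /opt
    export SPARK_HOME=/opt/spark-2.2.0-bin-hadoop2.7

    # The banner printed here should confirm the bundled Scala version
    $SPARK_HOME/bin/spark-submit --version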
