How to downgrade the Spark version? What could be the other solutions? I have to connect my Hive tables to Spark using SparkSession, but the Spark version is not supported by Zeppelin.
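For context, what I am trying to do looks roughly like this (the database and table names below are just placeholders):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Build a SparkSession with Hive support so Hive tables are visible to Spark SQL.
SparkSession spark = SparkSession.builder()
    .appName("HiveOnZeppelin")
    .enableHiveSupport()
    .getOrCreate();

// Query an existing Hive table (placeholder names).
Dataset<Row> df = spark.sql("SELECT * FROM my_db.my_table LIMIT 10");
df.show();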
Answer 1:
Here are 2 reasons.
[1] Zeppelin 0.7.2 marks Spark 2.2+ as an unsupported version:
https://github.com/apache/zeppelin/blob/v0.7.2/spark/src/main/java/org/apache/zeppelin/spark/SparkVersion.java#L40
public static final SparkVersion UNSUPPORTED_FUTURE_VERSION = SPARK_2_2_0;
[2] Even if you change the constant and rebuild, you might still fail to run Zeppelin 0.7.2 with Spark 2.2:
- https://spark.apache.org/releases/spark-release-2-2-0.html
Spark 2.2 dropped support for Java 7, while Zeppelin 0.7.2 was built with JDK 7, so you need to rebuild Zeppelin with JDK 8.
One workaround you can use is specifying JAVA_HOME in the Spark interpreter (for 2.2), as commented here:
- https://github.com/apache/zeppelin/pull/2486#issuecomment-314954959
That works because only the Spark interpreter (for 2.2) requires Java 8; Zeppelin itself does not need it.
In short:
- Modify the constant above and rebuild if you want to run Spark 2.2 on Zeppelin branch-0.7 (see the sketch below).
- Use JDK 8 for the Spark interpreter.
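A minimal sketch of that branch-0.7 change, assuming the neighbouring constants in SparkVersion.java are declared through the same fromVersionString factory (only the UNSUPPORTED_FUTURE_VERSION line above is confirmed, so double-check the file before patching):

// spark/src/main/java/org/apache/zeppelin/spark/SparkVersion.java (branch-0.7)
// Add a later cutoff so Spark 2.2.x is no longer rejected at interpreter startup.
// The fromVersionString factory is an assumption based on the surrounding constants.
public static final SparkVersion SPARK_2_3_0 = SparkVersion.fromVersionString("2.3.0");

// Previously: public static final SparkVersion UNSUPPORTED_FUTURE_VERSION = SPARK_2_2_0;
public static final SparkVersion UNSUPPORTED_FUTURE_VERSION = SPARK_2_3_0;

After that change, rebuild Zeppelin with JDK 8 so the patched Spark interpreter can actually run against Spark 2.2.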
Answer 2:
Zeppelin 0.7.2 supports Spark 2.1.0, so downgrading to Spark 2.1.0 is the other option.