How do I downgrade the Spark version? What could the other solutions be? I have to connect my Hive tables to Spark using a SparkSession, but my Spark version is not supported by Zeppelin.
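This is roughly what I am trying to run in a Zeppelin paragraph (a minimal sketch; the database and table names are placeholders):

```scala
import org.apache.spark.sql.SparkSession

// Build a SparkSession with Hive support so tables in the Hive metastore are visible.
val spark = SparkSession.builder()
  .appName("HiveFromZeppelin")
  .enableHiveSupport()
  .getOrCreate()

// Query an existing Hive table (database/table names are placeholders).
val df = spark.sql("SELECT * FROM mydb.my_hive_table LIMIT 10")
df.show()
```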
Zeppelin 0.7.2 supports Spark 2.1.0. Here are two reasons why you cannot simply use Spark 2.2 with it.
[1] Zeppelin 0.7.2 marks Spark 2.2+ as an unsupported version:
https://github.com/apache/zeppelin/blob/v0.7.2/spark/src/main/java/org/apache/zeppelin/spark/SparkVersion.java#L40
[2] Even if you change that constant and rebuild, you may still fail to run Zeppelin 0.7.2 with Spark 2.2: Spark 2.2 dropped support for Java 7, while Zeppelin 0.7.2 was built with JDK 7, so you would also need to rebuild it with JDK 8.
One workaround is to specify JAVA_HOME in the Spark interpreter settings (for the 2.2 interpreter), as commented here. That works because only the Spark interpreter (for 2.2) requires Java 8; Zeppelin itself does not need it.
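Concretely, the workaround amounts to adding a property like the following to the Spark interpreter in the Zeppelin Interpreter settings page (the path is only an example; point it at wherever JDK 8 is installed on your machine):

```
# Spark interpreter properties (example value, adjust to your JDK 8 location)
JAVA_HOME = /usr/lib/jvm/java-8-openjdk-amd64
```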
In short: either stay on Spark 2.1.0 with Zeppelin 0.7.2, or keep Spark 2.2 and point the Spark interpreter's JAVA_HOME at a JDK 8 installation.
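If you take the downgrade route, pointing Zeppelin at a Spark 2.1.0 installation is usually just a matter of setting SPARK_HOME; the path below is an example, not a required location:

```bash
# conf/zeppelin-env.sh -- point Zeppelin at an existing Spark 2.1.0 install (example path)
export SPARK_HOME=/opt/spark-2.1.0-bin-hadoop2.7
```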