How to fix Exception while running Spark-SQL locally

Posted 2019-03-04 13:00

Question:

I am working with Spark-SQL 2.3.1 and I am trying to enable Hive support while creating a session, as below:

.enableHiveSupport()
.config("spark.sql.warehouse.dir", "c://tmp//hive")

I ran the command below:

C:\Software\hadoop\hadoop-2.7.1\bin>winutils.exe chmod 777  C:\tmp\hive
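
You can check whether the change actually took effect with winutils' ls subcommand:

C:\Software\hadoop\hadoop-2.7.1\bin>winutils.exe ls C:\tmp\hive

If the listing still shows rw-rw-rw-, the chmod did not stick on the path Spark actually resolves. In that case a recursive chmod (the -R option, which the Hadoop 2.7 winutils builds accept) is worth trying:

C:\Software\hadoop\hadoop-2.7.1\bin>winutils.exe chmod -R 777 C:\tmp\hive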

While running my program I get:

Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)

How can I fix this issue and run the program on my local Windows machine?

Answer 1:

Try using this command:

hadoop fs -chmod -R 777 /tmp/hive/

This is a Spark (Hive) exception, not a Windows one. You need to set the correct permissions on the directory as seen by the Hadoop file system, not only on your local directory. Note that rw-rw-rw- (666) lacks the execute bits Hive expects on its scratch directory, which is why 777 is used.
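
If a Hadoop client is configured against the same file system, you can confirm the new permissions before re-running the Spark program:

hadoop fs -ls /tmp

The entry for /tmp/hive should now show drwxrwxrwx.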