The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
Hi, I was executing the following Spark code in Eclipse on CDH 5.8 and getting the above RuntimeException:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.hive.HiveContext;

public class HiveConnector {
    public static void main(String[] args) {
        final SparkConf sparkConf = new SparkConf().setMaster("local").setAppName("HiveConnector");
        final JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
        SQLContext sqlContext = new HiveContext(sparkContext);
        DataFrame df = sqlContext.sql("SELECT * FROM test_hive_table1");
        //df.show();
        df.count(); // triggers the query; this is where the scratch-dir exception is thrown
    }
}
According to the exception, /tmp/hive on HDFS should be writable. However, we are executing the Spark job in local mode, which means it is the directory /tmp/hive on the local (Linux) file system, not on HDFS, that lacks write permission.
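A quick way to confirm that it is the local directory at fault is to check it directly. A minimal sketch, assuming the default scratch dir /tmp/hive (the class name is illustrative):

import java.io.File;

public class ScratchDirCheck {
    public static void main(String[] args) {
        // In local mode the scratch dir resolves against the local file system
        File scratch = new File("/tmp/hive");
        System.out.println("exists: " + scratch.exists());
        System.out.println("writable: " + scratch.canWrite());
    }
}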
So I executed the command below to grant permission:
$ sudo chmod -R 777 /tmp/hive
Now it is working for me.
If you are getting the same issue while executing the Spark job in cluster mode, configure the properties below in the hive-site.xml file in the Hive conf directory and restart the Hive server:
<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive</value>
  <description>Scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.scratch.dir.permission</name>
  <value>777</value>
  <description>The permission for the user-specific scratch directories that get created in the root scratch directory</description>
</property>
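Alternatively, the same two properties can be set from the code for the current session instead of editing hive-site.xml. This is only a sketch against the Spark 1.x SQLContext.setConf API used above, and it assumes the settings take effect because they are applied before the first query, when the scratch directory gets created. Continuing from the main method above:

SQLContext sqlContext = new HiveContext(sparkContext);
// Same properties as in hive-site.xml, applied for this session only
sqlContext.setConf("hive.exec.scratchdir", "/tmp/hive");
sqlContext.setConf("hive.scratch.dir.permission", "777");
DataFrame df = sqlContext.sql("SELECT * FROM test_hive_table1");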
On Windows, use the proper 64-bit winutils and set the permission:
winutils.exe chmod -R 777 \tmp\hive
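Note that winutils.exe is only picked up if the hadoop.home.dir system property (or the HADOOP_HOME environment variable) points at the directory that contains bin\winutils.exe. The path below is an assumption; adjust it to your own install:

// Hypothetical install location; the folder must contain bin\winutils.exe
System.setProperty("hadoop.home.dir", "C:\\hadoop");
// ...then create the SparkConf / JavaSparkContext / HiveContext as above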