The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
Hi, I ran the following Spark code in Eclipse on CDH 5.8 and got the RuntimeException above:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.hive.HiveContext;

public class HiveConnector {
    public static void main(String[] args) {
        final SparkConf sparkConf = new SparkConf().setMaster("local").setAppName("HiveConnector");
        final JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);

        // HiveContext needs a writable scratch directory (/tmp/hive by default).
        SQLContext sqlContext = new HiveContext(sparkContext);
        DataFrame df = sqlContext.sql("SELECT * FROM test_hive_table1");
    }
}
According to the exception, /tmp/hive on HDFS should be writable. However, we are running the Spark job in local mode, which means the directory that actually lacks write permission is /tmp/hive on the local (Linux) file system, not on HDFS.
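Before changing anything, you can confirm that the local directory is the culprit. This is a minimal sketch using plain java.io (ScratchDirCheck is a hypothetical class name I chose for illustration); it inspects only the local file system, which is what matters in local mode:

import java.io.File;

public class ScratchDirCheck {
    public static void main(String[] args) {
        // The scratch dir HiveContext uses by default; in local mode this
        // resolves against the local (Linux) file system, not HDFS.
        File scratchDir = new File("/tmp/hive");
        System.out.println("exists:   " + scratchDir.exists());
        System.out.println("readable: " + scratchDir.canRead());
        System.out.println("writable: " + scratchDir.canWrite());
    }
}

If "writable" prints false for the user running the Spark job, the exception above is expected.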
So, I executed the command below to grant permission:
$ sudo chmod -R 777 /tmp/hive
Now it works for me.
If you get the same problem while running the Spark job in cluster mode, you must configure the properties below in the hive-site.xml file in Hive's conf folder and restart the Hive server.
<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive</value>
  <description>Scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.scratch.dir.permission</name>
  <value>777</value>
  <description>The permission for the user-specific scratch directories that get created in the root scratch directory</description>
</property>
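For completeness, the same two properties can also be set on the HiveContext at runtime. This is only a sketch under the assumption that your Hive/Spark versions honor these settings when set programmatically (HiveScratchDirConfig is a hypothetical class name); editing hive-site.xml as shown above remains the reliable route:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.hive.HiveContext;

public class HiveScratchDirConfig {
    public static void main(String[] args) {
        // No setMaster here: in cluster mode the master is supplied by spark-submit.
        SparkConf sparkConf = new SparkConf().setAppName("HiveConnector");
        JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
        SQLContext sqlContext = new HiveContext(sparkContext);

        // Mirror the hive-site.xml properties programmatically.
        sqlContext.setConf("hive.exec.scratchdir", "/tmp/hive");
        sqlContext.setConf("hive.scratch.dir.permission", "777");

        sqlContext.sql("SELECT * FROM test_hive_table1").show();
    }
}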