I am running Hadoop 2.2.0.2.0.6.0-101 on a single node. From Eclipse, as a regular user, I am trying to run a Java MRD program that writes data to an existing Hive table, and I get this exception:
org.apache.hadoop.security.AccessControlException: Permission denied: user=dev, access=WRITE, inode="/apps/hive/warehouse/testids":hdfs:hdfs:drwxr-xr-x
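For context, the output side of the job is wired up through HCatOutputFormat, roughly like this (a trimmed-down sketch rather than my exact code; the class name and the omitted input/mapper setup are placeholders, and some HCatOutputFormat signatures differ slightly between HCatalog versions):
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hcatalog.data.DefaultHCatRecord;
import org.apache.hcatalog.data.schema.HCatSchema;
import org.apache.hcatalog.mapreduce.HCatOutputFormat;
import org.apache.hcatalog.mapreduce.OutputJobInfo;

public class WriteToTestIds {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "write-to-testids");
        job.setJarByClass(WriteToTestIds.class);

        // ... input format and mapper setup omitted ...

        // Point the output at the existing, non-partitioned table default.testids.
        // The partition spec is null because the table is not partitioned.
        HCatOutputFormat.setOutput(job, OutputJobInfo.create("default", "testids", null));

        // Reuse the table's own schema for the emitted records
        // (in some HCatalog versions getTableSchema takes a Configuration instead).
        HCatSchema schema = HCatOutputFormat.getTableSchema(job);
        HCatOutputFormat.setSchema(job, schema);

        job.setOutputFormatClass(HCatOutputFormat.class);
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(DefaultHCatRecord.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}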
This happens because an ordinary user does not have write access to the warehouse directory; only the hdfs user does:
drwxr-xr-x - hdfs hdfs 0 2014-03-06 16:08 /apps/hive/warehouse/testids
drwxr-xr-x - hdfs hdfs 0 2014-03-05 12:07 /apps/hive/warehouse/test
To get around this, I change the permissions on the warehouse directory so that everyone has write access:
[hdfs@localhost wks]$ hadoop fs -chmod -R a+w /apps/hive/warehouse
[hdfs@localhost wks]$ hadoop fs -ls /apps/hive/warehouse
drwxrwxrwx - hdfs hdfs 0 2014-03-06 16:08 /apps/hive/warehouse/testids
drwxrwxrwx - hdfs hdfs 0 2014-03-05 12:07 /apps/hive/warehouse/test
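(The same relaxation can also be applied from Java through the HDFS FileSystem API; this is just a sketch equivalent to the chmod above, assuming fs.defaultFS in the configuration points at this cluster.)
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class OpenUpWarehouseDirs {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Equivalent of "hadoop fs -chmod 777" on one table directory;
        // setPermission is not recursive, so each directory is handled separately.
        fs.setPermission(new Path("/apps/hive/warehouse/testids"),
                new FsPermission((short) 0777));
        fs.close();
    }
}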
Opening up the permissions helps to some extent: the MRD program can now write to the warehouse directory as a regular user, but only once. When I try to write data to the same table a second time, I get:
ERROR security.UserGroupInformation: PriviledgedActionException as:dev (auth:SIMPLE) cause:org.apache.hcatalog.common.HCatException : 2003 : Non-partitioned table already contains data : default.testids
On top of that, the permissions on the table's directory appear to have been reset by Hive, so the write access I granted is gone again:
[hdfs@localhost wks]$ hadoop fs -ls /apps/hive/warehouse
drwxr-xr-x - hdfs hdfs 0 2014-03-11 12:19 /apps/hive/warehouse/testids
drwxrwxrwx - hdfs hdfs 0 2014-03-05 12:07 /apps/hive/warehouse/test
Why does Hive reset these permissions, and how should Hive be configured so that a regular user can write data to a Hive table?
Thanks!