Basic permissions error in MR2

A recent run of the basic MR2 examples failed; that is, running the pi example on a pseudo-distributed MR2 HDFS cluster failed with the following error:

07/13/06 21:20:47 ERROR security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=EXECUTE, inode="/tmp/hadoop-yarn/staging":mapred:mapred:drwxrwx---

Why can this happen?

+4
3 answers

The solution is simply to change the permissions on /tmp/hadoop-yarn:

sudo -u hdfs hadoop fs -chmod -R 777 /tmp/hadoop-yarn
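To confirm the change took effect, you can list the directory afterwards; this is a sketch assuming the same pseudo-distributed cluster and paths as above, run on a node with the Hadoop client configured:

```shell
# Inspect the staging area's permissions after the chmod;
# the mode column for the staging subdirectory should now read drwxrwxrwx.
sudo -u hdfs hadoop fs -ls /tmp/hadoop-yarn
```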

I am still curious, though, how this directory could end up with the wrong permissions, given that it was created entirely by Hadoop's own internal lifecycle.

(Comments would be appreciated.)

+4

Add yarn.app.mapreduce.am.staging-dir to your mapred-site.xml as follows:

 <property>
   <name>yarn.app.mapreduce.am.staging-dir</name>
   <value>/user</value>
 </property>

This configuration assumes that the user account (root, in your case) has its own home directory /user/root on HDFS; the staging directory is then created as /user/root/.staging, where the user already has access rights.
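If that home directory does not exist yet, a minimal sketch of creating it as the HDFS superuser might look like this (the root username is taken from the error above; substitute your own account):

```shell
# Create the user's HDFS home directory and hand ownership to that user,
# so that the .staging subdirectory can be created under it at job submission.
sudo -u hdfs hadoop fs -mkdir -p /user/root
sudo -u hdfs hadoop fs -chown root:root /user/root
```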

For more information, see “Step 4: Configure the staging directory” at the following link.

+2

I was getting this error on HDP when running the example wordcount jar:

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=EXECUTE, inode="/user/root/.staging":hdfs:hdfs:drwx------

As the hdfs user, I ran chmod 777 on the /user directory, after which I could run the .jar file as my Ubuntu sudoer user. I could also run the jar as the hdfs user.

-1

Source: https://habr.com/ru/post/1490076/
