I am using Ubuntu 14.04 LTS, Java 8, and Hadoop 2.5.1. I followed this installation guide for all the components (sorry for not using Michael Noll's guide). The problem I am facing is that when I run start-dfs.sh, I get the following output:
    oroborus@Saras-Dell-System-XPS-L502X:~$ start-dfs.sh
    14/11/12 16:12:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Starting namenodes on [localhost]
    localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-oroborus-namenode-Saras-Dell-System-XPS-L502X.out
    localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-oroborus-datanode-Saras-Dell-System-XPS-L502X.out
    Starting secondary namenodes [0.0.0.0]
    0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-oroborus-secondarynamenode-Saras-Dell-System-XPS-L502X.out
    14/11/12 16:12:48 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Then, after running start-yarn.sh (which works fine) and jps, I get the following output:
    oroborus@Saras-Dell-System-XPS-L502X:~$ jps
    9090 NodeManager
    5107 JobHistoryServer
    8952 ResourceManager
    12442 Jps
    11981 NameNode
The output should also contain a DataNode, but it does not. From Googling and searching Stack Overflow I learned that the cause of such failures usually shows up in the logs, so here is the DataNode log. (Only part of the error; if you need more, let me know.)
    2014-11-08 23:30:32,709 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
    2014-11-08 23:30:33,132 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /usr/local/hadoop_store/hdfs/datanode :
    EPERM: Operation not permitted
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:226)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:642)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:472)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:126)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:142)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:1866)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1908)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1890)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1782)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1829)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2005)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2029)
    2014-11-08 23:30:33,134 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
    java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/usr/local/hadoop_store/hdfs/datanode/"
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1917)
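Reading the trace, the chmod on the storage directory fails with EPERM, which I take to mean that the user starting the DataNode does not own that directory. A check along these lines should show the owner (the path is the one from the log above; oroborus is the user I start Hadoop as):

    # Show owner and permissions of the DataNode storage directory
    ls -ld /usr/local/hadoop_store/hdfs/datanode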
Now I am not sure how to fix this.
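My best guess is that the storage directory was created with sudo and is therefore owned by root, so the DataNode process (running as oroborus) cannot chmod it. Something like this is what I would try, though I am not sure it is the right or complete fix:

    # Hand the whole Hadoop storage tree to the user running the daemons
    # (oroborus here; the path is taken from the log above)
    sudo chown -R oroborus:oroborus /usr/local/hadoop_store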
Help is appreciated.
P.S. I tried many forums, but none of them could solve this problem for me. Hence the question.