hdfs dfs -mkdir: No such file or directory

Hey. I am new to Hadoop and am trying to create a directory in HDFS called twitter_data. I set up my VM on SoftLayer, installed Hadoop, and started it successfully.

This is the command I am trying to run:

hdfs dfs -mkdir hdfs://localhost:9000/user/Hadoop/twitter_data

And it keeps returning this error message:

/usr/local/hadoop/etc/hadoop/hadoop-env.sh: line 2: ./hadoop-env.sh: Permission denied
16/10/19 19:07:03 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `hdfs://localhost:9000/user/Hadoop/twitter_data': No such file or directory

Why does it say there is no such file or directory? I am telling it to create a directory, so doesn't it just need to create it? I assume this is a permissions problem, but I cannot figure it out. Any help from HDFS experts would be appreciated. I have spent too much time on what seems like a simple task.

Thanks in advance.

+5
2 answers

This is because the parent directories do not exist yet. Try hdfs dfs -mkdir -p /user/Hadoop/twitter_data . The -p flag tells it to also create any non-existent parent directories along the path.
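For example, assuming the same NameNode address as in the question (hdfs://localhost:9000 configured as the default file system), a minimal sketch would be:

 $ hdfs dfs -mkdir -p /user/Hadoop/twitter_data
 $ hdfs dfs -ls /user/Hadoop

The second command just lists the parent directory so you can confirm that twitter_data was actually created.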

Regarding the question you asked in the comments, just enter http://<host name of the namenode>:<port number>/ into your browser.
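For example, on a default pseudo-distributed Hadoop 2.x setup the NameNode web UI usually listens on port 50070, so something like http://localhost:50070/ should work; the exact host and port here are an assumption and can be checked against dfs.namenode.http-address in hdfs-site.xml.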

+7

Use the following commands to create the directory:

1) with Hadoop stopped, format the namenode (note that this wipes any existing HDFS data, so only do it on a fresh installation):

 $ hadoop namenode -format 

2) start Hadoop with:

 $ start-all.sh 

3) now create the parent directories first, then create the target directory inside them:

 $ hadoop fs -mkdir /user
 $ hadoop fs -mkdir /user/Hadoop
 $ hadoop fs -mkdir /user/Hadoop/twitter_data

Follow the steps above to resolve the issue; a quick check is sketched below.
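As a quick sanity check (a minimal sketch; the daemon names assume a standard single-node setup, and the path is the one created in step 3), you can confirm that the HDFS daemons are up and that the new directory exists:

 $ jps
 $ hadoop fs -ls /user/Hadoop

jps should list at least NameNode and DataNode, and the -ls output should show the twitter_data directory.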

+4


