HDFS error: `input': No such file or directory

I installed Hadoop 2.6.0 and I am experimenting with it. I'm trying the pseudo-distributed setup, following the instructions at http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html#Execution. I'm stuck at step 5, which is when I run the command

bin/hdfs dfs -put etc/hadoop input 

I get the following error.

 15/02/02 00:35:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
 put: `input': No such file or directory

Why am I getting this error? How can I solve it?

+6
7 answers

In addition to what Ashrit wrote, you can also add -p, in case the directory has not yet been created.

 bin/hadoop fs -mkdir -p /path/to/hdfs/dir 

Hope this helps someone else.
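For readers unfamiliar with the flag: `hdfs dfs -mkdir -p` behaves like the Unix `mkdir -p`, creating any missing parent directories and not failing if the target already exists. A quick local-filesystem sketch of the same semantics (the /tmp path is just an example):

```shell
# mkdir without -p fails when intermediate directories are missing;
# -p creates every missing level and is safe to repeat.
dir=/tmp/hdfs_demo/path/to/dir
rm -rf /tmp/hdfs_demo
mkdir -p "$dir"    # creates all missing parents
mkdir -p "$dir"    # idempotent: no error the second time
ls -d "$dir"       # prints /tmp/hdfs_demo/path/to/dir
```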

+7

You get this error because no such directory exists in the path. Please see my answer to a similar question, which explains how Hadoop interprets a relative path.

First create a directory using:

 bin/hadoop fs -mkdir input 

and then try the -put command.
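The key point is that a path without a leading slash is resolved against your HDFS home directory, /user/&lt;username&gt;. So `input` in the question actually means /user/&lt;username&gt;/input, and the put fails because that directory does not exist yet. A minimal sketch of the resolution rule (the username `alice` is hypothetical):

```shell
# How HDFS resolves a path argument: absolute paths are used as-is,
# relative paths are prefixed with the user's HDFS home directory.
hdfs_user="alice"    # hypothetical user
rel="input"
case "$rel" in
  /*) resolved="$rel" ;;                        # already absolute
  *)  resolved="/user/${hdfs_user}/${rel}" ;;   # relative -> under HDFS home
esac
echo "$resolved"     # prints /user/alice/input
```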

+3

Just put a "/" in front of input, since it is a directory.

 ./bin/hdfs dfs -put etc/hadoop /input 

Hope this helps.

+1

There are two parts to the question above:

  • The warning means you are running on a 64-bit platform while the Hadoop native library was compiled for 32 bits. It will not affect your code.
  • The actual error is that the file cannot be put into the input folder, because that folder does not exist yet. You need to create it in HDFS with the mkdir command:

hadoop fs -mkdir /hadoopinput

OR (for newer versions):

hdfs dfs -mkdir /hadoopinput

Now you can put the file in the folder:

hdfs dfs -put /Users/{username}/Desktop/file01 /hadoopinput

To verify that the file was copied into the folder, use the following command:

hdfs dfs -ls /hadoopinput

0

SOLUTION:

 1. Make your directory in HDFS: hdfs dfs -mkdir /input_file_name
 2. Copy the data to HDFS: hadoop fs -put filename.txt /node_name/directory_name

0

There are two errors here. The first, about the native hadoop library for your platform, occurs because you did not install winutils for your Hadoop version; see this answer fooobar.com/questions/981841 / ... for more details. The second error, no such file or directory, occurs because you must specify the path correctly. Change directory to hadoop/bin/ and run the following.

Make the directory:

hdfs dfs -mkdir /input

To put a file in the directory:

hdfs dfs -put /path/to/file.txt /input

To check for a file in the directory:

hdfs dfs -ls /input
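As a side note: if you only want to silence the harmless native-library warning rather than fix it, a commonly used workaround is to raise the log level for the NativeCodeLoader class in Hadoop's log4j configuration (typically etc/hadoop/log4j.properties; the exact path may vary by install):

```
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```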

0

Change the user:owner if you want to write any file from root to an HDFS directory directly:

 sudo -u hdfs hdfs dfs -chown root:hdfs /user/file    --{/file}
 sudo -u hdfs hdfs dfs -chmod -R 775 /user/file

Or

 sudo -u hdfs hdfs dfs -chown -R hdfs:hadoop /user/file
 sudo -u hdfs hdfs dfs -chmod -R 1777 /user/file

Then use the put command:

 sudo -u hdfs hdfs dfs -put /root/project/* /file --{/user/file} 

This works for me:

 [root@spark ~]# sudo -u hdfs hdfs dfs -put /root/project/* /file/
 put: 'file/': No such file or directory
 [root@spark ~]# hdfs dfs -put /root/project/* /file
 put: Permission denied: user=root, access=WRITE, inode="/file":hdfs:hadoop:drwxr-xr-t
 [root@spark ~]# sudo -u hdfs hdfs dfs -chown root:hdfs /file
 [root@spark ~]# hdfs dfs -put /root/project/*.csv /file
 [root@spark ~]# hdfs dfs -ls /file

Found 12 items

 -rw-r--r--   1 root hdfs  4662272 2019-04-28 06:23 /file/StokKs.csv
 -rw-r--r--   1 root hdfs   302648 2019-04-28 06:23 /file/Stocks.csv
 -rw-r--r--   1 root hdfs   284628 2019-04-28 06:23 /file/Stocks.csv
 -rw-r--r--   1 root hdfs   568949 2019-04-28 06:23 /file/Satellite.csv
 -rw-r--r--   1 root hdfs   579302 2019-04-28 06:23 /file/Stocks.csv
 -rw-r--r--   1 root hdfs 24805721 2019-04-28 06:23 /file/medical.csv
 -rw-r--r--   1 root hdfs  5650234 2019-04-28 06:23 /file/bank.csv
 -rw-r--r--   1 root hdfs  2893092 2019-04-28 06:23 /file/facebook.csv
0

Source: https://habr.com/ru/post/981838/

