Error using Hadoop MapReduce in Eclipse

When I ran a MapReduce job in Eclipse using Hadoop, I got the following error.
Something must have changed along the way, but I cannot figure out what.
Any ideas?

16:35:39 INFO mapred.JobClient: Task Id : attempt_201001151609_0001_m_000006_0, Status : FAILED
java.io.FileNotFoundException: File C:/tmp/hadoop-Shwe/mapred/local/taskTracker/jobcache/job_201001151609_0001/attempt_201001151609_0001_m_000006_0/work/tmp does not exist.
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
    at org.apache.hadoop.mapred.TaskRunner.setupWorkDir(TaskRunner.java:519)
    at org.apache.hadoop.mapred.Child.main(Child.java:155)
3 answers

Given the error message ([...]6_0/work/tmp does not exist), the first things to check are:

Excerpt from the Hadoop Eclipse plugin setup instructions:

To use the plugin, open the Map/Reduce perspective via Window > Open Perspective, and open the Map/Reduce Locations view via Window > Show View.

  • In the Map/Reduce Locations view, define a new Hadoop location.
  • Set the Map/Reduce Master and DFS Master host/port so they match the "mapred.job.tracker" and "fs.default.name" values in conf/hadoop-site.xml. If they are not set there, the defaults come from hadoop-default.xml and hadoop-env.sh.
  • Under "Advanced Parameters", double-check the "mapred.job.tracker" value as well.
  • Make sure Eclipse runs as the same user that runs Hadoop, e.g. the "hadoop" user, so that permissions on the local job cache directories match.
  • Check that the local temp/work directories the TaskTracker uses actually exist and are writable.

Check core-site.xml and hdfs-site.xml, and make sure the default filesystem URI is what you expect: it should point at HDFS (hdfs://localhost:[port]) rather than at the local filesystem (file:///).
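A sketch of what that looks like in core-site.xml; the port is a placeholder (the original answer does not state one), so use your NameNode's actual port:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- If this property is left unset it defaults to file:///, and jobs
       then run against the local filesystem instead of HDFS.
       The port below is a placeholder value. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```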


Usually, if you use CDH 5 (for example the Cloudera QuickStart VM), the JobTracker and NameNode ports are 8021 and 8020 respectively, unless you have done additional configuration.
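Assuming those CDH 5 defaults, the corresponding configuration fragments would look roughly like this; the hostname quickstart.cloudera is the QuickStart VM's default and is an assumption here, as is the use of MRv1-style mapred.job.tracker:

```xml
<!-- core-site.xml: NameNode on port 8020 (CDH 5 default) -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://quickstart.cloudera:8020</value>
</property>

<!-- mapred-site.xml: JobTracker on port 8021 (CDH 5 default, MRv1) -->
<property>
  <name>mapred.job.tracker</name>
  <value>quickstart.cloudera:8021</value>
</property>
```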


Source: https://habr.com/ru/post/1728599/
