MapReduce client jars for Hadoop 2.4.1 in Eclipse

When I run my Hadoop MapReduce word count jar from the shell, it works correctly and the output is generated as expected.

Since I use YARN with Hadoop 2.4.1, when I run the MapReduce sample program from Eclipse, the map phase completes but the job fails during the reduce phase.

Clearly, the problem is with the jar configuration.

Please find the jars I added ...

[Screenshot: the dependent jars added to the Eclipse build path]

This is the error I received:

    INFO: map task executor complete.
    Nov 21, 2014 8:50:35 PM org.apache.hadoop.mapred.LocalJobRunner$Job run
    WARNING: job_local1638918104_0001
    java.lang.Exception: java.lang.NoSuchMethodError: org.apache.hadoop.mapred.ReduceTask.setLocalMapFiles(Ljava/util/Map;)V
        at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
    Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.mapred.ReduceTask.setLocalMapFiles(Ljava/util/Map;)V
        at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:722)

    Exception in thread "Thread-12" java.lang.NoClassDefFoundError: org/apache/commons/httpclient/HttpMethod
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:562)
    Caused by: java.lang.ClassNotFoundException: org.apache.commons.httpclient.HttpMethod
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
        ... 1 more

1 answer

According to the screenshot, you are manually adding all the dependent jars to the classpath. Using Maven for this is highly recommended, since it automates adding the dependent jars to the classpath; you only need to declare the main dependencies.
I used the following dependencies in my pom.xml, which let me run without any problems:

    <properties>
        <hadoop.version>2.5.2</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-api</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-auth</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-server-nodemanager</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-server-resourcemanager</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
    </dependencies>
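With these dependencies on the build path, the stock WordCount sample runs from Eclipse against the local job runner without hunting for individual jars. For reference, here is a minimal sketch of that driver; it is the standard Hadoop example code, so only the package and project wiring are yours to choose:

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Emits (word, 1) for every token in each input line.
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private final static IntWritable one = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, one);
                }
            }
        }

        // Sums the counts for each word; also used as the combiner.
        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

In the Eclipse Run Configuration, pass an input directory and a not-yet-existing output directory as the two program arguments.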

Coming to your problem: I counted the jars in your classpath, and there are exactly 82 jar files.
Hunting down every required jar like that is a tedious job. You can add the jars by functional area instead HERE.
Another workaround would be to add all the jar files from your installed Hadoop directory, i.e. everything under <hadoop-installed>/share/hadoop/, pulling in all the jars from each lib folder. That is the best you can do manually. Or:
Add only the specific Avro jars, since according to the screenshot the exception is thrown by an Avro class. This may solve the Avro jar problem, but you may then run into other dependency problems. I faced the same issue when working with Hadoop V1, so later I switched to Maven with Hadoop V2; since then there are no worries about dependent jars, and the focus stays on Hadoop and the business logic. :)
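If you do stick with a hand-assembled classpath, a quick way to spot the version mix-up behind a NoSuchMethodError like the one above is to print where the offending class is actually loaded from. Here is a minimal sketch, assuming hadoop-mapreduce-client-core is on the classpath (ClasspathCheck is just an illustrative name, not a Hadoop class):

    import java.io.File;

    import org.apache.hadoop.mapred.ReduceTask;

    public class ClasspathCheck {
        public static void main(String[] args) {
            // Jar that ReduceTask was actually loaded from; if this is not
            // the Hadoop version you expect, an older jar is shadowing it.
            System.out.println(ReduceTask.class.getProtectionDomain()
                    .getCodeSource().getLocation());

            // Full runtime classpath, one entry per line, to scan for duplicates.
            for (String entry : System.getProperty("java.class.path")
                    .split(File.pathSeparator)) {
                System.out.println(entry);
            }
        }
    }

A NoSuchMethodError at runtime, despite clean compilation, typically means two different Hadoop versions are on the same classpath (e.g. LocalJobRunner from one release calling ReduceTask from another); if two different hadoop-mapreduce jars show up in the listing, remove the older one.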
Hope this helps you.


Source: https://habr.com/ru/post/1206385/

