I got an exception when sending a MapReduce job from a remote system:
10/13/28 6:49:52 PM ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/F:/Workspaces/Test/Hadoop/test
My Hadoop and MapReduce environment is configured on a Linux machine. I am submitting a word count job from a local Windows PC as follows:
public static void main(String[] args) throws Exception {
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("root");
    try {
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                JobConf conf = new JobConf(MapReduce.class);
                conf.set("mapred.job.name", "MyApp");
                conf.set("mapred.job.tracker", "192.168.1.149:9001");
                conf.set("fs.default.name", "hdfs://192.168.1.149:9000");
                conf.set("hadoop.job.ugi", "root");

                conf.setOutputKeyClass(Text.class);
                conf.setOutputValueClass(IntWritable.class);

                conf.setMapperClass(Map.class);
                conf.setCombinerClass(Reduce.class);
                conf.setReducerClass(Reduce.class);

                conf.setInputFormat(TextInputFormat.class);
                conf.setOutputFormat(TextOutputFormat.class);

                FileInputFormat.setInputPaths(conf, new Path("test"));
                FileOutputFormat.setOutputPath(conf, new Path("test"));

                JobClient.runJob(conf);
                return null;
            }
        });
    } catch (Exception e) {
        e.printStackTrace();
    }
}
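In case it matters, the Map and Reduce classes referenced above are nested inside the MapReduce class and are essentially the standard word count mapper and reducer from the Hadoop tutorial; roughly this (sketch, my actual code may differ slightly):

import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

public class MapReduce {

    public static class Map extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            // Emit (word, 1) for every whitespace-separated token in the line
            StringTokenizer tokenizer = new StringTokenizer(value.toString());
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    public static class Reduce extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            // Sum up the counts emitted for each word
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    // main() as shown above
}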
where 192.168.1.149 is the Linux machine on which Hadoop is configured. The Hadoop services are running and reachable over the network: the test directory was created from the same Windows machine with the same Java API, but the MapReduce job does not run.
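The directory was created with roughly this kind of HDFS client code (a simplified sketch, without the doAs wrapper; my actual code may differ slightly):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateTestDir {
    public static void main(String[] args) throws Exception {
        // Point the client at the remote HDFS NameNode
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://192.168.1.149:9000");

        FileSystem fs = FileSystem.get(conf);
        // The relative path "test" resolves to /user/<user>/test on HDFS
        boolean created = fs.mkdirs(new Path("test"));
        System.out.println("created: " + created);
        fs.close();
    }
}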
**Please help.**