Hadoop java.net.URISyntaxException: Relative path in absolute URI: rsrc:hbase-common-0.98.1-hadoop2.jar

I have been working on a MapReduce job that connects to HBase, and I cannot figure out why I am running into this error:

Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: rsrc:hbase-common-0.98.1-hadoop2.jar
    at org.apache.hadoop.fs.Path.initialize(Path.java:206)
    at org.apache.hadoop.fs.Path.<init>(Path.java:172)
    at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.findOrCreateJar(TableMapReduceUtil.java:703)
    at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:656)
    at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars(TableMapReduceUtil.java:573)
    at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:617)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:398)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:356)
    at com.ancestry.bigtree.hfile.JsonToHFileDriver.run(JsonToHFileDriver.java:117)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at com.ancestry.bigtree.hfile.JsonToHFileDriver.main(JsonToHFileDriver.java:69)
    ... 10 more
Caused by: java.net.URISyntaxException: Relative path in absolute URI: rsrc:hbase-common-0.98.1-hadoop2.jar
    at java.net.URI.checkPath(URI.java:1804)
    at java.net.URI.<init>(URI.java:752)
    at org.apache.hadoop.fs.Path.initialize(Path.java:203)

If I do not include the HBase libraries, the job runs fine. Where is the relative path being generated? How can I make the generated paths absolute?

In my code, I have two lines:

TableMapReduceUtil.addHBaseDependencyJars(conf);
HFileOutputFormat2.configureIncrementalLoad(job, htable);

If I remove them, the job runs, but it does not do what I need. What I am ultimately trying to do is create HFiles for use with the HBase bulk loader.

Environment: HBase 0.96.1.2.0.10.0-1-hadoop2, Hadoop 2.2.0.2.0.10.0-1

Thanks in advance for any help or guidance.

1 answer

The exception is a bit misleading; there is no genuinely relative path being processed. The real problem is that Hadoop's Path does not support ":" in file names. In your case, "rsrc:hbase-common-0.98.1-hadoop2.jar" is parsed with "rsrc" as the URI scheme, whereas I suspect what you actually intended to add was a resource like "file:///path/to/your/jarfile/rsrc:hbase-common-0.98.1-hadoop2.jar". This old JIRA discusses the illegal ":" character:

https://issues.apache.org/jira/browse/HADOOP-3257
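The failure can be reproduced with plain java.net.URI. Hadoop's Path roughly splits the string on the first ":" that occurs before any "/", treats the prefix as the scheme, and hands the pieces to URI's multi-argument constructor, which rejects any non-absolute path once a scheme is present. A minimal sketch (the class name and the splitting logic here are illustrative, not Hadoop's actual code):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class ColonJarNameDemo {
    public static void main(String[] args) {
        String jarName = "rsrc:hbase-common-0.98.1-hadoop2.jar";

        // Roughly what Hadoop's Path does: everything before the first ':'
        // (when it appears before any '/') is taken as the URI scheme.
        int colon = jarName.indexOf(':');
        String scheme = jarName.substring(0, colon);   // "rsrc"
        String path = jarName.substring(colon + 1);    // "hbase-common-0.98.1-hadoop2.jar"

        try {
            // The multi-argument URI constructor rejects a path that does
            // not start with '/' whenever a scheme is present.
            new URI(scheme, null, path, null, null);
            System.out.println("parsed OK");
        } catch (URISyntaxException e) {
            // → Relative path in absolute URI: rsrc:hbase-common-0.98.1-hadoop2.jar
            System.out.println(e.getMessage());
        }
    }
}
```

This is exactly the java.net.URI.checkPath frame you see at the bottom of the stack trace.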

Note that you probably won't be able to use that absolute path either, since the file name itself still contains a ":". You can try escaping the file name, e.g. "rsrc%3Ahbase-common-0.98.1-hadoop2.jar", but then the file may not be resolved correctly on the other end where it is used.
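To illustrate the escaping caveat: a percent-encoded colon does produce a string that java.net.URI accepts (as a scheme-less relative reference), but the decoded path still contains the ":", so any later component that re-parses the decoded name can hit the same problem again. A small sketch:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class EscapedColonDemo {
    public static void main(String[] args) throws URISyntaxException {
        // With the ':' percent-encoded there is no scheme delimiter, so the
        // single-string URI constructor parses this as a relative reference.
        URI escaped = new URI("rsrc%3Ahbase-common-0.98.1-hadoop2.jar");

        System.out.println(escaped.getScheme()); // null — no scheme detected
        System.out.println(escaped.getPath());   // "rsrc:hbase-common-0.98.1-hadoop2.jar"
        // getPath() decodes %3A back to ':', which is why the escaped name
        // may not be found correctly on the other end.
    }
}
```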

The best way to fix this is to address the root cause of the "rsrc:hbase-common-0.98.1-hadoop2.jar" entry appearing in the first place; using Eclipse's runnable-JAR export to build your jar is one likely source of the problem (the org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader frame in your stack trace comes from its "Package required libraries" loader, which uses the rsrc: scheme). If possible, try building your jar with something other than Eclipse and see whether the same problem occurs; you can also try selecting the "Extract required libraries into generated JAR" option instead when exporting the runnable jar in Eclipse.

If the uber-jar turns out to be too large, you can instead place the original dependency jars, such as hbase-common-0.98.1-hadoop2.jar, on the classpath on all of your nodes along with any other dependencies you need, and then skip the call to TableMapReduceUtil.addHBaseDependencyJars(conf);.

Here is an old thread from another user who ran into a problem similar to the one you are seeing:

http://lucene.472066.n3.nabble.com/Error-while-running-MapR-program-on-multinode-configuration-td4053610.html


Source: https://habr.com/ru/post/1200451/
