The exception is a bit misleading; there is no real relative path being processed. The actual problem is that Hadoop's "Path" does not support ":" in file names. In your case, "rsrc:hbase-common-0.98.1-hadoop2.jar" is parsed with "rsrc" as the URI scheme, whereas I suspect you really intended to add a resource file like "/path/to/your/jarfile/rsrc:hbase-common-0.98.1-hadoop2.jar". This old JIRA discusses why the colon is illegal:
https://issues.apache.org/jira/browse/HADOOP-3257
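Hadoop's Path is backed by a URI internally, so everything before the first ":" is treated as a scheme (like "hdfs" or "file"), not as part of the file name. A minimal plain-JDK sketch of that parsing behavior, using java.net.URI as a stand-in for Hadoop's Path:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class SchemeDemo {
    // Returns what a URI parser takes as the scheme of the given name,
    // or null if the name cannot be parsed as a URI at all.
    static String schemeOf(String name) {
        try {
            return new URI(name).getScheme();
        } catch (URISyntaxException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        // "rsrc" is misread as a scheme, just like "hdfs" would be:
        System.out.println(schemeOf("rsrc:hbase-common-0.98.1-hadoop2.jar")); // rsrc
        System.out.println(schemeOf("hdfs://namenode/user/foo.jar"));         // hdfs
    }
}
```

This is why the error complains about the path rather than the colon itself: by the time validation runs, "rsrc" has already been split off as a scheme.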
Note that you probably won't be able to use that absolute path either, since the file name still contains the ":". You can try percent-escaping the file name, for example "rsrc%3Ahbase-common-0.98.1-hadoop2.jar", but then it may not be resolved correctly at the other end where it is consumed.
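The "%3A" form above is ordinary percent-encoding of the colon (0x3A). A quick sketch of producing it with the JDK's URLEncoder; whether the consuming end decodes it again is exactly the open question, so test this end to end before relying on it:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class EscapeDemo {
    // Percent-encode a file name so ':' no longer looks like a scheme
    // separator; letters, digits, '.', '-' and '_' pass through unchanged.
    static String escape(String name) {
        try {
            return URLEncoder.encode(name, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new AssertionError(e); // UTF-8 is always supported
        }
    }

    public static void main(String[] args) {
        System.out.println(escape("rsrc:hbase-common-0.98.1-hadoop2.jar"));
        // rsrc%3Ahbase-common-0.98.1-hadoop2.jar
    }
}
```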
The best way to fix this is to address the root cause of "rsrc:hbase-common-0.98.1-hadoop2.jar" appearing in the first place; using Eclipse to create your runnable jar is one likely source of the problem. If possible, try building your jar with something other than Eclipse and see whether the same problem occurs. You can also try selecting the "Package required libraries into generated JAR" flag when creating the jar in Eclipse.
If the uber-jar turns out to be too large, you can instead place the original dependency jars, such as hbase-common-0.98.1-hadoop2.jar, on the classpath of all your nodes along with any other dependencies you need, and then skip the call to "TableMapReduceUtil.addHBaseDependencyJars(conf);".
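For context on what skipping that call gives up: TableMapReduceUtil.addHBaseDependencyJars works by appending dependency jar paths to the job's "tmpjars" configuration property, which is what ships them to the cluster at submit time. A rough stand-in illustrating that mechanism (a Map plays the role of Hadoop's Configuration, and the jar paths are hypothetical); if the jars are already on every node's classpath, none of this is needed:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TmpJarsDemo {
    // Append a jar path to the comma-separated "tmpjars" property,
    // mimicking how dependency jars get attached to a submitted job.
    static void addTmpJar(Map<String, String> conf, String jarPath) {
        String existing = conf.get("tmpjars");
        conf.put("tmpjars", existing == null ? jarPath : existing + "," + jarPath);
    }

    public static void main(String[] args) {
        Map<String, String> conf = new LinkedHashMap<>();
        addTmpJar(conf, "/opt/hbase/lib/hbase-common-0.98.1-hadoop2.jar");
        addTmpJar(conf, "/opt/hbase/lib/hbase-client-0.98.1-hadoop2.jar");
        System.out.println(conf.get("tmpjars"));
    }
}
```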
Here is an old thread from another user who ran into a similar problem:
http://lucene.472066.n3.nabble.com/Error-while-running-MapR-program-on-multinode-configuration-td4053610.html