I am running Spark 1.4.1 with Scala 2.11, offline on my local machine. I have the following code:
object Parser {
  def main(args: Array[String]): Unit = {
    if (args.length < 6) {
      System.err.println("Usage: my.Parser <host> <input_loc> " +
        "<input_dt> <match_term> <out_loc> <file_type>")
      System.exit(1)
    }

    println(" *** Starting summariziation process *** ")

    val host: String = args(0)
    val inploc: String = args(1)
    val inpdate: String = args(2)
    val matchTerm: String = args(3)
    val outloc: String = args(4)
    val fileType: String = args(5)

    println(" <------------------------------------------- debug ::0.0 ")

    val typesMap = Map("data" -> "data", "rider" -> "mon", "sms" -> "sms", "voice" -> "rec", "voucher" -> "vou")
    println(" typesMap - " + typesMap)

    // .........
  }
}
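In case the build setup matters: the project is compiled with Scala 2.11 against Spark 1.4.1. Roughly, the sbt configuration looks like the sketch below (I am reproducing it from memory, so take the exact version strings and the "provided" scope as placeholders rather than my literal build file):

    // build.sbt (sketch): Scala 2.11 project depending on Spark 1.4.1
    scalaVersion := "2.11.7"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1" % "provided"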
When I run this code through the spark-shell, it works fine. But when I run it through spark-submit as a compiled class, it fails with the following error:
*** Starting summariziation process ***
<------------------------------------------------- debug ::0.0
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
at my.Parser$.main(Parser.scala:138)
All I want is a simple lookup to map the file-type argument to the kind of files to process. It seems the line where I create the map is the one throwing the error, and I am really puzzled why it works in the spark-shell but fails under spark-submit.
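To make it concrete, here is that map construction on its own, outside Spark, as a small sketch I wrote to narrow the problem down (as far as I understand, the "key" -> value syntax is what compiles to the Predef.ArrowAssoc call named in the stack trace; the tuple version below is the same map built without that sugar):

    object MapCheck {
      def main(args: Array[String]): Unit = {
        // "k" -> v is sugar for Predef.ArrowAssoc(k).->(v), the call the
        // NoSuchMethodError points at.
        val typesMap = Map("data" -> "data", "rider" -> "mon", "sms" -> "sms",
                           "voice" -> "rec", "voucher" -> "vou")

        // Same map built from plain tuples, without going through ArrowAssoc.
        val typesMapTuples = Map(("data", "data"), ("rider", "mon"), ("sms", "sms"),
                                 ("voice", "rec"), ("voucher", "vou"))

        // The lookup I actually need: resolve the file-type argument.
        val fileType = if (args.nonEmpty) args(0) else "voice"
        println(typesMap.getOrElse(fileType, "unknown"))        // "rec" for "voice"
        println(typesMapTuples.getOrElse(fileType, "unknown"))  // same result
      }
    }

The tuple form would sidestep the ArrowAssoc call, but I would still like to understand why the -> form fails only under spark-submit.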
Has anyone encountered this problem? Can anyone suggest how to fix it? Thanks in advance for your help!