Upgrading to Spark 1.5.1 causes exceptions at startup

I upgraded to Spark 1.5.1 and ran into problems when using RDD.map(). I get the following exception:

Exception in thread "main" java.lang.IllegalArgumentException
at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:44)
at org.apache.spark.util.ClosureCleaner$.getInnerClosureClasses(ClosureCleaner.scala:81)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:187)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2030)
at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:314)
at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:313)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
at org.apache.spark.rdd.RDD.map(RDD.scala:313)
at com.framedobjects.ClickInvestigation$.main(ClickInvestigation.scala:17)
at com.framedobjects.ClickInvestigation.main(ClickInvestigation.scala)

The failing line maps an RDD[String] to an RDD[CounterRecord]:

val counterRDD = counterTextRDD.map(mapToCounter(_))
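The question doesn't show CounterRecord or mapToCounter, but a minimal sketch of what such a mapping typically looks like might be (both names are assumed shapes, not the asker's actual code):

```scala
// Hypothetical record type and parser; the original CounterRecord
// and mapToCounter definitions are not shown in the question.
case class CounterRecord(name: String, count: Long)

def mapToCounter(line: String): CounterRecord = {
  val fields = line.split(",")
  CounterRecord(fields(0), fields(1).trim.toLong)
}

// In Spark this would then be applied per line of the text RDD:
// val counterRDD = counterTextRDD.map(mapToCounter(_))
```

Note that the exception here is thrown before mapToCounter ever runs: it comes from Spark's ClosureCleaner inspecting the closure's bytecode, so the content of the mapping function is not the culprit.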

My build.sbt looks like this:

name := "exploring-spark"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "1.5.1" withSources,
                        "net.liftweb" %% "lift-json" % "2.6",
                        "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test",
                        "joda-time" % "joda-time" % "2.8.2",
                        "org.yaml" % "snakeyaml" % "1.16",
                        "com.github.seratch" %% "awscala" % "0.3.+" withSources,
                        "org.apache.devicemap" % "devicemap-client" % "1.1.0",
                        "org.apache.devicemap" % "devicemap-data" % "1.0.3")

I suspect a version inconsistency somewhere (ASM?), but I can't pinpoint the problem. I compile against Java 1.8 and run on 1.8.0_40. Any ideas?

Further research shows that this is a problem with Eclipse (Mars) and Scala IDE. I can run the same code in the spark-shell of v1.5.0.

Answer:

In my case, changing the scala compiler target to jvm 1.7 fixed this problem.
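This fix is consistent with the stack trace: the IllegalArgumentException is thrown by the shaded ASM ClassReader inside reflectasm, and older ASM versions reject class files they don't recognize, such as Java 8 bytecode (class-file major version 52). Targeting JVM 1.7 makes the compiler emit bytecode the bundled ASM can read. In sbt, the equivalent setting would be (the `-target` flag is a real Scala 2.11 compiler option; whether it resolves this particular Eclipse setup is an assumption):

```scala
// build.sbt: emit Java 7 bytecode so Spark 1.5.1's shaded ASM
// can read the compiled closures during closure cleaning
scalacOptions += "-target:jvm-1.7"
```

In Scala IDE the same option should be available in the project's Scala Compiler preferences as the bytecode target level.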


Source: https://habr.com/ru/post/1610689/
