Flink: Scala version conflict?

I am trying to compile a Kafka sample from here in IntelliJ. After resolving many dependency problems, I have run into this error, which I can't get past:

    15/10/25 12:36:34 ERROR actor.ActorSystemImpl: Uncaught fatal error from thread [flink-akka.actor.default-dispatcher-4] shutting down ActorSystem [flink]
    java.lang.NoClassDefFoundError: scala/runtime/AbstractPartialFunction$mcVL$sp
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at org.apache.flink.runtime.jobmanager.MemoryArchivist.handleMessage(MemoryArchivist.scala:80)
        at org.apache.flink.runtime.FlinkActor$class.receive(FlinkActor.scala:32)
        at org.apache.flink.runtime.jobmanager.MemoryArchivist.org$apache$flink$runtime$LogMessages$$super$receive(MemoryArchivist.scala:59)
        at org.apache.flink.runtime.LogMessages$class.receive(LogMessages.scala:26)
        at org.apache.flink.runtime.jobmanager.MemoryArchivist.receive(MemoryArchivist.scala:59)
        at akka.actor.ActorCell.newActor(ActorCell.scala:567)
        at akka.actor.ActorCell.create(ActorCell.scala:587)
        at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:460)
        at akka.actor.ActorCell.systemInvoke(ActorCell.scala:482)
        at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:282)
        at akka.dispatch.Mailbox.run(Mailbox.scala:223)
        at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
    Caused by: java.lang.ClassNotFoundException: scala.runtime.AbstractPartialFunction$mcVL$sp
        at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 28 more

Several things I came across suggest that this is a problem with the Scala version. List of my current libraries:

    flink-runtime-1.0-SNAPSHOT
    flink-streaming-java-1.0-SNAPSHOT
    flink-connector-kafka-1.0-SNAPSHOT
    flink-java8-1.0-SNAPSHOT
    flink-core-1.0-SNAPSHOT
    flink-java-1.0-SNAPSHOT
    org.apache.hadoop:hadoop-core:1.2.1
    flink-clients-1.0-SNAPSHOT
    org.apache.kafka:kafka-clients:0.8.2.2
    org.apache.kafka:kafka_2.11:0.8.2.2
    flink-optimizer-1.0-SNAPSHOT
    org.apache.sling:org.apache.sling.commons.json:2.0.6
    de.javakaffee:kryo-serializers:0.28
    com.github.scopt:scopt_2.11:3.3.0
    org.clapper:grizzled-slf4j_2.9.0:0.6.6
    com.typesafe.akka:akka-osgi_2.11:2.4.0
    com.typesafe.akka:akka-slf4j_2.11:2.4.0
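As a quick sanity check on a list like this, a small script (a hypothetical helper, not part of Flink or Maven) can scan artifact IDs for Scala-version suffixes such as `_2.10` or `_2.11` and flag a mix:

```python
import re

# Matches a Scala-version suffix in a Maven coordinate's artifact ID,
# e.g. "_2.11" in "org.apache.kafka:kafka_2.11:0.8.2.2" or the
# three-part "_2.9.0" in "org.clapper:grizzled-slf4j_2.9.0:0.6.6".
SCALA_SUFFIX = re.compile(r"_(2\.\d+(?:\.\d+)?)(?::|$)")

def scala_versions(coordinates):
    """Return the set of Scala version suffixes found in artifact IDs."""
    found = set()
    for coord in coordinates:
        match = SCALA_SUFFIX.search(coord)
        if match:
            found.add(match.group(1))
    return found

if __name__ == "__main__":
    deps = [
        "org.apache.kafka:kafka_2.11:0.8.2.2",
        "com.github.scopt:scopt_2.11:3.3.0",
        "org.clapper:grizzled-slf4j_2.9.0:0.6.6",
        "com.typesafe.akka:akka-osgi_2.11:2.4.0",
    ]
    versions = scala_versions(deps)
    if len(versions) > 1:
        print("Mixed Scala versions:", sorted(versions))
```

Note that this only catches explicit suffixes; artifacts without one (like the Flink snapshots above) still need to be checked against the Scala version they were built with.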

Any suggestions on where I went wrong?

1 answer

The problem is indeed a Scala version mismatch. You are mixing dependencies built for Scala 2.11, for example org.apache.kafka:kafka_2.11:0.8.2.2, with Flink dependencies, which are built for Scala 2.10 by default.

One of the dependencies built for Scala 2.11 pulls in the scala-library:2.11 jar, which replaces the scala-library:2.10 dependency that the Flink dependencies need. Either use binaries built for Scala 2.10 for your non-Flink dependencies, or build and install Flink with Scala 2.11. See https://ci.apache.org/projects/flink/flink-docs-master/setup/building.html#build-flink-for-a-specific-scala-version for how to build Flink for a different Scala version.
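In Maven terms, the first option might look like the fragment below, a sketch that assumes you stay on the default Scala 2.10 Flink binaries and that the `_2.10` builds of your other dependencies exist on Maven Central (Kafka 0.8.2.2 and scopt 3.3.0 are both published for Scala 2.10; version numbers are taken from the question):

```xml
<!-- Sketch only: pick ONE Scala version and use it consistently
     across every Scala-suffixed artifact in the pom.xml. -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.2.2</version>
</dependency>
<dependency>
    <groupId>com.github.scopt</groupId>
    <artifactId>scopt_2.10</artifactId>
    <version>3.3.0</version>
</dependency>
```

Running `mvn dependency:tree` afterwards is a good way to confirm that only one scala-library version remains on the classpath.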

Kafka example

If you just want to run the Kafka example above against 0.10-SNAPSHOT, you need to change the Flink version in the pom.xml file and use FlinkKafkaProducer instead of KafkaSink in WriteIntoKafka.java. You will no longer need SimpleStringSchema. That is all you need to change (no additional dependencies are required).
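A rough sketch of what that change in WriteIntoKafka.java might look like. The exact FlinkKafkaProducer constructor and the serialization schema it expects depend on the 0.10-SNAPSHOT you build against, and the broker address and topic name below are placeholders, so treat the arguments as assumptions and check the connector's Javadoc:

```java
// Old (0.9.x-era example code):
// stream.addSink(new KafkaSink<String>("localhost:9092", "myTopic",
//         new SimpleStringSchema()));

// New (sketch only): KafkaSink is replaced by FlinkKafkaProducer.
// The schema argument must match whatever serialization interface
// the connector expects in your snapshot.
SerializationSchema<String> schema = /* schema matching your snapshot's API */ null;
stream.addSink(new FlinkKafkaProducer<String>("localhost:9092", "myTopic", schema));
```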



