Scala / Spark Version Compatibility

I am creating my first Spark application.

http://spark.apache.org/downloads.html tells me that Spark 2.x is built against Scala 2.11.

On the Scala website https://www.scala-lang.org/download/all.html I see versions from 2.11.0 to 2.11.11.

So, here is my question: what exactly does 2.11 mean on the Spark website? Does it mean any version of Scala in the range 2.11.0 - 2.11.11?

Another question: can I create my Spark applications using the latest Scala version, 2.12.2? I assume Scala is backwards compatible, so Spark libraries built with, say, Scala 2.11.x could be used / called from Scala 2.12.1 applications. Am I right?

1 answer

Scala is not backwards compatible in the way you assume. You should use Scala 2.11 with Spark unless you rebuild Spark yourself against Scala 2.12 (that is an option if you want to use the latest Scala version, but it requires more work to get everything running).

When considering compatibility, you need to consider both source compatibility and binary compatibility. Scala does tend to be source backwards compatible, so you can rebuild your jar under a newer version, but it is not binary backwards compatible, so you cannot use a jar built with an old version together with code compiled against a new version.
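To make "rebuild your jar under a newer version" concrete, here is a sketch of sbt cross-building (assuming sbt; the version list is illustrative, and for a Spark app the 2.12 build would only resolve its dependencies if Spark itself had been rebuilt for 2.12):

```scala
// build.sbt fragment -- cross-building: the same sources are compiled once
// per listed Scala version, producing one binary-specific jar each.
crossScalaVersions := Seq("2.11.11", "2.12.2")
```

Running `sbt +package` then emits separate `_2.11` and `_2.12` jars; neither jar can be used from code running on the other Scala major version.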

This applies at the level of major versions: Scala 2.10, 2.11, 2.12 etc. are all major versions and are not binary compatible with each other (even where they are source compatible). Within a major version compatibility is maintained, so Scala 2.11 on the Spark site means any version from 2.11.0 to 2.11.11 (and any future 2.11 release will also be compatible).
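Concretely, a minimal build.sbt under these constraints (a sketch: the Spark version 2.1.1 is an assumption, and the explicit `_2.11` suffix in the artifact name is explained below):

```scala
// build.sbt -- minimal sketch for a Spark 2.x application
name := "my-first-spark-app"

// Any 2.11.x patch release works; 2.12.x would not, because the published
// Spark 2.x artifacts are compiled against Scala 2.11.
scalaVersion := "2.11.11"

// "provided": the Spark runtime supplies this jar on the cluster.
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.1" % "provided"
```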

It is for this reason that most Scala libraries publish separate releases for each major Scala version. You have to make sure that every library you use provides a jar for the Scala version you are using, and that you use that jar rather than one built for a different version. With SBT, the %% operator selects the correct version for you, but with Maven you need to use the correct artifact name yourself. Artifact names are typically suffixed with _2.10, _2.11 or _2.12, indicating the Scala version the jar was built for.
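For example (versions are illustrative), these two sbt dependency declarations resolve the same artifact, while Maven always needs the suffix written out:

```scala
// %% appends the Scala binary version of the build (here "_2.11") to the
// artifact name, so these two lines are equivalent under scalaVersion 2.11.x:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.1"
libraryDependencies += "org.apache.spark" %  "spark-sql_2.11" % "2.1.1"
```

In a Maven pom.xml there is no equivalent of `%%`, so the suffix goes into the artifactId by hand, e.g. `<artifactId>spark-sql_2.11</artifactId>`.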
