Creating a Spark app using the wrong version of Scala

I am following the instructions here: https://spark.apache.org/docs/latest/quick-start.html to create a simple application that runs against a local standalone Spark assembly.

On my system I have Scala 2.9.2 and sbt 0.13.7. When I put the following in my simple.sbt:

scalaVersion := "2.9.2"

and then run sbt package, I get this error: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.9.2;1.3.1: not found

However, when I put this in simple.sbt instead:

scalaVersion := "2.10.4"

sbt package succeeds, and the application runs fine on Spark.

How can this work, given that I do not have Scala 2.10.4 on my system?
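For completeness, here is roughly what my full simple.sbt looks like (modelled on the quick-start guide; the name and version fields are just illustrative):

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"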

+4
2 answers

Scala applications compile to Java bytecode, so all you need to run them is a JVM. The Scala compiler, scalac, simply produces ordinary Java class files; once the application is built, the Scala installation on your machine plays no part.

Likewise, when sbt builds your project it does not use your system Scala (2.9.2) at all: it downloads the Scala version you requested (2.10.x) itself and puts the matching libraries on the classpath.

To put it another way: ask yourself how plain java manages to run Scala code in the first place.
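A quick way to see this for yourself (a minimal sketch; the object name and the use of sbt run are my own additions, not part of the quick-start app) is to print the Scala version the application is actually compiled and run against:

// CheckScalaVersion.scala -- hypothetical helper, not part of the original project.
// Launched with `sbt run`, it prints the Scala version that sbt downloaded and
// put on the classpath (2.10.x here), not the 2.9.2 installed on the system.
object CheckScalaVersion {
  def main(args: Array[String]): Unit = {
    // scala.util.Properties reports the version of the scala-library in use
    println(scala.util.Properties.versionString)
  }
}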

+10

Adding to @noahlz's answer: yes, sbt fetches Scala for you, so whatever Scala is installed on the system does not matter.

The specific problem you are hitting is that there is no spark-core version 1.3.1 built for Scala 2.9.2.

If you search Maven Central (for spark-core), you will see that spark-core is only published for Scala 2.10 and 2.11.

So use:

scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

If for some reason that does not work for you, fall back to Scala 2.10.5:

scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
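
As a side note from me (standard sbt behaviour, not specific to Spark): the %% operator appends the Scala binary version to the artifact name, which is exactly why scalaVersion and the spark-core dependency have to agree. With scalaVersion := "2.11.6", these two lines resolve to the same artifact:

// %% appends the binary Scala version ("_2.11") to the artifact name:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

// which is equivalent to writing the suffix out by hand:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.3.1"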
+7

Source: https://habr.com/ru/post/1584933/

