"unresolved dependency" for Spark 2.1.0 on SBT

version: = "1.0"
scalaVersion: = "2.11.8"
ivyScala: = ivyScala.value map {_.copy (overrideScalaVersion = true)}
libraryDependencies + = "org.apache.spark" %% "spark-core"% "2.1.0"

I'm trying to set up Spark in my development environment. When I try to build a jar using sbt, the build fails and shows the [error] output below:

[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.11;2.1.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] org.apache.spark:spark-core_2.11:2.1.0 (D:\MyDocument\IDEA\Scala\model\build.sbt#L9-10)
[warn] +- org.apache.spark:spark-catalyst_2.11:2.1.0
[warn] +- org.apache.spark:spark-sql_2.11:2.1.0 (D:\MyDocument\IDEA\Scala\model\build.sbt#L15-16)
[warn] +- org.apache.spark:spark-hive_2.11:2.1.0 (D:\MyDocument\IDEA\Scala\model\build.sbt#L11-12)
[warn] +- default:producttagmodel_2.11:1.0
[trace] Stack trace suppressed: run 'last *:update' for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11;2.1.0: not found

My IntelliJ version is 2016.3.5, the sbt version is 0.13.13, and Scala is 2.11.8. I found that sbt downloaded spark-core.jar into my .ivy2/cache directory, but IntelliJ keeps showing "unknown artifact. Not resolved or indexed". I have refreshed my project index many times, but it did not help. I even created a new project with the same build.sbt in case the IntelliJ cache was corrupted, but that did not work either. I am completely confused by this problem.
Here is my build.sbt setup:

[screenshot of the build.sbt file in IntelliJ]

1 answer

Change the dependency to:

 libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0" 

In addition, for a Spark application this dependency is usually marked as "provided": it should not be included in the jar, because when you submit a job the corresponding Spark libraries are already available on the driver and the executors.
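As a minimal sketch, assuming the jar is assembled with a plugin such as sbt-assembly, the "provided" scope would look like this in build.sbt:

 // Spark is supplied by the cluster at runtime, so keep it off the assembled jar
 libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0" % "provided"

With this scope the dependency is still on the compile classpath, but the packaging step leaves it out of the final jar.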


Source: https://habr.com/ru/post/1266199/
