Why does the Scala compiler fail with "object SparkConf in package spark cannot be accessed in package org.apache.spark"?

I cannot access SparkConf in the package, even though I have already imported org.apache.spark.SparkConf. My code is:

    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.SparkConf
    import org.apache.spark.rdd.RDD
    import org.apache.spark._
    import org.apache.spark.streaming._
    import org.apache.spark.streaming.StreamingContext._

    object SparkStreaming {
      def main(arg: Array[String]) = {
        val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext(conf, Seconds(1))
        val lines = ssc.socketTextStream("localhost", 9999)
        val words = lines.flatMap(_.split(" "))
        val pairs_new = words.map(w => (w, 1))
        val wordsCount = pairs_new.reduceByKey(_ + _)
        wordsCount.print()
        ssc.start()             // Start the computation
        ssc.awaitTermination()  // Wait for the computation to terminate
      }
    }

sbt dependencies:

 name := "Spark Streaming" version := "1.0" scalaVersion := "2.10.4" libraryDependencies ++= Seq( "org.apache.spark" %% "spark-core" % "1.5.2" % "provided", "org.apache.spark" %% "spark-mllib" % "1.5.2", "org.apache.spark" %% "spark-streaming" % "1.5.2" ) 

But the compiler reports that SparkConf cannot be accessed:

    [error] /home/cliu/Documents/github/Spark-Streaming/src/main/scala/Spark-Streaming.scala:31: object SparkConf in package spark cannot be accessed in package org.apache.spark
    [error]     val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
    [error]                    ^
2 answers

It compiles if you add parentheses after SparkConf:

    val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")

The point is that SparkConf is a class, and a name after `new` followed by a dot is parsed as a path: `new SparkConf.setMaster(...)` means "instantiate a type named `setMaster` found inside a value named `SparkConf`". The compiler therefore looks for a `SparkConf` object, finds the companion object in org.apache.spark (which is private to the spark package, as the error message says), and reports that it cannot be accessed. Adding parentheses after the class name makes it unambiguous that you are calling the class constructor, not resolving a path. Here is an example from the Scala shell illustrating the difference:

    scala> class C1 { var age = 0; def setAge(a: Int) = { age = a } }
    defined class C1

    scala> new C1
    res18: C1 = $iwC$$iwC$C1@2d33c200

    scala> new C1()
    res19: C1 = $iwC$$iwC$C1@30822879

    scala> new C1.setAge(30)   // this doesn't work
    <console>:23: error: not found: value C1
                  new C1.setAge(30)
                      ^

    scala> new C1().setAge(30) // this works

    scala>
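To make the parsing rule concrete, here is a minimal sketch. The names `Outer`, `Inner`, and `Plain` are illustration-only, not Spark classes: `new A.B` treats `A` as a value (an object) and `B` as a type inside it, while `new A().b` calls the constructor first and then a method:

```scala
// Illustration-only names: Outer, Inner, Plain are not Spark classes.
object Outer {
  class Inner {
    def label: String = "made via path Outer.Inner"
  }
}

class Plain {
  def label: String = "made via constructor call"
}

object PathDemo {
  def main(args: Array[String]): Unit = {
    // `new Outer.Inner` parses as: instantiate type Inner found via the value Outer.
    val a = new Outer.Inner
    // `new Plain().label` parses as: call the constructor, then the method.
    val b = new Plain().label
    println(a.label)
    println(b)
    // `new Plain.label` would NOT compile: the compiler would look for a
    // value named Plain (e.g. a companion object) containing a type `label`,
    // which is exactly the lookup that fails for `new SparkConf.setMaster(...)`.
  }
}
```

This is why the error message mentions the SparkConf *object* rather than the class: the failed lookup is for a value, not a constructor.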

In this case you cannot omit the parentheses, so it has to be:

 val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount") 
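Once the parentheses are there, the chained setters work because SparkConf's setters return the instance itself. The `Conf` class below is a hypothetical stand-in for that fluent-setter style, not the real SparkConf:

```scala
// Hypothetical stand-in for SparkConf's fluent-setter style (not the real class).
class Conf {
  private var master: String = ""
  private var appName: String = ""

  // Each setter mutates the instance and returns `this`, enabling chaining.
  def setMaster(m: String): Conf = { master = m; this }
  def setAppName(n: String): Conf = { appName = n; this }

  override def toString: String = s"Conf(master=$master, appName=$appName)"
}

object ConfDemo {
  def main(args: Array[String]): Unit = {
    // Parentheses after Conf invoke the constructor; the setters then chain.
    val conf = new Conf().setMaster("local[2]").setAppName("NetworkWordCount")
    println(conf)
  }
}
```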

Source: https://habr.com/ru/post/1260874/
