Failed to import Spark Implicits into ScalaTest

I am writing test cases for Spark using ScalaTest.

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

class ClassNameSpec extends FlatSpec with BeforeAndAfterAll {
  var spark: SparkSession = _
  var className: ClassName = _

  override def beforeAll(): Unit = {
    spark = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
    className = new ClassName(spark)
  }

  it should "return data" in {
    import spark.implicits._
    val result = className.getData(input)

    assert(result.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}

When I try to compile the test suite, I get the following error:

stable identifier required, but ClassNameSpec.this.spark.implicits found.
[error]     import spark.implicits._
[error]                  ^
[error] one error found
[error] (test:compileIncremental) Compilation failed

I can't understand why I can't import spark.implicits._ in the test suite.

Any help is appreciated!

1 answer

To import, you need a "stable identifier", as the error message indicates. This means you need a val, not a var. Since you defined spark as a var, Scala cannot import its implicits.

To solve this problem, you can simply do something like:

val spark2 = spark
import spark2.implicits._
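The rule the compiler enforces here can be reproduced without Spark at all. Below is a minimal, self-contained sketch (the names Container, inner, and stableRef are made up for illustration): an import path may only go through vals, not vars.

```scala
object StableDemo {
  class Container {
    object inner {
      val x = 1
    }
  }

  val stableRef = new Container   // val: a stable identifier
  var unstableRef = new Container // var: NOT a stable identifier

  import stableRef.inner._        // compiles: every step of the path is stable
  // import unstableRef.inner._   // error: stable identifier required

  def value: Int = x              // x is brought in by the import above
}
```

Copying the var into a local val (spark2 above) works for exactly this reason: the new val gives the compiler a stable path to import through.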

Alternatively, you can change the var to a val, for example:

lazy val spark: SparkSession = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
