How to structure a ScalaTest FunSuite to avoid Spark session boilerplate and import template code

I am trying to reorganize my ScalaTest FunSuite tests to avoid the boilerplate code for initializing and destroying a Spark session.

The problem is that I need to import implicits, but with the before / after approach you can only use mutable fields (vars), while an import requires a stable identifier (a val).
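The underlying Scala rule is that an import prefix must be a stable identifier, which a `var` is not. A minimal, Spark-free sketch of the same problem (all names here, such as `Container` and `greeting`, are hypothetical and chosen only for illustration):

```scala
// A class holding an object with an implicit, mimicking sqlContext.implicits
class Container {
  object inner {
    implicit val greeting: String = "hi"
  }
}

object StableIdentifierDemo extends App {
  var c: Container = new Container
  // import c.inner._          // does NOT compile: "stable identifier required, but c.inner found"
  val stable = c               // copy the var into a val first
  import stable.inner._        // compiles: stable is a stable identifier
  println(implicitly[String])  // prints "hi"
}
```

This is exactly why `import sqlContext.implicits._` fails when `sqlContext` is declared as a `var`.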

The idea is to create a fresh Spark session for each test run.

I am trying to do something like this:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SparkSession}
import org.scalatest.{BeforeAndAfter, FunSuite}

object SimpleWithBeforeTest extends FunSuite with BeforeAndAfter {

  var spark: SparkSession = _
  var sc: SparkContext = _
  implicit var sqlContext: SQLContext = _

  before {
    spark = SparkSession.builder
      .master("local")
      .appName("Spark session for testing")
      .getOrCreate()
    sc = spark.sparkContext
    sqlContext = spark.sqlContext
  }

  after {
    spark.sparkContext.stop()
  }

  test("Import implicits inside the test 1") {
    import sqlContext.implicits._
    // Here other stuff
  }

  test("Import implicits inside the test 2") {
    import sqlContext.implicits._
    // Here other stuff
  }
}
```

But on the `import sqlContext.implicits._` line, I get an error:

Cannot resolve symbol sqlContext

How can I solve this problem, or how should I structure the test class?

1 answer

Define a new immutable value (a val), assign the var to it, and import the implicits from that val.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfter, FlatSpec}

class MyCassTest extends FlatSpec with BeforeAndAfter {

  var spark: SparkSession = _

  before {
    val sparkConf: SparkConf = new SparkConf()
    spark = SparkSession
      .builder()
      .config(sparkConf)
      .master("local[*]")
      .getOrCreate()
  }

  after {
    spark.stop()
  }

  "myFunction()" should "return 1.0 blab bla bla" in {
    val sc = spark        // copy the var into a val: a stable identifier
    import sc.implicits._ // now the import compiles
    // assert ...
  }
}
```
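An alternative sketch, not from the answer above: if reusing one session across all tests in a suite is acceptable (unlike the question's per-test requirement), the session can live in a `lazy val` inside a shared trait, which is a stable identifier, so the implicits can be imported once at the top of the suite. The trait name `SparkSessionTestWrapper` is hypothetical; this assumes Spark 2.x and ScalaTest on the classpath:

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.FunSuite

// Hypothetical helper trait: lazy val is a stable identifier,
// so implicits can be imported from it directly.
trait SparkSessionTestWrapper {
  lazy val spark: SparkSession =
    SparkSession
      .builder()
      .master("local[*]")
      .appName("Spark session for testing")
      .getOrCreate()
}

class MySuite extends FunSuite with SparkSessionTestWrapper {
  import spark.implicits._ // compiles: spark is a val, hence stable

  test("toDS works") {
    val ds = Seq(1, 2, 3).toDS()
    assert(ds.count() == 3)
  }
}
```

The trade-off is that the session is created once per JVM rather than rebuilt before every test, so tests must not depend on session-level mutable state.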

Source: https://habr.com/ru/post/1013992/

