How to stop running SparkContext before opening a new one

I am running tests in Scala using Spark, creating a SparkContext as follows:

    val conf = new SparkConf().setMaster("local").setAppName("test")
    val sc = new SparkContext(conf)

The first execution completed without errors, but now I get this error message:

 Only one SparkContext may be running in this JVM (see SPARK-2243). 

It seems I need to check whether a SparkContext is already running and stop it before starting a new one (I do not want to allow multiple contexts). How can I do this?

UPDATE:

I tried this, but I get the same error (I run the tests from IntelliJ IDEA, and this code runs before they execute):

    val conf = new SparkConf().setMaster("local").setAppName("test")
    // also tried: .set("spark.driver.allowMultipleContexts", "true")

UPDATE 2:

    class TestApp extends SparkFunSuite with TestSuiteBase {

      // use longer wait time to ensure job completion
      override def maxWaitTimeMillis: Int = 20000

      System.clearProperty("spark.driver.port")
      System.clearProperty("spark.hostPort")

      var ssc: StreamingContext = _

      val config: SparkConf = new SparkConf()
        .setMaster("local")
        .setAppName("test")
        .set("spark.driver.allowMultipleContexts", "true")
      val sc: SparkContext = new SparkContext(config)

      // ...

      test("Test1") {
        sc.stop()
      }
    }
1 answer

To stop an existing context, call the stop method on that SparkContext instance.

    import org.apache.spark.{SparkContext, SparkConf}

    val conf: SparkConf = ???
    val sc: SparkContext = new SparkContext(conf)
    // ...
    sc.stop()
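If a test can fail partway through, the stop call may never run and the context will leak into the next test. A minimal sketch of guarding against that with try/finally (the job inside the try block is just illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setMaster("local").setAppName("test")
    val sc = new SparkContext(conf)
    try {
      // run whatever needs the context
      sc.parallelize(1 to 10).count()
    } finally {
      // always release the context, even if the body throws,
      // so the next test can create a fresh one in the same JVM
      sc.stop()
    }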

To reuse an existing context or create a new one, use the SparkContext.getOrCreate method.

    val sc1 = SparkContext.getOrCreate(conf)
    // ...
    val sc2 = SparkContext.getOrCreate(conf)
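When a context is already active, getOrCreate returns that same instance instead of constructing a new one, which is exactly what avoids the "Only one SparkContext" error. A quick way to see this (a sketch, assuming a local master):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setMaster("local").setAppName("test")
    val sc1 = SparkContext.getOrCreate(conf)
    val sc2 = SparkContext.getOrCreate(conf)

    // both references point to the same active context,
    // so no second context is ever created in this JVM
    assert(sc1 eq sc2)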

When used in test suites, the two methods serve different goals: stop gives each test a fresh, isolated context, while getOrCreate lets tests share a single active one.
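For example, one common pattern (a sketch, assuming plain ScalaTest 3.x rather than Spark's internal SparkFunSuite) is to acquire the context once in beforeAll and stop it in afterAll, so each suite owns exactly one context:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.BeforeAndAfterAll
    import org.scalatest.funsuite.AnyFunSuite

    class TestApp extends AnyFunSuite with BeforeAndAfterAll {

      private var sc: SparkContext = _

      override def beforeAll(): Unit = {
        val conf = new SparkConf().setMaster("local").setAppName("test")
        // reuses an active context if one is still around, otherwise creates one
        sc = SparkContext.getOrCreate(conf)
      }

      override def afterAll(): Unit = {
        // stop the context so the next suite can start its own in this JVM
        if (sc != null) sc.stop()
      }

      test("Test1") {
        assert(sc.parallelize(1 to 3).count() == 3)
      }
    }

Note that spark.driver.allowMultipleContexts (tried in the question's update) only disables the single-context safety check; running multiple contexts in one JVM is still unsupported, so stopping the previous context is the better fix.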

