I am running tests in Scala using Spark, creating a SparkContext as follows:
```scala
val conf = new SparkConf().setMaster("local").setAppName("test")
val sc = new SparkContext(conf)
```
The first execution worked without errors, but now I get this error message:

```
Only one SparkContext may be running in this JVM (see SPARK-2243).
```
It seems I need to check whether a SparkContext is already running and stop it before starting a new one (I do not want to allow multiple contexts). How can I do this?
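A minimal sketch of one possible approach (not from the original question): `SparkContext.getOrCreate` returns the already-running context instead of throwing the SPARK-2243 error, and `stop()` releases it so the next suite can create a fresh one. This assumes Spark 1.4+, where `getOrCreate` is available:

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setMaster("local").setAppName("test")

// Reuse the existing context if one is already running in this JVM,
// instead of failing with "Only one SparkContext may be running".
val sc = SparkContext.getOrCreate(conf)

// ... run the test logic ...

// Stop the context so a later test can create a new one.
sc.stop()
```

Note that `getOrCreate` ignores the passed `conf` when a context already exists, so all tests sharing the JVM see the first context's configuration.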
UPDATE:
I tried this, but I still get the same error (I run the tests from IntelliJ IDEA, and this code runs before they execute):
```scala
val conf = new SparkConf().setMaster("local").setAppName("test")
// also tried: .set("spark.driver.allowMultipleContexts", "true")
```
UPDATE 2:
```scala
class TestApp extends SparkFunSuite with TestSuiteBase {
  // use longer wait time to ensure job completion
  override def maxWaitTimeMillis: Int = 20000

  System.clearProperty("spark.driver.port")
  System.clearProperty("spark.hostPort")

  var ssc: StreamingContext = _
  val config: SparkConf = new SparkConf().setMaster("local").setAppName("test")
    .set("spark.driver.allowMultipleContexts", "true")
  val sc: SparkContext = new SparkContext(config)

  //...

  test("Test1") {
    sc.stop()
  }
}
```
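As a hedged alternative to the suite above (a sketch, assuming ScalaTest is on the classpath): creating the context in `beforeAll` and stopping it in `afterAll` keeps the context's lifetime tied to the suite, so no second context is ever alive at the same time. The class and test names here are illustrative, not from the original code:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.BeforeAndAfterAll
import org.scalatest.funsuite.AnyFunSuite

class SparkSuiteSketch extends AnyFunSuite with BeforeAndAfterAll {

  @transient private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // Created once, before any test in this suite runs.
    val conf = new SparkConf().setMaster("local").setAppName("test")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    // Always stop, so the next suite in the same JVM can start its own context.
    if (sc != null) sc.stop()
  }

  test("count a small RDD") {
    assert(sc.parallelize(1 to 3).count() == 3)
  }
}
```

Stopping the context inside a test body (as in `test("Test1") { sc.stop() }` above) makes it unavailable to any test that runs afterwards, which is usually not what you want.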