Get JavaSparkContext from SparkSession

I use SparkSession to run my Spark application because I use many spark-sql functions. I would like to use a JavaSparkContext to create an RDD from a list, but through the session I can only get the plain SparkContext. Is there a way to convert the context in this direction?
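For example, the goal is roughly this (a hypothetical sketch; the data and the jsc variable stand in for the real application code):

    import java.util.Arrays;
    import java.util.List;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    // JavaSparkContext.parallelize accepts a java.util.List directly,
    // while the plain SparkContext exposes only the Scala Seq-based API.
    List<Integer> data = Arrays.asList(1, 2, 3, 4, 5); // placeholder data
    JavaRDD<Integer> rdd = jsc.parallelize(data);      // jsc: a JavaSparkContext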

2 answers

Once you have the SparkContext, you can use:

    SparkContext sc = ...;
    JavaSparkContext jsc = JavaSparkContext.fromSparkContext(sc);

This returns a new instance of JavaSparkContext, but that is not a problem as long as you maintain only one active instance of the SparkContext.
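A minimal end-to-end sketch of this approach (the app name, local master, and sample data are assumptions, not part of the original answer):

    import java.util.Arrays;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SparkSession;

    SparkSession spark = SparkSession.builder()
            .appName("java-context-example") // hypothetical app name
            .master("local[*]")              // assumption: local mode
            .getOrCreate();

    // Wrap the session's Scala SparkContext in the Java-friendly API.
    JavaSparkContext jsc = JavaSparkContext.fromSparkContext(spark.sparkContext());

    // The original goal: an RDD built from an in-memory list.
    JavaRDD<Integer> rdd = jsc.parallelize(Arrays.asList(1, 2, 3));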


Yes, you can do this with the SparkSession as follows:

    val spark = SparkSession.builder()
      .config(sparkConf)
      .getOrCreate()

    val jsc = new JavaSparkContext(spark.sparkContext)

or, in Java:

    SparkSession spark = SparkSession.builder().config(sparkConf).getOrCreate();
    JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());
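Either way, the wrapper delegates to the same underlying SparkContext, so the RDD-from-list goal from the question works directly (the sample data is a placeholder):

    import java.util.Arrays;
    import org.apache.spark.api.java.JavaRDD;

    JavaRDD<String> words = jsc.parallelize(Arrays.asList("a", "b", "c"));
    long count = words.count(); // 3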

Source: https://habr.com/ru/post/1264998/

