I have a question about creating multiple Spark sessions in one JVM. I have read that creating multiple SparkContexts is not recommended in earlier versions of Spark. Is this also true of SparkSession in Spark 2.0?
The user interface will make a call to a web service or servlet; the service creates a SparkSession, performs some operation, and returns the result. This means a SparkSession is created for every client request. Is this practice recommended?
Let's say I have a method like:
    public void runSpark() throws Exception {
        SparkSession spark = SparkSession.builder()
            .master("spark://<masterURL>")
            .appName("JavaWordCount")
            .getOrCreate();
        // etc....
    }
If I put this method in a web service, will there be any problems with the JVM? That way I could call the method several times, once per request, but I am not sure whether this is good practice.
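For context, the alternative I am considering is keeping one shared session per JVM and reusing it across requests, which is roughly the behavior `getOrCreate()` already provides. Below is a minimal plain-Java sketch of that lazy, thread-safe holder pattern; a plain `Object` stands in for the real `SparkSession` so the snippet compiles without Spark on the classpath (the class and method names here are illustrative, not from any library):

```java
// Sketch: one shared "session" per JVM, created lazily on first use and
// reused by every subsequent request. A plain Object stands in for the
// real SparkSession so this is self-contained.
public final class SharedSession {
    private static volatile Object session; // stand-in for SparkSession

    private SharedSession() {}

    public static Object getOrCreate() {
        Object s = session;
        if (s == null) {
            synchronized (SharedSession.class) {
                s = session;
                if (s == null) {
                    // With Spark on the classpath this would be:
                    // SparkSession.builder()
                    //     .master("spark://<masterURL>")
                    //     .appName("JavaWordCount")
                    //     .getOrCreate();
                    s = new Object();
                    session = s;
                }
            }
        }
        return s;
    }

    public static void main(String[] args) {
        // Two "requests" see the same underlying session instance.
        System.out.println(SharedSession.getOrCreate() == SharedSession.getOrCreate());
    }
}
```

The double-checked locking with a `volatile` field ensures each request thread sees the same fully constructed instance instead of racing to build its own.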