Update configuration in Spark 2.3.1
To change the default Spark settings, you can do the following:
Import the required classes
from pyspark.conf import SparkConf
from pyspark.sql import SparkSession
Get the current default settings
spark.sparkContext._conf.getAll()
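This call returns a list of (key, value) tuples. If you want a more readable dump of the current configuration, a quick sketch is to print them one per line:
for key, value in spark.sparkContext._conf.getAll():
    print(key, '=', value)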
Update default configurations
conf = spark.sparkContext._conf.setAll([
    ('spark.executor.memory', '4g'),
    ('spark.app.name', 'Spark Updated Conf'),
    ('spark.executor.cores', '4'),
    ('spark.cores.max', '4'),
    ('spark.driver.memory', '4g')
])
Stop the current Spark session
spark.sparkContext.stop()
Create a new Spark session with the updated configuration
spark = SparkSession.builder.config(conf=conf).getOrCreate()
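To confirm the new settings took effect, a quick check against the session created above (assuming the 'spark.executor.memory' value set earlier) is:
print(spark.sparkContext.getConf().get('spark.executor.memory'))  # should print 4g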