Increase Spark Memory When Using local[*]

How do I increase Spark memory when using local[*]?

I tried to configure the memory as follows:

 val conf = new SparkConf()
   .set("spark.executor.memory", "1g")
   .set("spark.driver.memory", "4g")
   .setMaster("local[*]")
   .setAppName("MyApp")

But I still get:

 MemoryStore: MemoryStore started with capacity 524.1 MB 

Related to this:

 .setMaster("local[*]") 
+7
9 answers

I was able to solve this by running SBT with

 sbt -mem 4096 

However, the MemoryStore ends up at about half that size, and I am still looking into where that fraction comes from.

+6

Assuming you are using the Spark shell: setting spark.driver.memory in your application does not work, because the driver process has already started with the default memory by the time your code runs.
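You can verify this from inside the shell; a minimal sketch (the reported number varies slightly by JVM):

 // The heap is fixed when the JVM starts; setting spark.driver.memory on a
 // SparkConf afterwards cannot grow it.
 val maxHeapMb = Runtime.getRuntime.maxMemory / (1024 * 1024)
 println(s"Driver max heap: $maxHeapMb MB")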

You can start your spark shell using:

 ./bin/spark-shell --driver-memory 4g 

or you can set it in spark-defaults.conf:

 spark.driver.memory 4g 

If you run the application using spark-submit, you must specify the driver memory as an argument:

 ./bin/spark-submit --driver-memory 4g --class main.class yourApp.jar 
+6

In Spark 2.x you can use SparkSession, which is built like this:

 val spark = SparkSession.builder()
   .master("local[*]")
   .appName("MyApp")
   .config("spark.executor.memory", "1g")
   .config("spark.driver.memory", "4g")
   .getOrCreate()
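The caveat from the answer above still applies in local mode: the driver JVM is already running when getOrCreate() is called, so the spark.driver.memory value set this way may not take effect. A quick sanity check, as a sketch against the spark session built above:

 // Compare what was requested with what the JVM actually has.
 println(spark.conf.get("spark.driver.memory"))                     // requested value
 println(Runtime.getRuntime.maxMemory / (1024 * 1024) + " MB heap") // actual heap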
+2

The share of the heap used for the Spark cache is 0.6 by default, so if you need more than 524.1 MB you should increase the spark.executor.memory setting :)

Technically, you can also increase the share used for the Spark cache, but I believe this is not recommended, or at least requires additional configuration on your side. See https://spark.apache.org/docs/1.0.2/configuration.html for more details.
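As a rough check of the 524.1 MB figure, assuming the Spark 1.x defaults spark.storage.memoryFraction = 0.6 and spark.storage.safetyFraction = 0.9 (an assumption; both are configurable):

 // Sketch: the MemoryStore gets memoryFraction × safetyFraction of the usable heap.
 // With the default 1g driver, Runtime.maxMemory reports a bit under 1024 MB.
 val usableHeapMb = Runtime.getRuntime.maxMemory / (1024.0 * 1024.0) // ~970 MB for -Xmx1g
 val storeMb = usableHeapMb * 0.6 * 0.9                              // ≈ 524 MB
 println(f"Expected MemoryStore capacity: $storeMb%.1f MB")

This also explains the "half the size" observation above: 0.6 × 0.9 ≈ 0.54 of the heap.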

+1

I tried --driver-memory 4g and --executor-memory 4g; neither increased the working memory. However, I noticed that bin/spark-submit picks up _JAVA_OPTIONS, so setting it to -Xmx4g worked. I am using JDK 7.
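For example (assuming a POSIX shell; the JVM itself picks up _JAVA_OPTIONS, so this works for spark-shell as well):

 _JAVA_OPTIONS=-Xmx4g ./bin/spark-shell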

+1

You cannot change the driver memory after the application has launched.

+1

Version

spark 2.3.1

Source

org.apache.spark.launcher.SparkSubmitCommandBuilder:267

 String memory = firstNonEmpty(tsMemory,
     config.get(SparkLauncher.DRIVER_MEMORY),
     System.getenv("SPARK_DRIVER_MEMORY"),
     System.getenv("SPARK_MEM"),
     DEFAULT_MEM);
 cmd.add("-Xmx" + memory);

The driver memory is resolved in this order; the first non-empty value wins:

1. SparkLauncher.DRIVER_MEMORY

 --driver-memory 2g

2. SPARK_DRIVER_MEMORY

 vim conf/spark-env.sh
 SPARK_DRIVER_MEMORY="2g"

3. SPARK_MEM

 vim conf/spark-env.sh
 SPARK_MEM="2g"

4. DEFAULT_MEM

 1g
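If you launch Spark programmatically, option 1 can be set through the launcher API before the driver JVM is forked; a minimal sketch, reusing the placeholder jar and class names from the spark-submit example above:

 import org.apache.spark.launcher.SparkLauncher

 // DRIVER_MEMORY is applied when the child JVM command is built, so it takes effect.
 val app = new SparkLauncher()
   .setSparkHome("/usr/lib/spark")  // adjust to your install, or set SPARK_HOME
   .setAppResource("yourApp.jar")   // placeholder jar
   .setMainClass("main.class")      // placeholder class
   .setMaster("local[*]")
   .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
   .launch()
 app.waitFor()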

0

To assign memory to Spark:

in the command shell:

 /usr/lib/spark/bin/spark-shell --driver-memory=16G --num-executors=100 --executor-cores=8 --executor-memory=16G

-1
 /usr/lib/spark/bin/spark-shell --driver-memory=16G --num-executors=100 --executor-cores=8 --executor-memory=16G --conf spark.driver.maxResultSize=2G
-2

Source: https://habr.com/ru/post/1231925/

