Pass the file as a Spark command line argument

I write my Spark job in Scala and I need to pass some arguments via the command line, for example the application name, the master, and some other variables:

./bin/spark-submit --name "My app" --master local[4] --conf spark.eventLog.enabled=false --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" myApp.jar

I would like to move the application name, the master, and all the other arguments into a single file, for example:

$SPARK_HOME/bin/spark-submit --properties-file  property.conf

Is it possible? How? Can someone explain a simple example?

1 answer

You can use the option --properties-file as follows:

$SPARK_HOME/bin/spark-submit --properties-file property.conf --class your.Class your.jar

The spark-submit help page tells you more:

$SPARK_HOME/bin/spark-submit --help

  --properties-file FILE      Path to a file from which to load extra properties. If not
                              specified, this will look for conf/spark-defaults.conf.
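The properties file uses the same key–value format as spark-defaults.conf, not JSON. A sketch of what a property.conf mirroring the flags from the command in the question could look like (the keys spark.app.name, spark.master, spark.eventLog.enabled, and spark.executor.extraJavaOptions are standard Spark configuration properties; the file name property.conf is just the example name from the question):

```properties
# Equivalent of --name "My app"
spark.app.name                   My app
# Equivalent of --master local[4]
spark.master                     local[4]
# Equivalent of --conf spark.eventLog.enabled=false
spark.eventLog.enabled           false
# Equivalent of --conf "spark.executor.extraJavaOptions=..."
spark.executor.extraJavaOptions  -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
```

With such a file in place, the long invocation from the question reduces to: $SPARK_HOME/bin/spark-submit --properties-file property.conf myApp.jar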



Source: https://habr.com/ru/post/1680864/

