I originally had this configuration file:
my-app {
  environment: dev
  other: xxx
}
This is how I load the configuration in my Scala Spark code:
import java.io.File
import com.typesafe.config.ConfigFactory

val config = ConfigFactory.parseFile(new File("my-app.conf"))
  .withFallback(ConfigFactory.load())
  .resolve()
  .getConfig("my-app")
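For reference, values then come out of the resulting Config with the usual typed getters (the keys here are the ones from the config above):

// getString throws ConfigException.Missing if the key is absent.
val environment = config.getString("environment") // "dev" before any override
val other       = config.getString("other")       // "xxx"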
With this setup, overriding the value via a system property did not work for me, even though the Typesafe Config documentation and all the other answers here suggest it should. I started my Spark job as follows:
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --name my-app \
  --driver-java-options='-XX:MaxPermSize=256M -Dmy-app.environment=prod' \
  --files my-app.conf \
  my-app.jar
To make it work, I had to change my configuration file to:
my-app {
  environment: dev
  environment: ${?env.override}
  other: xxx
}
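The second `environment` line is a HOCON optional substitution: if `env.override` is defined anywhere in the resolved stack (e.g. as a system property), it wins; if not, the line is simply dropped and `dev` remains. Here is a minimal, self-contained sketch of that mechanism — the `OverrideDemo` object and the `System.setProperty` call are only there to simulate the `-D` flag, they are not part of my actual job:

import com.typesafe.config.ConfigFactory

object OverrideDemo {
  def main(args: Array[String]): Unit = {
    // Simulate passing -Denv.override=prod on the command line
    System.setProperty("env.override", "prod")
    ConfigFactory.invalidateCaches() // so load() re-reads system properties

    val hocon =
      """my-app {
        |  environment: dev
        |  environment: ${?env.override}
        |  other: xxx
        |}""".stripMargin

    val config = ConfigFactory.parseString(hocon)
      .withFallback(ConfigFactory.load()) // load() includes system properties
      .resolve()
      .getConfig("my-app")

    // The optional substitution picked up the system property:
    println(config.getString("environment")) // prints "prod"
  }
}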
and then run it like this:
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --name my-app \
  --driver-java-options='-XX:MaxPermSize=256M -Denv.override=prod' \
  --files my-app.conf \
  my-app.jar
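A quick way to confirm the override actually reached the driver is to log the resolved value at startup (this log line is just illustrative, not part of the job above):

// Prints "prod" when -Denv.override=prod is passed, "dev" otherwise.
println(s"my-app environment: ${config.getString("environment")}")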
Vijay Ratnagiri Dec 28 '17 at 17:03