SparkContext initialization error: master URL must be set in your configuration

I used this code

The error I get:

    Using Spark default log4j profile: org/apache/spark/log4j-defaults.properties
    17/02/03 20:39:24 INFO SparkContext: Running Spark version 2.1.0
    17/02/03 20:39:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/02/03 20:39:25 WARN SparkConf: Detected deprecated memory fraction settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and storage memory management are unified. All memory fractions used in the old model are now deprecated and no longer read. If you wish to use the old memory management, you may explicitly enable `spark.memory.useLegacyMode` (not recommended).
    17/02/03 20:39:25 ERROR SparkContext: Error initializing SparkContext.
    org.apache.spark.SparkException: A master URL must be set in your configuration
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
        at PCA$.main(PCA.scala:26)
        at PCA.main(PCA.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
    17/02/03 20:39:25 INFO SparkContext: Successfully stopped SparkContext
    Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
        at PCA$.main(PCA.scala:26)
        at PCA.main(PCA.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
    Process finished with exit code 1
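(The code itself was lost in the page conversion. Judging by the stack trace — PCA.scala:26 and the memory fraction warning — it was presumably a plain SparkConf with no master set, roughly like this sketch; the body is a reconstruction, not the original:)

    import org.apache.spark.{SparkConf, SparkContext}

    object PCA {
      def main(args: Array[String]): Unit = {
        // No setMaster(...) call anywhere -- this is what triggers the exception
        val conf = new SparkConf()
          .setAppName("PCA")
          .set("spark.storage.memoryFraction", "1") // causes the deprecation WARN above
        val sc = new SparkContext(conf) // throws: A master URL must be set ...
      }
    }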
3 answers

If you use a Spark standalone cluster, then set the master in code:

 val conf = new SparkConf().setMaster("spark://master") //missing 

or pass the master when submitting the job:

 spark-submit --master spark://master 

If you run Spark locally, then

 val conf = new SparkConf().setMaster("local[2]") //missing 

or pass it when submitting the job:

 spark-submit --master local 

If you run Spark on YARN, then

 spark-submit --master yarn 
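Whichever mode you run in, a portable pattern is to leave the master out of the code and let spark-submit supply it, falling back to local mode only for IDE runs. A minimal sketch (the object and app names are illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    object Main {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("MyJob")
        // spark-submit --master ... sets spark.master; only fall back when it is absent
        if (!conf.contains("spark.master")) conf.setMaster("local[*]")
        val sc = new SparkContext(conf)
        println(sc.parallelize(1 to 10).sum()) // trivial action to check the context works
        sc.stop()
      }
    }

This way the same jar runs unchanged from the IDE and on a cluster.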

The error message is pretty clear: you must specify the address of the Spark master node, either through SparkContext or through spark-submit:

    val conf = new SparkConf()
      .setAppName("ClusterScore")
      .setMaster("spark://172.1.1.1:7077") // <--- This is what was missing
      .set("spark.storage.memoryFraction", "1")

    val sc = new SparkContext(conf)
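Two side notes on this snippet: per the WARN in the question's log, spark.storage.memoryFraction has been deprecated since Spark 1.6 and is no longer read, so that line can simply be dropped; and since the log shows Spark 2.1.0, the same setup can also be written with the SparkSession builder (same placeholder master address as above):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("ClusterScore")
      .master("spark://172.1.1.1:7077") // same placeholder address as above
      .getOrCreate()

    val sc = spark.sparkContext // underlying SparkContext, if you still need the RDD API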
    val conf = new SparkConf()
      .setAppName("Your Application Name")
      .setMaster("local")

    val sc = new SparkContext(conf)

It will work.
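Since the stack trace shows the job was launched from IntelliJ (com.intellij.rt.execution.application.AppMain), there is also a no-code-change option: SparkConf reads spark.* JVM system properties by default, so you can set the master under Run > Edit Configurations > VM options:

    -Dspark.master=local[*]

This keeps a hardcoded setMaster("local") out of code that you may later submit to a real cluster.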


Source: https://habr.com/ru/post/1263801/

