Why does submitting a Spark application to Mesos fail with the error "Could not parse Master URL: 'mesos://localhost:5050'"?

I get the following exception when I try to submit a Spark application to a Mesos cluster:

17/01/31 17:04:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/31 17:04:22 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'mesos://localhost:5050'
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2550)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)

1 answer

You probably built Spark without Mesos support, i.e. without the -Pmesos Maven profile. For Spark 2.1.0, you must build it with ./build/mvn -Pmesos -DskipTests clean package so that the Mesos scheduler backend is included; otherwise SparkContext cannot recognize mesos:// master URLs.
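For reference, a minimal sketch of the build-and-submit flow. The paths and the SparkPi example jar name are assumptions based on a standard Spark 2.1.0 source checkout; adjust the master host and jar for your cluster.

```shell
# Build Spark 2.1.0 from source with Mesos support enabled:
# the -Pmesos profile compiles in the Mesos scheduler backend,
# which is what teaches SparkContext to parse mesos:// URLs.
./build/mvn -Pmesos -DskipTests clean package

# Then submit against the Mesos master (host/port and example
# jar are placeholders for illustration):
./bin/spark-submit \
  --master mesos://localhost:5050 \
  --class org.apache.spark.examples.SparkPi \
  examples/jars/spark-examples_2.11-2.1.0.jar 100
```

If the build was done without -Pmesos, the same spark-submit command fails with the "Could not parse Master URL" exception shown above, because no registered cluster manager matches the mesos:// scheme.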


Source: https://habr.com/ru/post/1668454/
