I get the following exception when I try to submit a Spark application to a Mesos cluster:
17/01/31 17:04:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/31 17:04:22 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'mesos://localhost:5050'
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2550)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
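For context, this is the kind of invocation that produces the message above; the application class, jar name and Mesos host are placeholders for illustration, not details from the original question:

./bin/spark-submit --master mesos://localhost:5050 --class com.example.MyApp my-app.jar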
You probably built Spark with the wrong command, i.e. without the -Pmesos profile. For Spark 2.1.0 you must build it with:

./build/mvn -Pmesos -DskipTests clean package
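Once Spark has been rebuilt with the Mesos profile, the same master URL ('mesos://localhost:5050') should be accepted. Per Spark's running-on-mesos documentation, the Mesos native library also needs to be visible to the driver when you submit; a minimal sketch, where the library path, class and jar are illustrative placeholders:

export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
./bin/spark-submit --master mesos://localhost:5050 --class com.example.MyApp my-app.jar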
Source: https://habr.com/ru/post/1668454/