Failed to start master for Spark on Windows

This is the same problem as "Failed to start master for Spark in Windows 10", which is also unresolved.

Spark itself works fine: I verified it with pyspark.cmd and spark-shell.cmd.
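For reference, that sanity check amounts to launching the shells that ship with the binary distribution; a minimal sketch, assuming the stock spark-1.6.1-bin-hadoop2.6 layout:

 :: run from %SPARK_HOME%; each opens an interactive shell with sc predefined
 bin\spark-shell.cmd
 bin\pyspark.cmd
 :: in either shell, sc.version should return 1.6.1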

After running .\sbin\start-master.sh I got:

 ps: unknown option -- o
 Try 'ps --help' for more information.
 starting org.apache.spark.deploy.master.Master, logging to C:\spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-%MY_USER_NAME%-PC.out
 ps: unknown option -- o
 Try 'ps --help' for more information.
 failed to launch org.apache.spark.deploy.master.Master:
 ========================================
 Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M
 full log in C:\spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-%MY_USER_NAME%-PC.out

I tried to visit the web interfaces: localhost:4040 works, but localhost:8080 cannot be reached.

I also found that .log files are created in the %SPARK_HOME%/logs folder. They all contain the same content:

Spark command:

 C:\Program Files\Java\jdk1.7.0_79\bin\java -cp C:\spark-1.6.1-bin-hadoop2.6/conf\;C:\spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-api-jdo-3.2.6.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-core-3.2.10.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip hahaha-PC --port 7077 --webui-port 8080
 ========================================
 Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M

Environment: Spark 1.6.1, Windows 10

Looking forward to your replies, and thanks for your time!

+5
4 answers

Just found the answer here: https://spark.apache.org/docs/1.2.0/spark-standalone.html

"Note: startup scripts are not currently supported by Windows. To start the Spark cluster on Windows, start the wizard and workers manually."

+5

The launch scripts located in %SPARK_HOME%\sbin do not support Windows. You need to start the master and the workers manually, as described below.

  • Go to %SPARK_HOME%\bin on the command line.

  • Run spark-class org.apache.spark.deploy.master.Master to start the master. This will print a URL of the form spark://ip:port.

  • Run spark-class org.apache.spark.deploy.worker.Worker spark://ip:port to start a worker. Make sure you use the URL obtained in the previous step.

  • Run spark-shell --master spark://ip:port to connect an application to the newly created cluster; a full session is sketched after this list.
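Put together, a typical session looks like this; a sketch based on the steps above, using the host name hahaha-PC and port 7077 from the question's logs (yours will differ):

 :: shell 1: start the master; it prints a spark://host:port URL and serves a UI on http://localhost:8080
 cd %SPARK_HOME%\bin
 spark-class org.apache.spark.deploy.master.Master

 :: shell 2: start a worker, pointing it at the URL the master printed
 cd %SPARK_HOME%\bin
 spark-class org.apache.spark.deploy.worker.Worker spark://hahaha-PC:7077

 :: shell 3: connect an application to the new cluster
 cd %SPARK_HOME%\bin
 spark-shell --master spark://hahaha-PC:7077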

+27

After executing spark-class org.apache.spark.deploy.master.Master, just go to http://localhost:8080 to get the ip:port, then open another shell to execute spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT.

+1

A little trick should help. I changed the JAVA_HOME path to its DOS 8.3 short form, for example C:\Progra~1\Java\jre1.8.0_131, and then rebooted. After that I was able to run the spark-class org.apache... commands mentioned above.
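A sketch of that workaround using standard cmd tools; note the short name Progra~1 is typical but machine-specific, so check it first:

 :: list the 8.3 short names under C:\ and find the entry shown next to "Program Files"
 dir /x C:\

 :: point JAVA_HOME at the space-free short path (takes effect in new consoles, or after the reboot mentioned above)
 setx JAVA_HOME C:\Progra~1\Java\jre1.8.0_131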

0
