Multiple spark workers on the same Windows machine

I am trying to teach myself Spark through Scala using IntelliJ on Windows. I am doing this on a single machine, and I would like to run several workers on that machine to simulate a cluster. I read this page, which says:

"Launch scripts do not currently support Windows. To start the Spark cluster on Windows, run the wizard and workers manually."

I do not know what it means to start the master and workers manually. Can anyone help? Thanks so much for any help / suggestions.

1 answer

To start the Spark master, open a command prompt in %SPARK_HOME%\bin and run

spark-class org.apache.spark.deploy.master.Master

It prints a master URL of the form spark://ip:port; note it down.
The master's web UI is available at localhost:8080 by default.

To start a Spark worker, run the following, substituting the URL printed by the master:

spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

To run several workers, repeat this command in a new command prompt window for each worker; every worker registers with the master and shows up in the master's web UI.
After that, your application can connect to the cluster through the master URL, as in the sketch below.
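A minimal Scala sketch of such a connection from IntelliJ, assuming the standalone master listens on the default port 7077 on localhost (the object name ClusterSmokeTest and the URL are placeholders; use the spark://ip:port value your master actually printed):

import org.apache.spark.sql.SparkSession

object ClusterSmokeTest {
  def main(args: Array[String]): Unit = {
    // "spark://localhost:7077" is an assumed default; replace it with
    // the URL your master printed (also shown at localhost:8080).
    val spark = SparkSession.builder()
      .appName("ClusterSmokeTest")
      .master("spark://localhost:7077")
      .getOrCreate()

    // A trivial job spread over a few partitions, just to confirm
    // that the workers actually execute tasks.
    val count = spark.sparkContext.parallelize(1 to 1000, 4).count()
    println(s"Counted $count elements")

    spark.stop()
  }
}

If the job completes, the application should also appear in the master's web UI while it is running.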

