Why does the Spark history server not show completed applications in local-cluster mode?

I am testing a Spark program in local-cluster mode. I set the spark.home property to $SPARK_HOME (pointing to the Spark installation directory), and I started the Spark history server with spark.history.fs.logDirectory pointing to file:/tmp/spark-events. However, after the test completed successfully and exited, the history server shows nothing, and the /tmp/spark-events folder is empty. How do I make the Spark history server recognize my Spark program running in local-cluster mode?

1 answer

Enable event logging in your Spark tests by setting the spark.eventLog.enabled property to true (and point spark.eventLog.dir at the same directory the history server reads from), then rerun the application.
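To make this concrete, here is a minimal sketch of both ways to set the property. The master URL, memory values, and application file name are illustrative placeholders; the Spark property names themselves are the standard ones.

```shell
# Option 1: enable event logging for every application by adding these
# lines to $SPARK_HOME/conf/spark-defaults.conf:
#
#   spark.eventLog.enabled         true
#   spark.eventLog.dir             file:/tmp/spark-events
#   spark.history.fs.logDirectory  file:/tmp/spark-events

# Option 2: enable it for a single run via spark-submit
# (my_test_app.py is a placeholder for your application):
spark-submit \
  --master "local-cluster[2,1,1024]" \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=file:/tmp/spark-events \
  my_test_app.py

# Make sure /tmp/spark-events exists before running,
mkdir -p /tmp/spark-events

# then (re)start the history server and open http://localhost:18080
$SPARK_HOME/sbin/start-history-server.sh
```

Note that spark.eventLog.dir (where applications write logs) and spark.history.fs.logDirectory (where the history server reads them) must point at the same location for completed runs to appear.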


Source: https://habr.com/ru/post/1612152/
