I am working on a Spark Streaming job, and when I set the application name (a more readable string) for it, the name does not appear in the Hadoop applications UI. Instead, I always see the class name as the application name in the Hadoop interface:
val sparkConf = new SparkConf().setAppName("BetterName")
How do I set a job name in Spark so that it appears in this Hadoop interface?
Hadoop URL for running applications: http://localhost:8088/cluster/apps/RUNNING
[update]
It seems that this problem only affects Spark Streaming jobs; I could not find a way to fix it.
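For reference, here is a minimal sketch of how I create the streaming context (the object name, batch interval, and the socket source are just placeholders, not my actual pipeline):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object BetterNameApp {
  def main(args: Array[String]): Unit = {
    // The app name is set on the SparkConf before the StreamingContext
    // (and its underlying SparkContext) is created
    val sparkConf = new SparkConf().setAppName("BetterName")
    val ssc = new StreamingContext(sparkConf, Seconds(10))

    // Placeholder source and output, so the context has at least one
    // registered output operation and can start
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}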