How to check the number of cores that Spark uses?

I have spark.cores.max set to 24 [3 worker nodes], but when I ssh into a worker node I see only one process [command = java] consuming memory and CPU. I suspect it is not using all 8 cores (on m2.4xlarge).
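For context, the setup described above might look roughly like this (a sketch only; the master URL and app name are placeholders, not taken from the original post):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical setup matching the description: 3 workers with 8 cores each,
// and spark.cores.max capping the total cores the application may claim.
val conf = new SparkConf()
  .setMaster("spark://master-host:7077") // placeholder standalone master URL
  .setAppName("my-app")                  // placeholder app name
  .set("spark.cores.max", "24")
val sc = new SparkContext(conf)
```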

How can I find out how many cores Spark is actually using?

1 answer

You can see the number of cores occupied on each worker in the cluster in the Spark master's web UI (in standalone mode, by default at http://<master-host>:8080).
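If you also want to check from inside the application, here is a minimal sketch using the standard Spark API (the app name is illustrative; nothing here comes from the original answer):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// A self-contained check; submit it with spark-submit against the cluster.
val sc = new SparkContext(new SparkConf().setAppName("core-check"))

// In standalone mode, defaultParallelism generally reflects the total
// number of cores granted to the application, so it is a quick sanity check:
println(s"defaultParallelism = ${sc.defaultParallelism}")

// The configured cap as the running application sees it:
println(s"spark.cores.max = ${sc.getConf.get("spark.cores.max", "not set")}")

sc.stop()
```

Each running application also has its own UI (by default on port 4040 of the driver node), where the Executors tab lists the cores assigned to every executor.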


Source: https://habr.com/ru/post/985154/

