I'm new to Apache Spark, and I just found out that Spark supports three types of cluster managers:
- Standalone - Spark manages its own cluster
- YARN - uses the Hadoop YARN resource manager
- Mesos - Apache's resource management project
Since I'm new to Spark, I think I should try Standalone first. But I wonder which one is recommended. Say, in the future I need to build a large cluster (hundreds of instances) - which cluster manager should I go with?
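For context, the three cluster managers are selected via the `--master` URL passed to `spark-submit`; the application code itself stays the same. A sketch, where host names, ports, the class name, and the jar path are all placeholders:

```shell
# Standalone: point at the Spark master started with start-master.sh
# (spark://host:port; 7077 is the default port)
spark-submit --master spark://master-host:7077 \
  --class com.example.MyApp app.jar

# YARN: the ResourceManager address is read from the Hadoop config
# in HADOOP_CONF_DIR, so no host is given on the command line
spark-submit --master yarn --deploy-mode cluster \
  --class com.example.MyApp app.jar

# Mesos: point at the Mesos master (5050 is the default port)
spark-submit --master mesos://mesos-host:5050 \
  --class com.example.MyApp app.jar
```

So switching from Standalone to YARN or Mesos later is mostly a deployment change, not a code change.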
yarn apache-spark mesos apache-spark-standalone
davidshen84 Feb 22 '15 at 23:44