I have Spark 1.6.2 and Spark 2.0 installed on my Hortonworks cluster.
Both versions are installed on a node of a 5-node Hadoop cluster.
Every time I run spark-shell, I get:
$ spark-shell
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
When I check the version, I get:
scala> sc.version
res0: String = 1.6.2
How do I run the other version (the Spark 2.0 spark-shell)?
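
For context, the warning itself points at the SPARK_MAJOR_VERSION environment variable. A minimal sketch of what I expect should select the Spark 2.0 shell, assuming the HDP convention that SPARK_MAJOR_VERSION=2 maps to the Spark 2.0 install:

$ export SPARK_MAJOR_VERSION=2   # tell the launcher to pick Spark 2 instead of the Spark 1 default
$ spark-shell                    # sc.version should now report a 2.0.x string rather than 1.6.2

Is this the right approach, or is there a separate spark-shell binary I should invoke for Spark 2.0?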