How to use two versions of the Spark shell?

I have Spark 1.6.2 and Spark 2.0 installed on my hortonworks cluster.

Both of these versions are installed on a node in a 5-node Hadoop cluster.

Every time I run spark-shell, I get:

$ spark-shell
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default

When I check the version, I get:

scala> sc.version
res0: String = 1.6.2

How do I run the other version (the Spark 2.0 spark-shell)?

+9
5 answers
export SPARK_MAJOR_VERSION=2 

You just need to set the major version to 2 or 1.

$ export SPARK_MAJOR_VERSION=2
$ spark-submit --version
SPARK_MAJOR_VERSION is set to 2, using Spark2
Welcome to
   ____              __
  / __/__  ___ _____/ /__
 _\ \/ _ \/ _ `/ __/  '_/
/___/ .__/\_,_/_/ /_/\_\   version 2.0.0.2.5.0.0-1245
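To double-check which version an interactive shell actually picked up, you can also inspect the context once the shell starts. A minimal check (the exact build suffix in the version string will differ per HDP build):

$ export SPARK_MAJOR_VERSION=2
$ spark-shell
SPARK_MAJOR_VERSION is set to 2, using Spark2
...
scala> sc.version
res0: String = 2.0.0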
+15

Using this approach:

spark-shell

loads Spark 1.6,

while typing

spark2-shell

loads Spark 2.0.
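A quick way to confirm you are really in the Spark 2 shell: besides sc, the Spark 2 REPL also defines a SparkSession named spark, so you can print its version (this object does not exist in the 1.6 shell):

$ spark2-shell
...
scala> spark.version
res0: String = 2.0.0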

+2
$ SPARK_MAJOR_VERSION=2 spark-shell
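That prefix form sets the variable only for that single invocation. If you want Spark 2 to be the default for your user, a small sketch (assuming a bash login shell) is to export it from your profile:

$ echo 'export SPARK_MAJOR_VERSION=2' >> ~/.bashrc
$ source ~/.bashrc
$ spark-shell    # now picks Spark2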
+1

You can also select the version via the SPARK_MAJOR_VERSION environment variable.

For example:

export SPARK_MAJOR_VERSION=2

Then launch the Spark shell. That way, you can switch the version to suit your needs.

0

Use spark2-submit, pyspark2, or spark2-shell.
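The same applies to batch jobs: submit through the Spark 2 wrapper. For example, running the stock SparkPi example might look like the sketch below (the examples-jar path is an assumption and varies by HDP release):

$ spark2-submit --class org.apache.spark.examples.SparkPi \
      --master yarn \
      /usr/hdp/current/spark2-client/examples/jars/spark-examples_*.jar 10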

0

Source: https://habr.com/ru/post/1663704/
