How can I work with Spark without installing Scala?

I downloaded Spark 1.2.0 (pre-built for Hadoop 2.4). Its quick start doc says:

It is available in either Scala or Python.

What puzzles me is that my computer (OS X 10.10) has never had Scala installed separately, yet when I type spark-shell, it works fine and the output shows:

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_25)

as shown in the screenshot:

[screenshot: spark-shell startup output showing the Scala version banner]

I have not previously installed the Scala distribution.

How can the Spark shell work without Scala being installed?

+4
2 answers

tl;dr: The Scala binaries are already included in Spark (to make life easier for Spark users).
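You can verify this from the shell itself. A minimal check you can paste into spark-shell, assuming a stock pre-built Spark 1.2.0 distribution (the values mentioned in the comments are only illustrative):

// The Scala version bundled with this Spark build.
scala.util.Properties.versionString
// e.g. "version 2.10.4", matching the banner printed at startup

// Where the Scala library classes are loaded from: typically a jar inside the
// Spark distribution itself, not a system-wide Scala installation.
scala.Predef.getClass.getProtectionDomain.getCodeSource.getLocation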

Under Downloading in Spark's Overview you can read what is required to run Spark:

Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS). It is easy to run locally on one machine; all you need is to have java installed on your system PATH, or the JAVA_HOME environment variable pointing to a Java installation.

Spark runs on Java 6+ and Python 2.6+. For the Scala API, Spark 1.2.0 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).
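In other words, the only external requirement is a JVM. A small sketch, again pasted into spark-shell, showing which Java installation it actually picked up (these are standard JVM system properties; the example value is only illustrative):

// The JVM running spark-shell (and the bundled Scala along with it).
System.getProperty("java.version")   // e.g. 1.8.0_25, as in the startup banner
System.getProperty("java.home")      // the Java installation Spark resolved
sys.env.get("JAVA_HOME")             // Some(path) if JAVA_HOME is set, None otherwise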

+4

Scala programs, including spark-shell, are compiled to Java bytecode, which can be run by the Java Virtual Machine (JVM). Therefore, as long as you have a JVM installed, meaning the java command is available, you can run Spark tools written in Scala.
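To make that concrete, here is a minimal sketch you can paste into spark-shell; it relies only on standard Java reflection, which works on Scala values precisely because they are ordinary JVM objects:

// Scala code compiles to ordinary JVM bytecode, so plain Java reflection
// sees Scala objects exactly like Java objects.
val xs = List(1, 2, 3)
xs.getClass.getName             // a regular JVM class name from the Scala library
xs.getClass.getSuperclass       // its direct superclass, itself a plain JVM class
xs.getClass.getMethods.length   // ordinary methods, visible via java.lang.reflect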

0

Source: https://habr.com/ru/post/1568525/

