Why does spark-shell fail with "was unexpected at this time"?

When you run the spark-shell command, the following error appears:

[screenshot of the error message]

I downloaded the spark-2.1.1-bin-hadoop2.7.tgz file from http://spark.apache.org/downloads.html, extracted the archive, and copied its contents into the C:\Spark directory. After that, I set up the environment variables for Spark and the JDK, but I still get this error. Any help would be appreciated.

+4
5 answers

I am pretty sure your JAVA_HOME environment variable contains a space, which breaks spark-shell. Please reinstall Java into a directory with no spaces in the path.


On Windows, spark-shell is launched through bin/spark-class2.cmd (which is in turn called from bin/spark-submit2.cmd). That script contains the line:

if "x%1"=="x" (

When the batch interpreter substitutes a JAVA_HOME that contains spaces into lines like this, the result becomes something like:

if "x"Files\Java\jdk1.8.0_45""=="x" (

which is not valid batch syntax, so cmd aborts with the "was unexpected at this time" error.

In short, the space in JAVA_HOME breaks the script. Reinstall Java into a directory with no spaces in its path and point JAVA_HOME there.
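The failure mode can be illustrated outside of cmd. The following is a minimal sketch (plain Python, a crude model rather than real cmd.exe parsing) of how an unquoted value with a space splits into two tokens, producing exactly the stray Files\Java\... fragment seen in the broken line above:

```python
# Minimal sketch: cmd.exe substitutes variables textually and splits
# unquoted text on spaces, so a spaced JAVA_HOME becomes two tokens.
java_home = r"C:\Program Files\Java\jdk1.8.0_45"
tokens = java_home.split(" ")  # crude model of batch tokenization
print(tokens)
# The stray second token, Files\Java\jdk1.8.0_45, is what leaks into
# the malformed `if` line and trips the batch parser.
```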

+5

This usually happens when Java is installed on Windows under Program Files, for example:

JAVA_HOME=C:\Program Files (x86)\Java\jdk1.8.0_162\bin

The space in "Program Files (x86)" causes the problem (on Windows 10 as well), and quoting the value as

"C:\Program Files (x86)\Java\jdk1.8.0_162\bin"

does not help. Install Java outside Program Files (x86) instead, for example:

JAVA_HOME=C:\java\jdk1.8.0_171\bin
+2

When you install Java, it goes into C:\Program Files\Java.. by default. If JAVA_HOME points there, the space in the path triggers this error. I reinstalled Java into C:\Java.. instead of "C:\Program Files\Java..", updated JAVA_HOME accordingly, and spark-shell started working. If you hit this error, check JAVA_HOME for spaces first.
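To catch this before launching spark-shell, a quick check can be run first. This is a hypothetical helper of my own (the name `java_home_ok` is not part of Spark), shown as a sketch:

```python
import os

def java_home_ok(path: str) -> bool:
    """Return True if the path is safe for Spark's Windows batch
    scripts: non-empty and free of spaces."""
    return bool(path) and " " not in path

# The default install location fails; a space-free one passes.
print(java_home_ok(r"C:\Program Files\Java\jdk1.8.0_171"))  # False
print(java_home_ok(r"C:\Java\jdk1.8.0_171"))                # True
print(java_home_ok(os.environ.get("JAVA_HOME", "")))
```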

+1

You can also set the Java path explicitly in spark-env.sh, making sure it contains no spaces:

export JAVA_HOME=/usr/lib/jvm/java-8-oracle
0

I fixed this error by using the short name "Progra~1", which is the 8.3 abbreviation Windows generates for "Program Files".

C:\Progra~1\Java\jdk1.8.0_161
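The short form works because the generated 8.3 name contains no space, which is all the batch scripts need. A trivial check (assuming Progra~1 is the short name on your machine; you can verify the actual short names with `dir /x C:\`):

```python
# The 8.3 short name side-steps the space entirely.
long_form = r"C:\Program Files\Java\jdk1.8.0_161"
short_form = r"C:\Progra~1\Java\jdk1.8.0_161"
assert " " in long_form and " " not in short_form
print("short form is space-free")
```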

0

Source: https://habr.com/ru/post/1017385/
