IOException: Cannot run program "javac" when running "sudo ./sbt/sbt compile" in Spark?

I am installing Apache Spark, which uses its own copy of SBT to build itself.

I am using Linux Mint in a virtual machine.

Here is the error snippet I get when I run sudo ./sbt/sbt compile from the Spark directory spark-0.9.0-incubating:

[error] (core/compile:compile) java.io.IOException: Cannot run program "javac": error=2, No such file or directory

[error] Total time: 181 s, completed Mar 9, 2014 12:48:03 PM

I can run java and javac from the command line just fine. For example, javac -version gives javac 1.6.0_31.

The correct jdk1.6.0_31/bin directory is in my PATH.
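
For reference, this is how I checked in bash (nothing Spark-specific, just inspecting PATH and which binary resolves):

echo "$PATH" | tr ':' '\n' | grep jdk
which javac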

I read that the error may be related to the 64-bit JDK that I installed, but I get the same error with the 32-bit JDK.

How can I solve this problem?

Edit: I am using the bash shell.

+4
2

Hmm, strange. A few guesses, since this is hard to diagnose from here.

Where did your java and javac come from, and how did you install them? If you installed the JDK by hand, under your home directory for example, it may be visible to your user but not to root.
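
A quick way to see exactly which binary your shell picks up, and whether it is a symlink into a per-user install (readlink -f follows the symlink chain to the real file):

which javac
readlink -f "$(which javac)"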

When you run sudo ./sbt/sbt compile, the command executes as root (that is what sudo does), with root's environment rather than yours, and root's environment may well not know where javac and java live.
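
You can observe the difference directly by comparing your PATH with the one sudo hands to the command. A minimal check, assuming a default sudoers setup:

echo "$PATH"              # your user's PATH
sudo sh -c 'echo $PATH'   # the PATH a command run under sudo actually sees
sudo which javac          # likely fails if root's PATH lacks the JDK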

Make sure jdk1.6.0_31/bin is in root's PATH as well, or install Java system-wide so that every user, including root, sees it.
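
Two ways to do that, sketched below; /opt/jdk1.6.0_31 is a placeholder for wherever your JDK actually lives. The first passes your current PATH through sudo for a single invocation; the second registers the JDK system-wide on Debian-based systems such as Mint:

sudo env "PATH=$PATH" ./sbt/sbt compile

sudo update-alternatives --install /usr/bin/javac javac /opt/jdk1.6.0_31/bin/javac 1
sudo update-alternatives --install /usr/bin/java java /opt/jdk1.6.0_31/bin/java 1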

Also check that JAVA_HOME points at jdk1.6.0_31; some Java tooling locates the JDK through it rather than through PATH.
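
For example, in your ~/.bashrc (and root's, if you keep using sudo); again, /opt/jdk1.6.0_31 stands in for your real install path:

export JAVA_HOME=/opt/jdk1.6.0_31
export PATH="$JAVA_HOME/bin:$PATH"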

Alternatively, run ./sbt/sbt without sudo at all, so that your own PATH and JAVA_HOME are used.
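
If you only reached for sudo because of file permissions inside the Spark directory, taking ownership of the tree and dropping sudo is usually cleaner (a sketch, run from inside spark-0.9.0-incubating):

sudo chown -R "$USER" .   # make the source tree yours
./sbt/sbt compile         # now runs with your own PATH and JAVA_HOME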

+3

You need to install javac. On Ubuntu (and Ubuntu-based Mint):

sudo apt-get install openjdk-7-jdk

That puts javac on the system path for every user, root included.
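
You can then confirm that both the compiler and the runtime resolve:

which javac && javac -version
which java && java -version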

+2

Source: https://habr.com/ru/post/1530864/
