I am installing Apache Spark, which builds using its own bundled copy of SBT.
I am using Linux Mint in a virtual machine.
Here is an error snippet from running sudo ./sbt/sbt compile from the Spark directory spark-0.9.0-incubating:
[error] (core/compile:compile) java.io.IOException: Cannot run program "javac": error=2, No such file or directory
[error] Total time: 181 s, completed Mar 9, 2014 12:48:03 PM
I can run java and javac from the command line just fine. For example, javac -version gives javac 1.6.0_31.
The correct jdk1.6.0_31/bin directory is in my PATH.
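For reference, this is roughly how the PATH is set in my ~/.bashrc (the exact install location below is illustrative, not copied verbatim from my machine):

# Illustrative ~/.bashrc excerpt -- the JDK install path may differ on my system
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_31
export PATH=$PATH:$JAVA_HOME/bin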
I read that the error may be related to the 64-bit JDK that I installed, but I get the same error with the 32-bit JDK.
How can I solve the problem?
Edit: I am using the bash shell.
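If it helps, these are the kinds of checks I can run to compare what my normal shell sees with what a sudo-launched process sees (I have not captured their output here):

# Diagnostic commands -- compare the environment with and without sudo
echo $PATH             # PATH in my normal bash shell
sudo env | grep PATH   # PATH that processes started via sudo see
which javac            # where javac resolves in my shell
sudo which javac       # where (or whether) javac resolves under sudo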