I am reading a Spark book, which says:
Driver programs launch Spark through the SparkContext object, which represents a connection to a computing cluster. In the shell, a SparkContext is automatically created for you as the variable sc. Try printing sc to see its type.
When I enter sc, it gives me an error: 20: not found: value sc. Any idea why sc is not being created automatically in my Scala shell?
I then tried to create sc manually, but this gave me an error saying that the JVM already has a SparkContext. See figure:
http://s30.photobucket.com/user/kctestingeas1/media/No%20Spark%20Context.jpg.html
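For reference, a minimal sketch of what I was attempting. (This uses Spark's real `SparkContext.getOrCreate` API, which returns the shell's existing context instead of constructing a second one; the app name is just a placeholder.)

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Constructing `new SparkContext(conf)` inside spark-shell fails, because
// the shell has already started one and only a single SparkContext may
// run per JVM. getOrCreate reuses the existing context if there is one.
val conf = new SparkConf().setAppName("shell-test") // placeholder app name
val sc = SparkContext.getOrCreate(conf)
```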
I believe I am already inside the Spark shell, as you can see at the top of my cmd window, which shows bin\spark-shell.
Please advise. Thanks.