Spark Context is not created automatically in Scala Spark Shell

I was reading the Spark book:

Driver programs launch Spark through the SparkContext object, which is a connection to a computing cluster. In the shell, a SparkContext is automatically created for you as a sc variable. Try printing sc to see its type


When I enter sc, it gives me an error like `<console>:20: error: not found: value sc`. Any idea why sc is not being created automatically in my Scala shell?

I tried to create sc manually, but that gave me an error saying the JVM already has a SparkContext. See the figure:

http://s30.photobucket.com/user/kctestingeas1/media/No%20Spark%20Context.jpg.html

I believe I am already in the Scala Spark shell, as you can see at the top of my cmd window, which shows bin\spark-shell.

Please advise. Thanks.

1 answer

I hope you find the answer to your question, because I am also facing the same problem.

In the meantime, use this workaround. In the Scala Spark shell, enter:

  • import org.apache.spark.SparkContext
  • val sc = SparkContext.getOrCreate()

Then you have access to sc.
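
As a quick sanity check (a minimal sketch; the sample numbers are just illustrative), you can verify the recovered sc works by building a small RDD and counting its elements:

  • val data = sc.parallelize(Seq(1, 2, 3, 4))  // distribute a local collection as an RDD
  • data.count()  // should return 4 if sc is functioning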


Source: https://habr.com/ru/post/1014604/

