I am trying to get the DataStax Spark Cassandra Connector working. I created a new SBT project in IntelliJ and added a single class. The class and my sbt file are given below. Creating a Spark context seems to work; however, the moment I uncomment the line where I try to create a cassandraTable, I get the following compilation error:
Error: scalac: bad symbolic reference. A signature in CassandraRow.class refers to term catalyst in package org.apache.spark.sql which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling CassandraRow.class.
Sbt is new to me, and I would appreciate any help in understanding what this error means (and, of course, how to resolve it).
name := "cassySpark1" version := "1.0" scalaVersion := "2.10.4" libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0" libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc() libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector-java" % "1.1.0-alpha2" withSources() withJavadoc() resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
And my class:
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

object HelloWorld {
  def main(args: Array[String]): Unit = {
    System.setProperty("spark.cassandra.query.retry.count", "1")

    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "cassandra-hostname")
      .set("spark.cassandra.username", "cassandra")
      .set("spark.cassandra.password", "cassandra")

    val sc = new SparkContext("local", "testingCassy", conf)

    // This is the line that triggers the compile error when uncommented:
    // val foo = sc.cassandraTable("keyspace name", "table name")

    val rdd = sc.parallelize(1 to 100)
    val sum = rdd.reduce(_ + _)
    println(sum)
  }
}
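For context, once the compile error is gone, this is roughly the kind of read I'm hoping to do (the keyspace and table names below are placeholders, not my real schema):

// Placeholder keyspace/table names, for illustration only.
val rows = sc.cassandraTable("my_keyspace", "my_table")
println(rows.count())           // total number of rows in the table
rows.take(10).foreach(println)  // print a few CassandraRow objects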