Why does the Spark Cassandra Connector fail with a NoHostAvailableException?

I'm having trouble getting the Spark Cassandra Connector working from Scala.

I use the following versions:

  • Scala 2.10.4
  • spark-core 1.0.2
  • cassandra-thrift 2.1.0 (my installed Cassandra is v2.1.0)
  • cassandra-clientutil 2.1.0
  • cassandra-driver-core 2.0.4 (recommended for connector?)
  • spark-cassandra-connector 1.0.0

I can connect to and talk with Cassandra (without Spark), and I can talk with Spark (without Cassandra), but the connector gives me:

com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /10.0.0.194:9042 (com.datastax.driver.core.TransportException: [/10.0.0.194:9042] Cannot connect))

What am I missing? Cassandra is a default installation (port 9042 for CQL according to cassandra.yaml). I am trying to connect locally ("local").
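For what it's worth, this is roughly the kind of standalone check that works for me against Cassandra (a sketch using only the Java driver, no Spark; the IP is the one from the error message and the port is the default 9042):

    import com.datastax.driver.core.Cluster

    // Plain Java driver connectivity check, no Spark involved.
    val cluster = Cluster.builder()
      .addContactPoint("10.0.0.194")
      .withPort(9042)
      .build()
    try {
      val session = cluster.connect()
      val version = session
        .execute("SELECT release_version FROM system.local")
        .one()
        .getString("release_version")
      println(s"Connected, Cassandra release_version = $version")
    } finally {
      cluster.close()
    }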

My code is:

    import org.apache.spark.{SparkConf, SparkContext}
    import com.datastax.spark.connector._

    val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
    val sc = new SparkContext("local", "test", conf)
    val rdd = sc.cassandraTable("myks", "users")
    val rr = rdd.first
    println(s"Result: $rr")
1 answer

"local" in this context refers to the Spark master (it tells Spark to run in local mode), not the Cassandra connection host.

To set the Cassandra connection host, you have to set a different property in the SparkConf:

    import org.apache.spark._

    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "IP Cassandra Is Listening On")
      .set("spark.cassandra.username", "cassandra") // Optional
      .set("spark.cassandra.password", "cassandra") // Optional

    val sc = new SparkContext("spark://Spark Master IP:7077", "test", conf)

https://github.com/datastax/spark-cassandra-connector/blob/master/doc/1_connecting.md
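
Since you are running everything on one machine, a version that keeps the Spark master in local mode but points the connector explicitly at the local node would look roughly like this (a sketch; 127.0.0.1 and local[*] are assumptions for your setup, the keyspace and table names are the ones from your question):

    import org.apache.spark.{SparkConf, SparkContext}
    import com.datastax.spark.connector._

    // Single-machine setup: Spark runs in local mode, and the connector
    // is told explicitly which host Cassandra is listening on.
    val conf = new SparkConf(true)
      .setAppName("Simple Application")
      .setMaster("local[*]")
      .set("spark.cassandra.connection.host", "127.0.0.1")

    val sc = new SparkContext(conf)
    val rdd = sc.cassandraTable("myks", "users")
    println(s"Result: ${rdd.first}")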

