I am using Scala 2.10.5, Cassandra 3.0, and Spark 1.6. I want to insert data into Cassandra, so I tried the original example:
scala> val collection = sc.parallelize(Seq(("cat", 30), ("fox", 40)))
scala> collection.saveToCassandra("test", "words", SomeColumns("word", "count"))
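For context, this is roughly the setup I understand that example assumes: the spark-cassandra-connector on the classpath, the connection host set on the SparkConf, and the connector implicits imported (127.0.0.1 below is just a placeholder for my node's address):

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

// 127.0.0.1 is a placeholder for the Cassandra node's address
val conf = new SparkConf(true)
  .setAppName("csv-to-cassandra")
  .set("spark.cassandra.connection.host", "127.0.0.1")
val sc = new SparkContext(conf)

// saveToCassandra comes from the connector implicits imported above
val collection = sc.parallelize(Seq(("cat", 30), ("fox", 40)))
collection.saveToCassandra("test", "words", SomeColumns("word", "count"))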
That works, and I am able to insert data into Cassandra. Next, I have a CSV file that I want to insert into a Cassandra table matching this schema:
val person = sc.textFile("hdfs://localhost:9000/user/hduser/person")

import org.apache.spark.sql._
import org.apache.spark.sql.types._

val schema = StructType(Array(
  StructField("firstName", StringType, true),
  StructField("lastName", StringType, true),
  StructField("age", IntegerType, true)))

val rowRDD = person.map(_.split(",")).map(p => Row(p(0), p(1), p(2).toInt))
val personSchemaRDD = sqlContext.applySchema(rowRDD, schema)
personSchemaRDD.saveToCassandra
When I call saveToCassandra, I get an error saying that saveToCassandra is not a member of personSchemaRDD. So I tried a different way:
df.write
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "words_copy", "keyspace" -> "test"))
  .save()
But now I am receiving a "cannot connect to Cassandra on ip:port" error. Can someone tell me the best way to do this? I need to periodically save data from files into Cassandra. Below are sketches of the two routes I am considering, in case that clarifies what I am after.
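One route is to skip DataFrames entirely and save the parsed CSV lines as an RDD of tuples, the same way the working words example does. This is only a sketch; the keyspace "test", the table "person", and the column names first_name / last_name / age are placeholders for my real schema:

import com.datastax.spark.connector._

// Parse each CSV line into a (String, String, Int) tuple
val personTuples = sc.textFile("hdfs://localhost:9000/user/hduser/person")
  .map(_.split(","))
  .map(p => (p(0), p(1), p(2).toInt))

// Save the tuples column-by-column, like the words example above
personTuples.saveToCassandra("test", "person",
  SomeColumns("first_name", "last_name", "age"))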
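The other route is the DataFrame write shown above, but building the DataFrame with createDataFrame (applySchema is deprecated in Spark 1.6) and making sure spark.cassandra.connection.host is set on the SparkConf as in the first sketch. Again only a sketch, with placeholder keyspace and table names:

import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.types._

val sqlContext = new SQLContext(sc)

val rowRDD = sc.textFile("hdfs://localhost:9000/user/hduser/person")
  .map(_.split(","))
  .map(p => Row(p(0), p(1), p(2).toInt))

val schema = StructType(Array(
  StructField("firstName", StringType, true),
  StructField("lastName", StringType, true),
  StructField("age", IntegerType, true)))

val personDF = sqlContext.createDataFrame(rowRDD, schema)

// Write through the connector's data source; "test"/"person" are placeholders
personDF.write
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "person", "keyspace" -> "test"))
  .save()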