"Multiple constructors with the same number of parameters" exception when saving data to Cassandra from Spark using Scala

Below is the code:

    import com.datastax.spark.connector._
    import org.apache.spark.SparkContext

    def findUniqueGroupInMetadata(sc: SparkContext): Unit = {
      val merchantGroup = sc.cassandraTable("local_pb", "merchant_metadata").select("group_name")
      try {
        // Keep only rows that actually have a group_name, then capitalize it.
        // getStringOption already returns None for null columns, so no extra
        // null check is needed inside the match.
        val filterByWithGroup = merchantGroup.filter { row =>
          row.getStringOption("group_name") match {
            case Some(_) => true
            case None    => false
          }
        }.map(row => row.getStringOption("group_name").get.capitalize)
        //filterByWithGroup.take(15).foreach(data => println("merchantGroup => " + data))
        filterByWithGroup.saveToCassandra("local_pb", "merchant_group", SomeColumns("group_name"))
      } catch {
        case e: Exception => e.printStackTrace()
      }
    }

}

Exception:

    java.lang.IllegalArgumentException: Multiple constructors with the same number of parameters not allowed.
        at com.datastax.spark.connector.util.Reflect$.methodSymbol(Reflect.scala:16)
        at com.datastax.spark.connector.util.ReflectionUtil$.constructorParams(ReflectionUtil.scala:63)
        at com.datastax.spark.connector.mapper.DefaultColumnMapper.<init>(DefaultColumnMapper.scala:45)
        at com.datastax.spark.connector.mapper.LowPriorityColumnMapper$class.defaultColumnMapper(ColumnMapper.scala:47)
        at com.datastax.spark.connector.mapper.ColumnMapper$.defaultColumnMapper(ColumnMapper.scala:51)
1 answer

I found the answer after reading a few blog posts.

When I converted the RDD[String] to RDD[Tuple1[String]], everything worked. So basically, to save data to Cassandra, the data needs to be of type RDD[TupleX[String]] (where X can be 1, 2, 3, ...), or of type RDD[SomeCaseClass].
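For illustration, here is a minimal sketch of that fix applied to the code from the question (the method name saveUniqueGroups and the flatMap-based filtering are my own rearrangement, not from the original post):

    import com.datastax.spark.connector._
    import org.apache.spark.SparkContext

    def saveUniqueGroups(sc: SparkContext): Unit = {
      val groups = sc.cassandraTable("local_pb", "merchant_metadata")
        .select("group_name")
        .flatMap(_.getStringOption("group_name")) // drops missing/null values
        .map(_.capitalize)

      // Wrapping each String in Tuple1 yields RDD[Tuple1[String]], which the
      // connector's column mapper can map onto the single-column target table.
      // java.lang.String itself has several constructors with the same number
      // of parameters, which is what triggered the exception above.
      groups.map(Tuple1(_))
        .saveToCassandra("local_pb", "merchant_group", SomeColumns("group_name"))
    }

A case class such as case class MerchantGroup(group_name: String) should work the same way, since the connector maps its fields to table columns by name.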


Source: https://habr.com/ru/post/1260962/

