How to filter on a blob column in the where clause using spark-connector-api?

I am trying to figure out how to filter on a blob column in the where clause. Any ideas?

For example, if I run the following query in cqlsh, it works:

select * from hello where id=0xc1c1795a0b;

// id is a blob column in Cassandra

I tried the following:

JavaRDD<CassandraRow> cassandraRowsRDD = javaFunctions(sc).cassandraTable("test", "hello")
        .select("range")
        .where("id=?", "0xc1c1795a0b");

This gave me a type conversion exception.

Then I tried this:

JavaRDD<CassandraRow> cassandraRowsRDD = javaFunctions(sc).cassandraTable("test", "hello")
        .select("range")
        .where("id=?", "0xc1c1795a0b".getBytes());

This did not give me any errors, but it returned no results, even though the same query in cqlsh returns plenty of rows. So I'm not sure how to pass a blob value in the where clause. I am using Java. Any ideas?
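Most likely, "0xc1c1795a0b".getBytes() produces the twelve ASCII bytes of the characters '0', 'x', 'c', ... rather than the five bytes the hex literal stands for, so the filter presumably matches nothing. A minimal plain-Java sketch of the difference (the class name BlobBytesDemo is made up, and DatatypeConverter assumes Java 8, where javax.xml.bind is still available):

import java.util.Arrays;
import javax.xml.bind.DatatypeConverter;

public class BlobBytesDemo {
    public static void main(String[] args) {
        // The bytes of the string itself: 12 bytes, one per character.
        byte[] asText = "0xc1c1795a0b".getBytes();
        // The decoded hex value: 5 bytes, which is what Cassandra stores in the blob.
        byte[] asBlob = DatatypeConverter.parseHexBinary("c1c1795a0b");
        System.out.println(Arrays.toString(asText)); // [48, 120, 99, 49, 99, 49, 55, 57, 53, 97, 48, 98]
        System.out.println(Arrays.toString(asBlob)); // [-63, -63, 121, 90, 11]
    }
}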

1 answer

Use this:

import com.datastax.driver.core.utils.Bytes;

JavaRDD<CassandraRow> cassandraRowsRDD = javaFunctions(sc).cassandraTable("test", "hello")
        .select("range")
        .where("id=?", Bytes.getArray(Bytes.fromHexString("0xc1c1795a0b")));

Source: https://habr.com/ru/post/1657219/

