Apache Flink: reading data from Kafka as an array of bytes

How can I read data from Kafka in format byte[]?

I have an implementation that reads events as String with SimpleStringSchema(), but I could not find a schema for reading data as byte[].

Here is my code:

    Properties properties = new Properties();
    properties.setProperty("bootstrap.servers", "kafka1:9092");
    properties.setProperty("zookeeper.connect", "zookeeper1:2181");
    properties.setProperty("group.id", "test");
    properties.setProperty("key.deserializer","org.apache.kafka.common.serialization.StringDeserializer");
    properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
    properties.setProperty("auto.offset.reset", "earliest");
    DataStream<byte[]> stream = env
                .addSource(new FlinkKafkaConsumer010<byte[]>("testStr", ? ,properties));
1 answer

Finally, I found a solution:

    DataStream<byte[]> stream = env
                .addSource(new FlinkKafkaConsumer010<>("testStr", new AbstractDeserializationSchema<byte[]>() {
                    @Override
                    public byte[] deserialize(byte[] bytes) throws IOException {
                        return bytes;
                    }
                }, properties));
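If the schema is needed in more than one job, the anonymous class can be pulled out into a named one. A minimal sketch, assuming Flink is on the classpath; the class name ByteArraySchema is my own choice, and the import path of AbstractDeserializationSchema may differ in older Flink versions (it used to live under org.apache.flink.streaming.util.serialization):

```java
import java.io.IOException;

import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;

// Passes the raw Kafka record value through unchanged.
// AbstractDeserializationSchema supplies the TypeInformation for byte[],
// so no extra type hints are needed at the source.
public class ByteArraySchema extends AbstractDeserializationSchema<byte[]> {
    @Override
    public byte[] deserialize(byte[] message) throws IOException {
        return message;
    }
}
```

The source then becomes a one-liner:

    DataStream<byte[]> stream = env
                .addSource(new FlinkKafkaConsumer010<>("testStr", new ByteArraySchema(), properties));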

Source: https://habr.com/ru/post/1689849/
