Kafka JSON serialization

I'm new to Kafka, serialization, and JSON.

WHAT I want: the producer sends a JSON file through Kafka, and the consumer consumes it and works with the JSON in its original form.

I was able to get this working by converting the JSON to a string, sending it via the StringSerializer, and having the consumer parse the string and recreate the JSON object. But I'm worried that this is inefficient or not the correct method (field types of the JSON may be lost).
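For reference, here is a minimal sketch of that string round trip as I understand it, assuming the legacy kafka.javaapi.producer API (the same one my send code below uses), org.json on both sides, and a broker on localhost:9092:

    import java.util.Properties;
    import kafka.javaapi.producer.Producer;
    import kafka.producer.KeyedMessage;
    import kafka.producer.ProducerConfig;
    import org.json.JSONObject;

    // Producer side: serialize the JSONObject to its JSON text and send it
    // with the built-in StringEncoder.
    Properties props = new Properties();
    props.put("metadata.broker.list", "localhost:9092"); // assumed broker address
    props.put("serializer.class", "kafka.serializer.StringEncoder");
    Producer<String, String> producer = new Producer<String, String>(new ProducerConfig(props));

    JSONObject record = new JSONObject();
    record.put("name", "Kate");
    record.put("age", 25);
    producer.send(new KeyedMessage<String, String>("my-topic", record.toString()));

    // Consumer side: rebuild the JSONObject from the received string.
    // Field types survive this trip, since the JSON text itself keeps 25 as a number.
    JSONObject restored = new JSONObject(receivedMessage);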

So, I decided to create a JSON serializer and set it in my producer configuration.

I used the JsonEncoder from here: Kafka: writing a custom serializer

But when I try to run my producer now, it seems that in the encoder's toBytes function, the try block never returns anything, as I expect it to:

    try {
        bytes = objectMapper.writeValueAsString(object).getBytes();
    } catch (JsonProcessingException e) {
        logger.error(String.format("Json processing failed for object: %s",
                object.getClass().getName()), e);
    }

It seems objectMapper.writeValueAsString(object).getBytes(); takes my JSON object ( {"name":"Kate","age":25} ) and converts it to nothing.
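My current guess (just an assumption, not something I've confirmed): Jackson's ObjectMapper introspects org.json.JSONObject as a plain bean, and since JSONObject exposes no bean-style getters, writeValueAsString either fails or produces nothing useful, so the catch block runs and bytes stays null. Here is a sketch of an encoder that sidesteps Jackson by using JSONObject's own toString(), assuming the legacy kafka.serializer.Encoder interface:

    import java.nio.charset.StandardCharsets;
    import kafka.serializer.Encoder;
    import kafka.utils.VerifiableProperties;
    import org.json.JSONObject;

    // Encodes a JSONObject by letting it print itself; no ObjectMapper is
    // involved, so there is no bean introspection to go wrong.
    public class JsonObjectEncoder implements Encoder<JSONObject> {

        // The legacy producer instantiates encoders reflectively via this constructor.
        public JsonObjectEncoder(VerifiableProperties props) {
        }

        @Override
        public byte[] toBytes(JSONObject object) {
            return object.toString().getBytes(StandardCharsets.UTF_8);
        }
    }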

This is the code that launches my producer:

    List<KeyedMessage<String, JSONObject>> msgList = new ArrayList<KeyedMessage<String, JSONObject>>();
    JSONObject record = new JSONObject();
    record.put("name", "Kate");
    record.put("age", 25);
    msgList.add(new KeyedMessage<String, JSONObject>(topic, record));
    producer.send(msgList);

What am I missing? Is my original method (convert to string, send, then restore the JSON object) OK, or is it just not the right way?

Thanks!

+5
3 answers

Hmm, why are you afraid that the serialize/deserialize step will lead to data loss?

One option is to use the Kafka JSON serializer that is included in the Confluent Schema Registry, which is free and open source (disclaimer: I work at Confluent). Its test suite contains several examples to help you get started, and further details are described in the docs on serializers and formatters. The advantage of this JSON serializer and the schema registry itself is that they provide transparent integration with Kafka producers and consumers. Besides JSON, Apache Avro is also supported there, if you need it.

IMHO, this setup is one of the best options in terms of developer convenience and ease of use when talking to Kafka in JSON - but, of course, YMMV!
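For illustration, a minimal sketch of wiring Confluent's JSON serializer into the modern producer API (class names taken from the schema-registry project; check them against the version you depend on, and the broker address is an assumption):

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class JsonProducerExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed broker
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            // Confluent's serializer maps a Map or POJO to JSON bytes via Jackson.
            props.put("value.serializer",
                    "io.confluent.kafka.serializers.KafkaJsonSerializer");

            KafkaProducer<String, Map<String, Object>> producer = new KafkaProducer<>(props);
            Map<String, Object> record = new HashMap<>();
            record.put("name", "Kate");
            record.put("age", 25);
            producer.send(new ProducerRecord<>("my-topic", record));
            producer.close();
        }
    }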

+3

I would suggest converting the event string (which is JSON) to a byte array, for example:

    byte[] eventBody = event.getBody();

This will improve your performance, and on the consumer side a JSON parser will help you get your JSON back.
Please let me know if any further information is needed.
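A minimal sketch of that round trip, assuming Kafka's standard ByteArraySerializer on the producer and org.json on the consumer (the variable names are placeholders):

    import java.nio.charset.StandardCharsets;
    import org.json.JSONObject;

    // Producer side: JSON text to raw bytes (pairs with the built-in
    // org.apache.kafka.common.serialization.ByteArraySerializer).
    byte[] eventBody = record.toString().getBytes(StandardCharsets.UTF_8);

    // Consumer side: decode the bytes and re-parse them into a JSONObject.
    JSONObject restored = new JSONObject(new String(consumedBytes, StandardCharsets.UTF_8));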

+1
