Event Sourcing with Kinesis - Play and Save

I am trying to implement an event-driven architecture using Amazon Kinesis as the central event log of the platform. The idea is largely in line with Nordstrom's Hello, Retail! project.

I have done similar things with Apache Kafka before, but Kinesis seems to be a cost-effective alternative to Kafka, so I decided to give it a shot. However, I am encountering some issues related to saving and replaying events. I have two questions:

  • Do you use Kinesis for such a use case, or would you recommend it for one?
  • Since Kinesis cannot retain events forever (unlike, for example, Kafka), how do I handle replays from consumers?

I am currently using a Lambda function (Firehose is also an option) to save all events to Amazon S3. A consumer could then read past events from that repository and afterwards start listening to new events coming from the stream. But I am not happy with this solution: consumers cannot use Kinesis checkpoints (the equivalent of Kafka consumer offsets). In addition, the Java KCL does not yet support AFTER_SEQUENCE_NUMBER, which would be useful in such an implementation.
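For reference, the Lambda-to-S3 archiving step can be sketched like this. This is a minimal Python sketch, not the actual implementation: the bucket name, the date-partitioned key layout, and the injectable `s3_put` parameter are all assumptions made for illustration.

```python
import base64
from datetime import datetime, timezone


def build_s3_key(stream_name, record, now=None):
    """Build a date-partitioned S3 key (hypothetical layout) so the
    archive can later be scanned in order or queried by partition."""
    now = now or datetime.now(timezone.utc)
    return (
        f"{stream_name}/"
        f"year={now:%Y}/month={now:%m}/day={now:%d}/"
        f"{record['kinesis']['sequenceNumber']}.json"
    )


def handler(event, context, s3_put=None):
    """Lambda handler for a Kinesis trigger.

    `s3_put` is injectable for testing; in production you would pass
    boto3's s3_client.put_object. Kinesis delivers record payloads
    base64-encoded, so they are decoded before archiving.
    """
    keys = []
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        key = build_s3_key("events", record)
        if s3_put:
            s3_put(Bucket="my-event-archive", Key=key, Body=payload)
        keys.append(key)
    return {"archived": keys}
```

Injecting the writer keeps the sketch runnable without AWS credentials; swapping in a real boto3 client is a one-line change at the call site.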

1 answer

Yes, Kinesis is a reasonable choice here, provided you archive events to S3; Kinesis Firehose is the natural tool for that.

Kinesis Streams only retains records for a limited window, so the stream itself cannot serve as a permanent event store.

Kinesis Firehose can deliver everything that flows through the stream to S3 in batches. Once the events are in S3, you can also query the archive directly with Amazon Athena or load it into Amazon Redshift for analytics.
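To make the S3 archive queryable from Athena, you register an external table over it. The DDL below is a hypothetical sketch: the table name, single-column JSON schema, partition layout, and bucket/prefix are assumptions, not something the answer specifies.

```python
def athena_table_ddl(bucket: str, prefix: str) -> str:
    """Return DDL for an external Athena table over the S3 event
    archive. The schema and partition columns are illustrative only;
    they assume a year/month/day key layout under the given prefix."""
    return (
        "CREATE EXTERNAL TABLE IF NOT EXISTS events (payload string)\n"
        "PARTITIONED BY (year string, month string, day string)\n"
        "ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'\n"
        f"LOCATION 's3://{bucket}/{prefix}/'"
    )


# In practice this string would be submitted through the Athena API
# (boto3's start_query_execution); that call is omitted here so the
# sketch stays self-contained.
```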

That split covers both kinds of consumers: real-time consumers read from the Kinesis stream itself, while batch and analytical consumers work from the S3 archive that Kinesis Firehose produces.

Since Kinesis alone cannot keep events indefinitely, the Firehose-to-S3 archive is what gives you a durable, complete event log.

For replay, a consumer reads the archived events from S3 first and then switches over to the live stream for new events, which matches the approach you described.
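The replay-then-tail pattern can be sketched source-agnostically. Here `list_archived` and `fetch_live` are hypothetical callables standing in for the S3 reader and the Kinesis consumer; note that while the Java KCL does not expose AFTER_SEQUENCE_NUMBER, the low-level GetShardIterator API does support it.

```python
def replay_then_tail(list_archived, fetch_live):
    """Yield all archived events first (from S3, in key order), then
    switch to live events from the stream.

    Both sources are injected callables, so this sketch stays
    backend-agnostic. `fetch_live` is assumed to accept the last
    archived sequence number so it can resume the stream just past
    the archive (e.g. via GetShardIterator with the
    AFTER_SEQUENCE_NUMBER iterator type).
    """
    last_seq = None
    for seq, payload in list_archived():
        last_seq = seq
        yield payload
    for payload in fetch_live(after_sequence_number=last_seq):
        yield payload
```

Tracking the last archived sequence number is what closes the gap between the two sources: without it, the consumer would either drop or duplicate events at the handover point.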


Source: https://habr.com/ru/post/1690605/

