Using Flume: reading from HTTP and sinking to HDFS via Kafka

I am new to Flume and am thinking of using Flume in the scenario below.

Our system accepts events via HTTP POST, and we need to save one copy of them in Kafka (for further processing) and another copy in HDFS (as a permanent store).

Can we configure the Flume source as HTTP, the channel as Kafka, and the sink as HDFS to meet these requirements? Will this solution work?

1 answer

Regarding your design: you could use Kafka as the Flume channel, but keep in mind what a channel is for in Flume. A channel is the internal buffer that connects a source to a sink, and Flume treats the events in it as in-flight data, not as a published copy for other applications (the sink drains the channel as it delivers the events). So with an HTTP source, a Kafka channel and an HDFS sink you would get the events into HDFS, but on the Kafka side the data would only be acting as transit storage between the HTTP source and the HDFS sink; that is not the durable copy for further processing that you want.

So, instead of using Kafka as the channel, I would use it as a destination receiving its own copy of the data, something like this:

http_source -----> memory_channel -----> HDFS_sink ------> HDFS
            |
            |----> memory_channel -----> Kafka_sink -----> Kafka

{.................Flume agent.....................}       {backend}

Note that the further processing you mention can then be done by ordinary Kafka consumers reading from the topic the Kafka sink writes to; Kafka keeps that copy independently of what Flume does with the HDFS one.
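For illustration, a minimal sketch of what the agent's properties file could look like (assuming Flume 1.7+; the agent and component names, the port, the HDFS path and the Kafka broker/topic below are made up for the example, not taken from the question):

# One agent (a1): one HTTP source fanning out to two channels.
a1.sources = httpSrc
a1.channels = chHdfs chKafka
a1.sinks = hdfsSink kafkaSink

# HTTP source for the POSTed events; listing two channels with the
# default "replicating" selector puts a copy of each event on both.
a1.sources.httpSrc.type = http
a1.sources.httpSrc.bind = 0.0.0.0
a1.sources.httpSrc.port = 8080
a1.sources.httpSrc.channels = chHdfs chKafka

# Two in-memory channels, one per sink.
a1.channels.chHdfs.type = memory
a1.channels.chHdfs.capacity = 10000
a1.channels.chKafka.type = memory
a1.channels.chKafka.capacity = 10000

# HDFS sink: the permanent copy.
a1.sinks.hdfsSink.type = hdfs
a1.sinks.hdfsSink.channel = chHdfs
a1.sinks.hdfsSink.hdfs.path = hdfs://namenode:8020/flume/events/%Y-%m-%d
a1.sinks.hdfsSink.hdfs.fileType = DataStream
a1.sinks.hdfsSink.hdfs.useLocalTimeStamp = true

# Kafka sink: the copy for further processing.
a1.sinks.kafkaSink.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.kafkaSink.channel = chKafka
a1.sinks.kafkaSink.kafka.bootstrap.servers = kafka1:9092
a1.sinks.kafkaSink.kafka.topic = events

With the HTTP source's default JSONHandler, which expects a JSON array of events, you could then test the flow with something like:

curl -X POST -H 'Content-Type: application/json' \
     -d '[{"headers": {}, "body": "test event"}]' \
     http://localhost:8080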


Source: https://habr.com/ru/post/1608121/

