Data ingestion is the process of moving data from a source to destination storage, where it can be used for further analytics. For very large data volumes, data is generally streamed to the destination storage, provided that both the source and destination systems can handle continuous streams of data. Streaming data ingestion comes in two forms: event-based ingestion and ingestion via message queues.
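The message-queue pattern described above can be sketched in Python. This is a minimal illustration, not a real broker integration: the standard-library `queue.Queue` stands in for a message queue, a producer thread plays the role of the source, and a plain list stands in for destination storage. The names `producer`, `consumer`, and `sink` are hypothetical and chosen only for this example.

```python
import queue
import threading

def producer(q, events):
    # Source side: publish each event onto the message queue.
    for event in events:
        q.put(event)
    q.put(None)  # sentinel marking the end of the stream

def consumer(q, sink):
    # Ingestion side: continuously pull events off the queue
    # and append them to destination storage (a list here).
    while True:
        event = q.get()
        if event is None:
            break
        sink.append(event)

q = queue.Queue()
sink = []
t = threading.Thread(target=consumer, args=(q, sink))
t.start()
producer(q, ["click", "view", "purchase"])
t.join()
print(sink)  # ['click', 'view', 'purchase']
```

In a production pipeline the in-memory queue would be replaced by a durable broker (such as Kafka), and the consumer would write to a real store rather than a list, but the decoupling of producer and consumer shown here is the same.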
Flume is a highly available, distributed system for streaming data ingestion. It collects, aggregates, and processes streaming data on the fly, and stores it on disk for reliability. The following diagram shows the Flume architecture:
The preceding diagram shows the components of the Flume architecture, which are described in the following points:
- Flume Sources: These consume events from external sources and pass them...