ELK
ELK (shorthand for Elasticsearch-Logstash-Kibana) is a set of independent tools that work in tandem to collect data from various sources and ship it to Elasticsearch. Kibana is then used to build visualizations on top of the indexed data.
ELK started as a distributed, highly scalable system for aggregating logs, but has since evolved to collect a variety of data.
Collecting data with ELK usually consists of the following phases:
- Shipping the data from the host system to a set of collection agents
- Aggregating the data from different sources at the collection phase and queuing it for resilience
- Processing the aggregated data by taking batches of collected data from the queue and indexing them in Elasticsearch
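The ship → aggregate → batch-index flow above can be sketched in a few lines of Python. This is a minimal illustration, not real ELK code: the events, the in-memory queue, and the `index_batch` function are all hypothetical stand-ins (in a real deployment, Beats ship the events, a broker such as a message queue provides the resilient buffer, and the Elasticsearch bulk API does the indexing).

```python
from collections import deque

# Stand-in for the Elasticsearch index; in reality documents would be
# sent via the bulk API of an Elasticsearch cluster.
indexed = []

def index_batch(batch):
    """Hypothetical stand-in for bulk-indexing a batch of documents."""
    indexed.extend(batch)

# Phase 1: shippers emit events from the host systems.
events = [{"host": "web-1", "msg": f"request {i}"} for i in range(7)]

# Phase 2: aggregate the events into a durable queue for resilience.
queue = deque(events)

# Phase 3: drain the queue in fixed-size batches and index each batch.
BATCH_SIZE = 3
while queue:
    batch = [queue.popleft() for _ in range(min(BATCH_SIZE, len(queue)))]
    index_batch(batch)

print(len(indexed))  # 7 documents indexed, in three batches of 3, 3, 1
```

Batching matters here: indexing one document per request wastes round-trips, while the queue between phases lets the indexer fall behind temporarily without losing events.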
Different systems/libraries are responsible for handling different phases of the complete life cycle.
Let's look at these systems and libraries in a bit more detail.
Beats
Beats are lightweight, single-purpose data-shipper agents. They are installed on the machines where...