Elastic Stack — A bird’s eye view

An overview of the elastic stack for beginners

vikas yadav
DevOps Dudes
4 min read · Aug 30, 2020



The Elastic Stack has been growing in popularity as a platform of choice for various data analytics use cases, including SIEM (security information and event management).

It has multiple components that work together to provide search, analytics and visualization capabilities for your data.

In this article, I will give you a bird's-eye overview of the Elastic Stack in under 5 minutes!

Core Components

The Elastic Stack was previously called the ELK Stack (some people might still use this acronym), where E stands for Elasticsearch, L stands for Logstash, and K stands for Kibana. Each of these core components has a specific task to perform in the stack, so let's have a look at what each one does.

  • Elasticsearch — Elasticsearch is a distributed search and analytics engine that is built on Apache Lucene and written in Java. All interactions with an Elasticsearch cluster happen over its RESTful APIs (see the sketch after this list).
  • Logstash — Logstash is a server-side data processing pipeline that can be used to transform data before it is ingested by Elasticsearch.
  • Kibana — Kibana is the user interface of choice for Elasticsearch. You can administer your Elasticsearch cluster and create visualizations and dashboards using Kibana.
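
Because those interactions are all plain HTTP, you can try them from any language. Here is a minimal sketch, assuming a local, unsecured cluster on localhost:9200 and Python's requests library; the "articles" index and its fields are made up purely for illustration.

```python
import requests

ES_URL = "http://localhost:9200"   # assumed local, unsecured cluster

# Index a document into a hypothetical "articles" index.
doc = {"title": "Elastic Stack overview", "views": 42}
requests.post(f"{ES_URL}/articles/_doc", json=doc)

# Refresh so the new document is searchable immediately, then run a query-string search.
requests.post(f"{ES_URL}/articles/_refresh")
resp = requests.get(f"{ES_URL}/articles/_search", params={"q": "title:overview"})
print(resp.json()["hits"]["total"])
```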

For a production cluster, you will need a few more features than those provided by the core components, so let's have a look.

X-Pack

X-Pack is an Elastic Stack extension that provides capabilities such as security, alerting, and monitoring. One basic use case for X-Pack is to secure your Elasticsearch cluster using basic username/password authentication.

In newer versions of Elasticsearch, X-Pack is installed by default when you install Elasticsearch, so you no longer need to worry about installing it separately; however, you still have to enable and configure X-Pack for your cluster.
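
Once security is enabled, every request must carry credentials. As a minimal sketch, assuming a cluster on localhost:9200 with basic authentication enabled and a hypothetical elastic/changeme user, checking who you are authenticated as might look like this:

```python
import requests

# Hypothetical endpoint and credentials for a cluster secured with X-Pack basic auth;
# replace with your own host, user, and password.
ES_URL = "https://localhost:9200"
AUTH = ("elastic", "changeme")

# Without valid credentials, a secured cluster answers 401 Unauthorized.
resp = requests.get(f"{ES_URL}/_security/_authenticate", auth=AUTH, verify=False)
print(resp.status_code, resp.json())
```

(verify=False only skips certificate checks for a self-signed lab setup; don't do that in production.)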

Elastic Beats

The ELK Stack became the Elastic Stack with the addition of Beats.

Beats are lightweight agents, or shippers, that are installed on the target machine, from where they can forward or ship data to either Elasticsearch or Logstash (if you need to parse and enrich your data).

Now, a picture says a thousand words, so let's have a look at a sample diagram.

Here, we have Metricbeat installed on a target Apache server. Once installed, this Beats instance can be configured to collect raw metric data from the server and ship it either to Logstash (if you want to parse and enrich the data) or directly to Elasticsearch. Once the data is stored in Elasticsearch, you can use Kibana to create visualizations and dashboards on top of it, such as the one below.

Sample Kibana dashboard for metricbeat
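
To give a rough idea of what "stored inside Elasticsearch" looks like from the query side, here is a sketch that searches Metricbeat's default metricbeat-* daily indices for the most recent CPU documents. The endpoint and the exact field names are assumptions about a default Metricbeat setup; adjust them for your own installation.

```python
import requests

ES_URL = "http://localhost:9200"  # assumed local, unsecured cluster

# Ask for the five most recent documents produced by Metricbeat's "cpu" metricset.
query = {
    "size": 5,
    "sort": [{"@timestamp": {"order": "desc"}}],
    "query": {"term": {"metricset.name": "cpu"}},
}

resp = requests.post(f"{ES_URL}/metricbeat-*/_search", json=query)
for hit in resp.json()["hits"]["hits"]:
    source = hit["_source"]
    print(source["@timestamp"], source.get("system", {}).get("cpu"))
```

Kibana's Metricbeat dashboards are essentially running queries like this one for you and charting the results.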

There are multiple Beats available at the time of writing, so let's have a look at each one:

  • Filebeat is used to forward log data, such as system logs, to your Elasticsearch cluster. You specify a file or a path to monitor, for example the /var/log directory on a Linux system, and Filebeat collects logs from that location and then ships them to either Elasticsearch or Logstash (a toy sketch of this idea follows this list).
  • Auditbeat is used to audit user and process activity on your target systems. For example, you can use Auditbeat to watch changes made to a file or to collect events from the Linux auditd framework, and then feed them to either Elasticsearch or Logstash.
  • Metricbeat is a lightweight agent that can be installed on target servers to periodically collect metric data. This could be operating system metrics such as CPU or memory, or data related to services running on the server. It can also be used to monitor other Beats and the Elastic Stack itself.
  • Heartbeat is used to do periodic health checks on the status of services and can be used for uptime monitoring; for example, it can check whether your website is up and running.
  • Packetbeat is used to get visibility into the network traffic between your servers and is essentially a real-time network packet analyzer. It can be used to analyze the performance of your applications.
  • Journalbeat is used to monitor journald logs. It allows for complex filtering of logs, which reduces the amount of log data that is published.
  • Winlogbeat is a Beats agent installed on Windows servers to ship Windows event logs. It can be installed as a Windows service and can publish logs to either Elasticsearch or Logstash.
  • Functionbeat is used in function-as-a-service environments; it collects data from your serverless cloud environments and ships it to the Elastic Stack, allowing you to monitor serverless environments.
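
To make the "lightweight shipper" idea concrete, here is a toy Python sketch of what a shipper like Filebeat does conceptually: follow a log file and push each new line into Elasticsearch. This is only an illustration under assumed paths and index names; the real Beats are purpose-built Go agents with batching, back-pressure handling, TLS, and much more.

```python
import time
import requests

ES_URL = "http://localhost:9200"   # assumed local, unsecured cluster
INDEX = "toy-logs"                 # hypothetical index name
LOG_PATH = "/var/log/syslog"       # log file to follow

# Follow the log file (like `tail -f`) and index each new line as a small JSON document.
with open(LOG_PATH) as f:
    f.seek(0, 2)                   # start at the end of the file
    while True:
        line = f.readline()
        if not line:
            time.sleep(1)          # no new data yet; wait and try again
            continue
        doc = {"message": line.rstrip(), "log_path": LOG_PATH}
        requests.post(f"{ES_URL}/{INDEX}/_doc", json=doc)
```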

So, there you have it: a bird's-eye view of the Elastic Stack in under 5 minutes.

Please feel free to check out part 2 of this series, where we'll install a single-node Elasticsearch cluster on Linux.

If you want to do some hands-on practice with Elasticsearch, here is the link to my YouTube playlist, where I show you how to set up a lab with two Apache servers feeding data to a single-node Elasticsearch cluster deployed on Google Cloud Platform (GCP).


vikas yadav
DevOps Dudes

IT engineer with 14 years of experience, most recently in solution design, big data, and log analytics.