ELK Stack Post — 2

Deploying ELK Stack for Apache Logs Analysis

Arun Kumar Singh
Published in TechBull
3 min read · Apr 24, 2020


This is the next set of tasks in the ELK series. Post-1 demonstrated deploying the ELK Stack for Apache logs analysis using Filebeat, Elasticsearch and Kibana.

In this post we will follow a different strategy: we will use Logstash to format the data.

Filebeat → Logstash → Elasticsearch ← Kibana

You can use Filebeat modules with Logstash directly, but you need to do some extra setup. The simplest approach is to set up and use the ingest pipelines provided by Filebeat. If the ingest pipelines don’t meet your requirements, you can create Logstash configurations to use instead of the ingest pipelines. The Logstash pipeline configuration in this example shows how to ship and parse access and error logs collected by the Apache Filebeat module.

Steps that need to be done in this case:

1. Update the Filebeat config file so that it routes logs to Logstash.

Once done, run the Filebeat agent.
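
A minimal sketch of the relevant filebeat.yml settings, assuming Logstash listens on the default Beats port 5044 on the same host; make sure the output.elasticsearch section is commented out, since Filebeat allows only one output to be enabled:

# filebeat.yml
filebeat.config.modules:
  # load module configs from modules.d, including the Apache module
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

# ship events to Logstash instead of Elasticsearch
output.logstash:
  hosts: ["localhost:5044"]

The Apache module itself can be enabled with filebeat modules enable apache (the module was named apache2 in older Filebeat releases).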

2. Update Logstash to route the logs to Elasticsearch
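
A minimal pipeline.conf sketch for this step, assuming Elasticsearch runs locally on port 9200; the index naming follows the pattern Elastic recommends for Beats data shipped through Logstash:

# config/pipeline.conf
input {
  beats {
    # listen for events from Filebeat on the default Beats port
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    # leave the index template management to Filebeat
    manage_template => false
    # e.g. filebeat-7.6.2-2020.04.24
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}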

# running logstash
logstash -f config/pipeline.conf

# The --config.reload.automatic option enables automatic config reloading
# so that you don't have to stop and restart Logstash every time you
# modify the configuration file.
logstash -f config/pipeline.conf --config.reload.automatic

# Please note: the pipelines.yml file is used for running multiple
# pipelines in a single Logstash instance. In our experiment we are
# running one pipeline, so we don't need to configure it.

The grok filter plugin is one of several plugins that are available by default in Logstash. It enables you to parse unstructured log data into something structured and queryable. Because the grok filter plugin looks for patterns in the incoming log data, configuring the plugin requires you to decide how to identify the patterns that are of interest to your use case. More details: https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html#configuring-grok-filter
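
For example, if you choose to parse raw Apache access log lines in Logstash yourself rather than relying on the ingest pipelines, a grok filter using the built-in COMBINEDAPACHELOG pattern is a minimal starting point (add this filter block to pipeline.conf between the input and output sections):

filter {
  grok {
    # parse Apache combined-format access logs into named fields
    # such as clientip, verb, request, response and bytes
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}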

If you need to force Filebeat to read the log file from scratch, go to the terminal window where Filebeat is running and press Ctrl+C to shut down Filebeat. Then delete the Filebeat registry folder.
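
For example, with an archive (tar.gz/zip) install the registry lives in the data folder under the Filebeat home directory; package installs typically keep it under /var/lib/filebeat instead:

# stop Filebeat first (Ctrl+C), then remove the registry
rm -rf data/registry

# start Filebeat again; it will read the logs from the beginning
./filebeat -e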


We are good to validate the data in Kibana now. To access this data you need to create an index pattern (for example, filebeat-*, which matches the index naming used in the Logstash output above).

When the index pattern is ready, go to the Discover section of Kibana and that’s it. You can see the index pattern listed there, and you can perform operations on the data.
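
For instance, to narrow Discover down to failed GET requests you could type a query like the one below into the search bar (the field names are an assumption that depends on how the logs were parsed; the COMBINEDAPACHELOG grok pattern shown earlier produces fields such as clientip, verb, request and response):

response: 404 and verb: "GET"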

Filtering data

Happy Learning!
