Centralize Your Logs using Fluentd, Kibana and ElasticSearch

Hiten Pratap Singh
Published in TechCret Software
Jun 30, 2020 · 4 min read

Today, most applications are developed using a microservice architecture, so it becomes very hard to manage and analyse the logs coming from all these different services. It also becomes necessary to have some kind of central place to store these logs, since logs aren't generated by applications alone: we also have to deal with system logs, server logs, database logs and so on.

Fluentd

So, to deal with this issue, we need to build some kind of centralised logging system that collects the logs from these different sources; once they are in one place, analysing them becomes pretty easy.

Centralized Log Management is an approach where the logs generated by the sub-systems within an environment are collected, parsed and stored in a central repository in an organized fashion, thereby reducing the overall effort needed to identify an issue. There are various open-source as well as paid tools available in the market to accomplish this, for example Splunk, LogRhythm, LogPacker, Logstash and Fluentd.

I have already published a blog on how to achieve the same objective using Logstash, and it can be read here:

So, let's start with how to achieve this objective using Fluentd.

Logstash is part of the popular ELK stack provided by Elastic, while Fluentd is a project of the Cloud Native Computing Foundation (CNCF). Both are open-source data processing pipelines that can be used for collecting, parsing and storing logs from different sources. These logs can either be indexed in Elasticsearch or pushed to some other storage, and there are a number of plugins available to do things more efficiently.

1. We must have Elasticsearch and Kibana installed on our system, and the guide to install them is here:

2. We have to install Fluentd (the td-agent package) on our system; it doesn't matter much which OS is being used, as it runs on most of them.

3. Next, we have to edit the configuration file for Fluentd. The following directives are used to configure it (a minimal skeleton follows this list):

  1. source directives determine the input sources.
  2. match directives determine the output destinations.
  3. filter directives determine the event processing pipelines. (optional)
  4. system directives set system-wide configuration. (optional)
  5. label directives group output and filter directives for internal routing. (optional)
  6. @include directives include other files. (optional)
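
Put together, a td-agent.conf is just a sequence of these blocks. A minimal skeleton might look like the one below; the plugin names and the myapp tag are only placeholders, not part of any particular setup:

```
<source>
  @type tail                    # input plugin: where events come from
  # ... input options ...
</source>

<filter myapp.**>
  @type record_transformer      # optional: modify/enrich events in flight
  # ... filter options ...
</filter>

<match myapp.**>
  @type elasticsearch           # output plugin: where events are sent
  # ... output options ...
</match>
```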

There's a configuration file named td-agent.conf, so we will edit this file to configure Fluentd as per our needs. It can be found inside the /etc/td-agent/ folder.

I used a sample Spring Boot application to generate logs, so the configuration for that would look something like the one below:
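
Here is a sketch of such a configuration. The log path, tag and index prefix are only assumptions for a sample app, so adjust them to your own setup; the elasticsearch output plugin is bundled with recent td-agent packages.

```
<source>
  @type tail
  path /home/ubuntu/spring-boot-app/logs/application.log   # assumed path to the app's log file
  pos_file /var/log/td-agent/spring-boot-app.log.pos       # remembers how far the file has been read
  tag spring.boot.app
  <parse>
    @type none              # ship each line as-is; see the parsing example further below
  </parse>
</source>

<match spring.boot.**>
  @type elasticsearch
  host localhost
  port 9200
  logstash_format true            # creates daily indices like spring-boot-YYYY.MM.DD
  logstash_prefix spring-boot     # this is the index name you will look for in Kibana
</match>
```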

After this step, make sure that Kibana and Elasticsearch are running, then restart the td-agent service. Once it has restarted, you can check the td-agent logs in the file /var/log/td-agent/td-agent.log, where you should see something like this:

Fluentd Logs

After all these steps, it's time to open up the Kibana dashboard, where you can see the index with the name mentioned in the td-agent config file. Once you have created an index pattern for it in Kibana, you will start to see your application logs there immediately.

Kibana Dashboard

Any kind of source can be used for logs with Fluentd. We just need to make sure the format written in the td-agent config file matches the syntax of our logs.
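
For instance, a hypothetical application that writes lines such as `2020-06-30 10:15:00 INFO Started successfully` could be handled with the regexp parser; the path and pattern below are only an illustration:

```
<source>
  @type tail
  path /var/log/myapp/app.log                 # hypothetical log file
  pos_file /var/log/td-agent/myapp.log.pos
  tag myapp.custom
  <parse>
    @type regexp
    expression /^(?<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) +(?<level>\w+) +(?<message>.*)$/
    time_format %Y-%m-%d %H:%M:%S
  </parse>
</source>
```

Each named capture group becomes a field on the event, so level and message end up as separate, searchable fields once the records reach Elasticsearch.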

That's all for this tutorial. Next, I'll write a comparison between Logstash and Fluentd, as well as some more blogs about how to ship logs from inside Docker containers. So, stay tuned for more resourceful blogs.
