Stream ECS Container Logs to Elasticsearch using Fluentd (EFK)

KTree
Apr 10, 2020

Stream all your container logs with EFK (Elasticsearch + Fluentd + Kibana)

In this article, we will see how to configure Fluentd to push Docker container logs to Elasticsearch. As we proceed, we will implement a complete logging system for Docker containers.

Contents:

  1. Introduction to Fluentd
  2. Introduction to Elasticsearch and Kibana
  3. Docker logging drivers
  4. Logs transfer flow
  5. Prerequisites
  6. Deployment Steps

Introduction to Fluentd

Fluentd is an open-source data collector originally developed as a big data tool. It can collect data from many sources and provides an easy way to access and analyze it. Because of these data collection capabilities, major service providers such as Amazon Web Services and Google Cloud Platform have added support for it. The official site describes Fluentd as a unified logging layer, since it can collect logs from multiple sources.

Fluentd is written primarily in Ruby, with performance-sensitive parts written in C. To overcome the difficulties of installing and managing Ruby, the original creators, Treasure Data, Inc., provide a stable community distribution of Fluentd called td-agent. The differences between Fluentd and td-agent can be found here.

Introduction to Elasticsearch and Kibana

Elasticsearch is a service capable of storing, searching, and analyzing large amounts of data. It can act as a database, since data is stored in the form of indices, documents, and fields. It can also act as a search engine, searching and analyzing data using filters and patterns. It is an open-source tool built on Apache Lucene, and every feature of Elasticsearch is available as a REST API.
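For example, a single HTTP request is enough to interact with a cluster. The commands below are a quick illustration, assuming Elasticsearch is reachable on its default port 9200 (replace <ES_NODE_IP> with your node's address); the first returns the cluster name and version, the second lists its indices:

# curl "http://<ES_NODE_IP>:9200/"
# curl "http://<ES_NODE_IP>:9200/_cat/indices?v"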

Kibana is an open-source data visualization plugin for Elasticsearch. It provides a web UI with easy-to-use filters and dashboards to access the data stored in Elasticsearch. The dashboards and filters are highly customizable and can be built to suit our needs.

Docker logging drivers

Since all applications in Docker containers run in an isolated environment, we need a separate mechanism to access their logs. Docker therefore supports multiple logging mechanisms, called logging drivers, to collect and handle logs from containers. We can set a default driver for the Docker daemon, and Docker also provides a way to specify a log driver per container. In addition to the logging drivers included with Docker, you can implement and use logging driver plugins.
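For example, to make fluentd the default logging driver for every container on a host, it can be set in the Docker daemon configuration. The snippet below is a sketch; the fluentd-address value assumes td-agent runs on the same host on its default port:

/etc/docker/daemon.json:

{
  "log-driver": "fluentd",
  "log-opts": {
    "fluentd-address": "localhost:24224"
  }
}

# sudo systemctl restart docker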

In this article, we are going to use Fluentd as the logging driver for all containers. The log transfer flow below presents an overview of how the final deployment works.

Logs transfer flow:

(Diagram: Docker containers → fluentd log driver → td-agent on port 24224 → Elasticsearch → Kibana)

Prerequisites

  • Elasticsearch cluster or instance with Kibana installed.
  • Docker server with running Docker containers or ECS cluster containers
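Before starting, it is worth confirming that the Docker host can reach Elasticsearch. A quick connectivity check (assuming the default port 9200; <ES_NODE_IP> is a placeholder for your node's address):

# curl "http://<ES_NODE_IP>:9200/_cluster/health?pretty"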

Deployment Steps

Below are all the steps needed to implement the logging driver and start pushing logs to Elasticsearch.

Step: 1 — Installing Fluentd on the Docker instance

Fluentd is available in several package formats, such as rpm, deb, exe, and msi. In our case, we are using Amazon Linux 2 for testing, so the command below installs td-agent 3.

# curl -L https://toolbelt.treasuredata.com/sh/install-amazon2-td-agent3.sh | sh
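Once the script finishes, you can confirm the installation and that the service is available:

# td-agent --version
# sudo systemctl status td-agent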

Step: 2 — Configure Fluentd to send logs to Elasticsearch

The Fluentd configuration file is located at /etc/td-agent/td-agent.conf. Fluentd can define multiple sources and destinations to collect and send data. Each source is defined in <source> … </source> tags and each destination in <match> … </match> tags. Multiple source and destination pairs can be defined in a single configuration file.

By default, any TCP/UNIX socket can be used as a source of logs, and files can be used as sources as well. Fluentd's flexibility can be extended by installing Fluentd plugins, available as Ruby gems, which provide a large number of additional source and destination types.
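td-agent 3 already bundles the Elasticsearch output plugin used below; on a plain Fluentd install, or for any other plugin, you would add it with the bundled gem command, for example:

# sudo td-agent-gem install fluent-plugin-elasticsearch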

The configuration below makes the td-agent service listen for logs on TCP port 24224 (on all interfaces) and send Docker container logs to Elasticsearch. Note that Fluentd tries <match> blocks in order and uses the first one whose pattern matches an event's tag, so the specific docker.** match must come before the catch-all stdout match.

<source>
  # Listen for events forwarded by the Docker fluentd log driver
  @type forward
  port 24224
  bind 0.0.0.0
</source>

# Fluentd uses the first <match> whose pattern matches an event's tag,
# so the docker.** match must come before the catch-all below.
<match docker.**>
  @type elasticsearch
  host <ES_NODE_IP/URL>
  port 9200
  logstash_format true
  flush_interval 3s
</match>

# Print any remaining events to stdout (useful for debugging)
<match *.*>
  @type stdout
</match>
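Before restarting, the configuration can optionally be validated with Fluentd's dry-run mode, which parses the config without starting the service (a sanity check, assuming the default td-agent config path):

# sudo td-agent --dry-run -c /etc/td-agent/td-agent.conf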

Once you save the config file, restart the td-agent service.

# sudo systemctl restart td-agent
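If the restart succeeds, td-agent should now be listening on port 24224. Two quick checks (ss ships with Amazon Linux 2; the log path is td-agent's default):

# sudo ss -tlnp | grep 24224
# sudo tail -n 20 /var/log/td-agent/td-agent.log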

Step: 3 — Run the Docker container with fluentd specified as the log driver, as shown in the command below.

# docker run -d --name container1 --log-driver=fluentd --log-opt tag="docker.{{.ID}}" ubuntu /bin/echo 'Hello world'
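For containers launched through ECS rather than docker run, the equivalent setting lives in the logConfiguration block of the task definition's container definition. A sketch (the fluentd-address and tag values are examples; <DOCKER_HOST_IP> must point at the instance running td-agent):

"logConfiguration": {
  "logDriver": "fluentd",
  "options": {
    "fluentd-address": "<DOCKER_HOST_IP>:24224",
    "tag": "docker.{{.Name}}"
  }
}

On the EC2 launch type, the fluentd driver must also be listed in ECS_AVAILABLE_LOGGING_DRIVERS in /etc/ecs/ecs.config on the container instance before the ECS agent will accept such a task.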

Step: 4 — Now, to check the logs, open the Kibana dashboard and use its filters to explore them.
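Because logstash_format true is set, the logs land in daily logstash-YYYY.MM.DD indices, so an index pattern such as logstash-* in Kibana will pick them up. You can also confirm the data reached Elasticsearch directly, for example by searching for the 'Hello world' message emitted above:

# curl "http://<ES_NODE_IP>:9200/_cat/indices/logstash-*?v"
# curl "http://<ES_NODE_IP>:9200/logstash-*/_search?q=Hello&pretty"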
