Visualize Traefik logs in Kibana

It’s really easy! (when you know how)

Xabi
The Sysadmin
3 min read · Jul 6, 2018


You’re probably here because you’ve decided to follow the trend of centralized logging. So let’s go straight to the point: do you use Traefik as a reverse proxy? In this guide we will configure the ELK stack to collect our Traefik logs!

First of all, a review of concepts:

  • Traefik: a “cloud native” reverse proxy / load balancer
  • Filebeat: a lightweight shipper for logs
  • Logstash (optional): a server-side data processing pipeline that ingests data from a multitude of sources
  • Elasticsearch: a search and analytics engine, where we store our logs
  • Kibana: a visualization platform for the Elasticsearch data

I’m going to assume you already have your Elastic stack installed and ready to use, so you only need a few configuration files to achieve our goal! (Am I wrong? Then maybe you should follow this guide.)

Configuring the Traefik log

Traefik v1.5 added the option to output its logs in JSON format.

That’s great for us, as it makes the log processing much easier!

You only need to add an accessLog section to your traefik.toml file (note that Traefik v1.x uses TOML, not YAML, for its static configuration).
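
A minimal sketch of that section, assuming you want the access log written to a file (the path below is an assumption, adapt it to your setup):

```toml
# traefik.toml (static configuration; Traefik v1.x uses TOML)
[accessLog]
  # File that Filebeat will read from; the path is an assumption
  filePath = "/var/log/traefik/access.log"
  # JSON output, so the fields arrive already structured
  format = "json"
```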

Configuring Filebeat

Now we need to send the logs to Logstash. The easiest way to do that is using Filebeat.

After following the installation instructions, we need to configure it. Most of the examples out there explain how to send a simple plain-text log file, but we have a JSON file! Shouldn’t it be easier?

Once again, it’s not really difficult when you know how…
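
Here is a sketch of a filebeat.yml for Filebeat 6.x; the log path, tags and Logstash host are assumptions you should adapt:

```yaml
# filebeat.yml (Filebeat 6.x); paths, tags and hosts are assumptions
filebeat.inputs:                 # called "filebeat.prospectors" before 6.3
- type: log
  enabled: true
  paths:
    - /var/log/traefik/access.log
  json.keys_under_root: true     # put the decoded JSON fields at the root of the event
  json.add_error_key: true       # add an error key if a line cannot be decoded
  tags: ["traefik", "json"]      # your own tags, useful when processing the logs later

output.logstash:
  hosts: ["logstash.example.com:5044"]
  # If Logstash has SSL enabled, copy the certificate and point to it here:
  # ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash.crt"]
```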

The configuration above will send the processed JSON data to Logstash, so you don’t even need to apply any filter.

You may define your own tags (they will help you process the logs later) and edit your Logstash host address. Finally, if you have SSL enabled in Logstash, don’t forget to copy the certificate and specify its path.

Note: in Filebeat 6.3 they renamed “prospectors” to “inputs”. More info

Configuring Logstash to ingest the data (optional)

We’re almost done. Now we need to configure Logstash to ingest our Filebeat data. If you don’t have any Beats component enabled yet, you can use the following configuration file as a template:
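
A minimal sketch, assuming the stock Beats input on port 5044 and a local Elasticsearch; hosts, paths and the index pattern are assumptions:

```conf
# /etc/logstash/conf.d/beats.conf (a minimal Beats pipeline)
input {
  beats {
    port => 5044
    # With SSL enabled, uncomment and point to your certificate and key:
    # ssl => true
    # ssl_certificate => "/etc/pki/tls/certs/logstash.crt"
    # ssl_key => "/etc/pki/tls/private/logstash.key"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # One index per Beat and per day, e.g. "filebeat-6.3.0-2018.07.06"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
```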

As you can see, it stores our Beats data (logs, metrics, etc.) in the Elasticsearch service.

Of course, you can skip this step and send your JSON directly from Filebeat to Elasticsearch.
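
In that case, the Filebeat output section would point at Elasticsearch instead of Logstash (the host is an assumption):

```yaml
# filebeat.yml, alternative output: bypass Logstash and write straight to Elasticsearch
output.elasticsearch:
  hosts: ["http://localhost:9200"]
```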

Kibana

Let’s check that we get all the processed fields from Traefik in Kibana!

Great! So now we can finally draw a pie chart!!! :D

Data scientists’ favourite chart
Traefik’s “gopher” loves potato and cheese
