Collect logs with the ELK stack

How to set up logging infrastructure

Dmytro Misik
Feb 2, 2023 · 5 min read

Logging is a great instrument for ‘debugging’ applications in production. When an issue comes up, good application-level logs let you find the root cause quickly. In my opinion, the ELK stack is one of the best solutions for collecting logs.

ELK is an acronym that stands for Elasticsearch, Logstash, and Kibana. Elasticsearch is a search and analytics engine, Logstash is a data processing pipeline, and Kibana is a data visualization tool. One letter is missing from the acronym: ‘B’, which stands for Beats, a family of single-purpose data shippers.

In this post, I want to show how to create the ELK infrastructure with Docker and set it up to collect logs from a Go application.

Application

As I said before, I’m going to create a simple console Go application that writes logs. For logging I’m going to use zap, an open-source library for structured logging in Go. To format the logs, I’m going to use ecszap, a library that provides an encoder for producing Elastic Common Schema (ECS) conformant log entries.

The application will be very simple:

package main

import (
    "os"
    "time"

    "go.elastic.co/ecszap"
    "go.uber.org/zap"
)

func main() {
    // Build an ECS-compatible zap core that writes JSON logs to stdout.
    encoderConfig := ecszap.NewDefaultEncoderConfig()
    core := ecszap.NewCore(encoderConfig, os.Stdout, zap.DebugLevel)

    // Attach static fields identifying the application and environment.
    logger := zap.New(core, zap.AddCaller()).
        With(zap.String("app", "go-elk")).
        With(zap.String("environment", "local"))

    var i int
    for {
        // Emit the same message every five seconds with an incrementing counter.
        logger.Info("application log",
            zap.Int("times", i),
        )

        i++
        time.Sleep(5 * time.Second)
    }
}
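
If you are reproducing this from scratch, the two logging dependencies can be fetched with the standard Go tooling (the module name go-elk simply matches the path that shows up in the log output below):

go mod init go-elk
go get go.uber.org/zap go.elastic.co/ecszap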

Every five seconds it logs the same message to the console with a single field, times. If you run the application, you will see the following:

{"log.level":"info","@timestamp":"2023-02-01T17:38:33.664+0200","log.origin":{"file.name":"go-elk/main.go","file.line":18},"message":"application log","app":"go-elk","environment":"local","times":0,"ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2023-02-01T17:38:38.668+0200","log.origin":{"file.name":"go-elk/main.go","file.line":18},"message":"application log","app":"go-elk","environment":"local","times":1,"ecs.version":"1.6.0"}
...

You can find a lot of information here: source file, code line, timestamp, log level, etc. Let’s create a Dockerfile to run this application:

FROM golang:1.19

WORKDIR /app

COPY go.mod go.sum ./
RUN go mod download

COPY . .
RUN go build -o /out/app ./main.go

CMD ["/out/app"]

I want to start several services with Docker. For this purpose, I’m going to use Docker Compose. So let’s define the current application in the docker-compose.yaml file:

  service:
    build:
      context: .
      dockerfile: Dockerfile
Let’s move forward to the ELK stack.

Elasticsearch

Elasticsearch will be used to store logs and query them.

To start Elasticsearch in Docker, all you need to do is add it to the docker-compose.yaml:

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.6.1
    volumes:
      - elasticsearch:/usr/share/elasticsearch/data
    environment:
      discovery.type: single-node
      xpack.security.enabled: false
    ports:
      - 9200:9200
      - 9300:9300
    restart: unless-stopped

volumes:
  elasticsearch:

It will be a single-node cluster with security disabled (for production, use more than one node and enable security).
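
Once the containers are running (the docker compose up command appears later in the post), a quick way to check that Elasticsearch is reachable from the host is a plain HTTP request; no credentials are needed because security is disabled. It should answer with a small JSON document describing the cluster:

curl http://localhost:9200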

Kibana

Kibana will be used to visualize logs.

Again, the same steps:

  kibana:
    image: docker.elastic.co/kibana/kibana:8.6.1
    environment:
      ELASTICSEARCH_HOSTS: '["http://elasticsearch:9200"]'
    ports:
      - 5601:5601
    depends_on:
      - elasticsearch
    restart: unless-stopped

Kibana depends on Elasticsearch, which is why depends_on points at it.
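
Kibana also exposes a standard status endpoint you can poke once the container is up; this is a stock Kibana API, not something specific to this setup:

curl http://localhost:5601/api/status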

Logstash

Logstash will be used to receive logs from Beats, transform them, and send them to Elasticsearch.

Here you need to provide a configuration file, logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    data_stream => "true"
  }
}

Logstash, in this case, ingests events from Beats, uses the built-in json filter to parse the message property as JSON, and sends the transformed events to Elasticsearch. Because data_stream is enabled, the output is written to a data stream (logs-generic-default by default) rather than a plain index.
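
If you want to sanity-check the pipeline syntax before wiring everything together, Logstash has a --config.test_and_exit flag; a one-off container run along these lines should work (the path assumes the ./logstash/logstash.conf location used below):

docker run --rm \
  -v "$(pwd)/logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf:ro" \
  docker.elastic.co/logstash/logstash:8.6.1 \
  logstash -f /usr/share/logstash/pipeline/logstash.conf --config.test_and_exit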

Then this file should be mounted into a Docker container. It can be done with docker-compose.yaml:

  logstash:
    image: docker.elastic.co/logstash/logstash:8.6.1
    volumes:
      - ./logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf:ro
    ports:
      - 5044:5044
    depends_on:
      - elasticsearch
    restart: unless-stopped

Filebeat

Filebeat will be used to collect logs from the application.

Filebeat can collect logs from different sources. In my case, I’m going to use Docker autodiscovery to collect logs from Docker containers. As with Logstash, you need to provide a configuration file, filebeat.yml:

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true
      hints.default_config.enabled: false

output.logstash:
  hosts: ["logstash:5044"]

Filebeat is going to use hints-based autodiscovery. It will only look at Docker containers carrying the label co.elastic.logs/enabled: true, so to collect logs from the service container you need to add this label (see the snippet below). Filebeat will send the collected logs to Logstash.
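
The application service definition from earlier, extended with that label, could look like this (only the labels section is new):

  service:
    build:
      context: .
      dockerfile: Dockerfile
    labels:
      co.elastic.logs/enabled: "true"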

The very last step is to add Filebeat to the docker-compose.yaml:

  filebeat:
    image: docker.elastic.co/beats/filebeat:8.6.1
    user: root
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
      - /var/lib/docker/containers:/var/lib/docker/containers:ro
      - /var/run/docker.sock:/var/run/docker.sock:ro
    depends_on:
      - logstash
    command: filebeat -e -strict.perms=false
    restart: unless-stopped

Source code

You can find all of the code in the following repository:

Testing

To run the application and the ELK services, execute the following command in the root directory:

docker compose up -d
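
To check that the shippers started cleanly, you can tail their output with a standard Docker Compose command:

docker compose logs -f filebeat logstash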

In a few minutes (or seconds, depending on your machine), you can open Kibana at http://localhost:5601. In Dev Tools you can find the data stream created by Logstash.

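For example, the following Dev Tools console requests list the data streams and fetch a single stored document; logs-generic-default is the default data stream name Logstash uses when data_stream is enabled:

GET _data_stream

GET logs*/_search?size=1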

You can now create a data view for the logs. Go to Discover -> Create data view -> choose a name -> enter logs* as the index pattern -> Save data view to Kibana.


Now you will see a nicely formatted table with your logs.


If you are familiar with Kibana, you can toggle table columns, filter the data, and use the rest of the tooling to analyze the logs.
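
For example, typing a KQL query like this into the Discover search bar narrows the view down to this application’s entries (the field names come from the Go application shown earlier):

app : "go-elk" and log.level : "info" and times >= 10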

Conclusion

As you can see, it’s very easy to send your application logs to the ELK stack, which will transform, store, and index them. With Elasticsearch you can query the logs by almost any criteria: log level, timestamp, message pattern, and so on. With Kibana you can visualize the data and analyze it to find issues.
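
As an illustration, a search like the following (runnable from Dev Tools, or with curl as here) pulls back the ten most recent info-level entries; the field names are the ECS fields visible in the application output earlier:

curl -H 'Content-Type: application/json' 'http://localhost:9200/logs*/_search?pretty' -d '
{
  "query": { "match": { "log.level": "info" } },
  "sort": [ { "@timestamp": "desc" } ],
  "size": 10
}'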
