How to set up ELK for Rails log management using Docker and Docker Compose

by Dmitry Tsepelev

AnjLab
7 min read · Aug 31, 2016

Logging is an essential part of any setup because logs are extremely useful when it comes to troubleshooting problems with your production app. When you move your apps to Docker-managed environments you'll realise that some of the approaches you were using no longer work: your logs now stay inside the container and you need to find a way to make them persistent. The easiest way is to write your logs to a volume, but what if you have many instances of the app? In that case you need centralised storage for your logs.

After implementing a Docker-based deployment process for one of our apps we decided to store our logs in one place using the ELK stack. ELK is a centralised log server and log management web interface which includes Elasticsearch (a storage and search engine), Logstash (a tool for log processing), and Kibana (a UI for Elasticsearch).

Let's take this article as a base for our setup, but use Docker Compose file format version 2 and set up our logs to contain custom fields. There's also the popular Docker image sebp/elk, but it does not support the Graylog Extended Log Format (GELF) out of the box.

GELF is a log format that avoids the shortcomings of classic plain syslog, such as the 1024-byte length limit and the lack of data types in structured syslog. This is an example GELF message payload:

{
  "version": "1.1",
  "host": "example.org",
  "short_message": "A short message",
  "full_message": "Backtrace here\n\nmore stuff",
  "timestamp": 1385053862.3072,
  "level": 1,
  "_user_id": 9001,
  "_some_info": "foo",
  "_some_env_var": "bar"
}
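
Docker ships a GELF logging driver that can forward a container's stdout/stderr as messages in this format. As a quick illustration (assuming a GELF listener, such as the Logstash we'll set up below, is reachable at udp://localhost:12201), a one-off container can log straight to it:

$ docker run --rm --log-driver gelf --log-opt gelf-address=udp://localhost:12201 alpine echo "hello gelf"

Later in this article we'll enable the same driver through Docker Compose instead of passing flags by hand.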

You will need a configured Rails environment, Docker and Docker Compose on your machine to get started.

Putting your Rails app into a Docker container

To get the ball rolling we need a working Rails app. You can use any existing app you have or create a new one:

$ rails new -d postgresql app

In order to support the Docker environment you should change the development settings in app/config/database.yml (don't forget about the other environments if you decide to host your app elsewhere!):

development: &default
  adapter: postgresql
  encoding: unicode
  database: postgres
  username: postgres
  host: db

We also need to set up a JS runtime for our app. Just find the following line in your Gemfile and uncomment it:

gem 'therubyracer', platforms: :ruby

Add a Dockerfile to the app folder with the following content:

FROM ruby:2.3

ENV RAILS_ENV development

WORKDIR /app
ADD Gemfile Gemfile.lock /app/

RUN bundle install -j5 --retry 10

ADD . /app

Create a directory called db and add a Dockerfile for the postgres container:

FROM postgres

ENV POSTGRES_USER 'postgres'
ENV POSTGRES_DB 'app_development'

When you set these environment variables, the postgres image will create the database for you.

Create a file called docker-compose.yml in the project root directory. Please note that we are using the second version of the config file format:

version: '2'

services:
  web:
    build:
      context: ./app
      dockerfile: Dockerfile
    environment:
      RAILS_ENV: development
    ports:
      - '3000:3000'
    command: rails s -b 0.0.0.0

  db:
    build:
      context: ./db
      dockerfile: Dockerfile

Now we can build and run our containers using the following commands:

$ docker-compose build
$ docker-compose up

As a result you'll be able to see your app running at http://localhost:3000 (if you're not using native Docker yet, replace localhost with your Docker machine's IP).
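
As a quick sanity check (optional, and assuming curl is available on your host), you can hit the app from the command line:

$ curl -I http://localhost:3000

Any HTTP response here means the web container is up and reachable from the host.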

Setting up ELK stack

Create a folder called logstash in the root directory and put the Dockerfile with the following content into it:

FROM logstash:latest

ADD logstash.conf /etc/logstash/conf.d/

We should also provide a config file called logstash.conf (it should be placed in the same directory as our Dockerfile). We are going to consume logs from the GELF Docker log driver, so we add it as our input, forward our logs to the elasticsearch output, and set up a host for it:

input {
  gelf {}
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}

Create a folder called kibana in the root directory and put the Dockerfile with the following content into it:

FROM kibana:latest

RUN apt-get update && apt-get install -y netcat

COPY entrypoint.sh /tmp/entrypoint.sh
RUN chmod +x /tmp/entrypoint.sh

ADD kibana.yml /opt/kibana/config/

CMD ["/tmp/entrypoint.sh"]

Kibana should always start after Elasticsearch, so we have to add a custom entrypoint script for our image:

#!/usr/bin/env bash

while true; do
  nc -q 1 elasticsearch 9200 2>/dev/null && break
done

exec kibana

Our kibana.yml config file will look like this:

port: 5601

host: "0.0.0.0"

elasticsearch_url: "http://elasticsearch:9200"
elasticsearch_preserve_host: true
kibana_index: ".kibana"
default_app_id: "discover"
request_timeout: 300000
shard_timeout: 0
verify_ssl: true

bundled_plugin_ids:
- plugins/dashboard/index
- plugins/discover/index
- plugins/doc/index
- plugins/kibana/index
- plugins/markdown_vis/index
- plugins/metric_vis/index
- plugins/settings/index
- plugins/table_vis/index
- plugins/vis_types/index
- plugins/visualize/index

The final step is to add services with our containers to the docker-compose.yml:

logstash:
  build: logstash/
  command: logstash -f /etc/logstash/conf.d/logstash.conf
  ports:
    - "12201:12201/udp"

elasticsearch:
  image: elasticsearch:latest
  command: elasticsearch -Des.network.host=0.0.0.0
  ports:
    - "9200:9200"
    - "9300:9300"

kibana:
  build: kibana/
  ports:
    - "5601:5601"

After building and running our services we'll be able to access Kibana at http://localhost:5601. If you want your logs to be persistent, you just need to mount the Elasticsearch data folder (/usr/share/elasticsearch/data) as a volume.
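
A minimal sketch of such a mount in docker-compose.yml, assuming a host directory called ./esdata (the name is arbitrary):

elasticsearch:
  ...
  volumes:
    - ./esdata:/usr/share/elasticsearch/data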

Sending Rails logs to Logstash

In order to forward our container logs to Logstash we just need to turn on the GELF driver for our web container and set up a host URL:

web:
  ...
  logging:
    driver: gelf
    options:
      gelf-address: 'udp://localhost:12201'

Note: why do we use localhost instead of logstash here? The logging driver is configured and run by the Docker daemon on the host machine, which is not part of the default network created by Docker Compose, so the GELF address must be reachable from the host (which is why we published port 12201/udp on the logstash service).

After restarting your services you should open your Rails app to generate some traffic, then open Kibana, navigate to the Settings tab and create your index pattern.

Once the index is created you'll be able to see your first log entries in the Kibana UI. However, our current logging is not very helpful, since we are just sending plain text from the Rails log.

Instead we should present our log entries in JSON format and put as much useful data in them as we can. A popular choice for this is the Lograge gem: we should add two gems to our Gemfile, lograge and logstash-event. After that we should create a new initializer called lograge.rb, which will set up an appropriate format for our logs and send them to stdout, where Docker will pick them up and forward them via the GELF driver:

Rails.application.configure do
  config.lograge.formatter = Lograge::Formatters::Logstash.new
  config.lograge.logger = ActiveSupport::Logger.new(STDOUT)
end

In order to turn on lograge for an environment (for now we want it in development.rb), just add the following line to the corresponding environment file:

config.lograge.enabled = true
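
For clarity, here is a minimal sketch of how that line sits inside the environment file (app/config/environments/development.rb in this project layout):

# app/config/environments/development.rb
Rails.application.configure do
  # ... existing development settings ...

  # Replace the default multi-line request logging with single-line Lograge events
  config.lograge.enabled = true
end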

Adding custom fields is very simple. If you need something from the controller context, just override the append_info_to_payload method in ApplicationController and put everything you need into the payload; after that you'll be able to add it to the log. For instance, if you want to send the IP the request came from, your append_info_to_payload will be:

class ApplicationController < ActionController::Base
  ...

  def append_info_to_payload(payload)
    super
    payload[:request_ip] = request.ip
  end
end

And lograge.rb should contain:

config.lograge.custom_options = lambda do |event|
  {
    request_ip: event.payload[:request_ip]
  }
end

The final step is to modify our logstash.conf file to parse the incoming JSON from the message field. Just add the following section after the input section:

filter {
  json {
    source => "message"
    remove_field => "message"
  }
}

After rebuilding and starting your services you'll see that all the data coming from the Rails container is structured and indexed by Elasticsearch.

Let's try to search for all the GET requests. Open the Discover tab and type the query "method:GET" into the search field. If everything is fine you'll see all the log entries with the GET HTTP method, with the matches highlighted.

Conclusion

To recap, let's revisit our container setup. Users can access our Rails app on port 3000 and Kibana on port 5601. The Rails app sends logs to Logstash via port 12201/udp (the GELF input endpoint), and Kibana and Logstash talk to Elasticsearch via port 9200 (the JSON REST API port).

You can find a sample app on GitHub.

How are you setting up your logging? If you have any suggestions or ideas, you're welcome to share them in the comments.
