How to set up ELK for Rails log management using Docker and Docker Compose
by Dmitry Tsepelev
Logging is one of the essential parts of any setup, because logs are extremely useful when it comes to troubleshooting problems with your production app. When you move your apps to Docker-managed environments you'll realise that some of the approaches you were using no longer work: your logs now stay inside the container, and you need a way to make them persistent. The easiest option is to write the logs to a volume, but what if you have many instances of the app? In that case you need centralised storage for logs.
After implementing docker-based deployment process for one of our apps we’ve decided to store our logs in one place using the ELK stack. ELK is a centralised log server and log management web interface which includes Elasticsearch (as a storage and a search engine), Logstash (a tool for log processing), and Kibana (UI for the Elasticsearch).
Let's take this article as a base for our setup, but use Docker Compose 2 and set up our logs to contain custom fields. There is also a popular Docker image, sebp/elk, but it does not support the Graylog Extended Log Format (GELF) out of the box.
GELF is a log format that avoids the shortcomings of classic plain syslog, such as the 1024-byte message length limit and the lack of data types in structured syslog. Here is a fragment of an example GELF message payload:
{
  "short_message": "A short message",
  "full_message": "Backtrace here\n\nmore stuff"
}
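The fields above are only part of a message: per the GELF spec, a payload is a flat JSON object whose required fields are version, host, and short_message. A quick Ruby sketch (the values are illustrative):

```ruby
require "json"

# Minimal GELF payload: "version", "host", and "short_message" are
# required by the GELF spec; the rest are optional or custom fields.
gelf_payload = {
  "version"       => "1.1",
  "host"          => "example.org",
  "short_message" => "A short message",
  "full_message"  => "Backtrace here\n\nmore stuff",
  "level"         => 1
}

puts JSON.generate(gelf_payload)
```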
You will need a configured Rails environment, Docker and Docker Compose on your machine to get started.
Putting your Rails app into a Docker container
To get the ball rolling we need a working Rails app; you can use any existing app you have or create a new one:
$ rails new -d postgresql app
In order to support the Docker environment you should change the development section of app/config/database.yml (don't forget about the other environments if you decide to host your app elsewhere!):
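For example, a development section pointing at a Postgres container might look like this (a sketch: the host name db is an assumption about the service name you'll use in docker-compose.yml):

```yaml
development:
  adapter: postgresql
  encoding: unicode
  database: app_development
  username: postgres
  password:
  host: db
  pool: 5
```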
We also need to set up a JS runtime for our app: find the following line in your Gemfile and uncomment it:
gem 'therubyracer', platforms: :ruby
Add a Dockerfile to the app folder with the following content (the base image and system packages here are one reasonable choice; pick the Ruby version you actually use):
FROM ruby:2.3
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev
WORKDIR /app
ENV RAILS_ENV development
ADD Gemfile Gemfile.lock /app/
RUN bundle install -j5 --retry 10
ADD . /app
Create a directory called db and add a Dockerfile for the postgres container:
FROM postgres
ENV POSTGRES_USER 'postgres'
ENV POSTGRES_DB 'app_development'
When you set up these environment variables the postgres image will create a database for you.
Create a file called docker-compose.yml in the project root directory; please note that we are using version 2 of the config file format:
command: rails s -b 0.0.0.0
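A minimal version 2 compose file tying the two containers together might look like this (service names and the published port are assumptions that match the rest of this walkthrough):

```yaml
version: '2'

services:
  db:
    build: ./db
  web:
    build: .
    command: rails s -b 0.0.0.0
    ports:
      - "3000:3000"
    depends_on:
      - db
```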
Now we can build and run our containers using the following commands:
$ docker-compose build
$ docker-compose up
As a result you'll be able to see your app running at http://localhost:3000 (if you're not using native Docker yet, replace localhost with your Docker machine's IP).
Setting up ELK stack
Create a folder called logstash in the root directory and put a Dockerfile with the following content into it:
FROM logstash
ADD logstash.conf /etc/logstash/conf.d/
We should also provide a config file called logstash.conf (placed in the same directory as our Dockerfile). We are going to consume logs from the GELF Docker log driver, so we add it as our input, forward our logs to the elasticsearch output, and set up a host for it:
hosts => "elasticsearch:9200"
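Put together, a minimal logstash.conf could look like this (the gelf input plugin listens on 12201/udp by default):

```
input {
  gelf {}
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}
```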
Create a folder called kibana in the root directory and put a Dockerfile with the following content into it:
FROM kibana
RUN apt-get update && apt-get install -y netcat
COPY entrypoint.sh /tmp/entrypoint.sh
RUN chmod +x /tmp/entrypoint.sh
ADD kibana.yml /opt/kibana/config/
ENTRYPOINT ["/tmp/entrypoint.sh"]
Kibana should always start after Elasticsearch, so we have to add a custom entrypoint to our Dockerfile that waits until Elasticsearch accepts connections:
#!/bin/bash
while true; do
  nc -q 1 elasticsearch 9200 2>/dev/null && break
done
exec kibana
Our kibana.yml config file will look like this:
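Setting names differ between Kibana versions; for the 4.2+ series shipped in the official image, a minimal config could be (a sketch):

```yaml
server.port: 5601
server.host: "0.0.0.0"
elasticsearch.url: "http://elasticsearch:9200"
```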
The final step is to add services with our containers to the docker-compose.yml:
command: logstash -f /etc/logstash/conf.d/logstash.conf
command: elasticsearch -Des.network.host=0.0.0.0
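Assembled, the three ELK services might be declared like this (image tags and port mappings are assumptions; pin versions to whatever you test against):

```yaml
  elasticsearch:
    image: elasticsearch:2
    command: elasticsearch -Des.network.host=0.0.0.0
    ports:
      - "9200:9200"
  logstash:
    build: ./logstash
    command: logstash -f /etc/logstash/conf.d/logstash.conf
    ports:
      - "12201:12201/udp"
    depends_on:
      - elasticsearch
  kibana:
    build: ./kibana
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```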
After building and running our services we'll be able to access Kibana at http://localhost:5601. If you want your logs to be persistent, just mount the Elasticsearch data folder (/usr/share/elasticsearch/data) as a volume.
Sending Rails logs to Logstash
In order to forward our container logs to Logstash we just need to turn on the GELF driver for our web container and set a host URL:
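In Compose v2 syntax that means a logging section on the web service (a sketch; the port matches the GELF input we configured for Logstash):

```yaml
  web:
    logging:
      driver: gelf
      options:
        gelf-address: udp://localhost:12201
```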
Note: why do we use localhost instead of logstash here? The logging driver is configured by the Docker client on the host machine, which is not part of the default network created by Docker Compose.
After restarting your services, open your Rails app, then open Kibana, navigate to the Settings tab and create your index:
When the index is created you'll be able to see your first log entries in the Kibana UI. However, our current logging is not very helpful, since we are just sending plain text from the Rails log:
Instead, we should present our log entries in JSON format and include as much useful data as we can. A popular solution is the Lograge gem; we need to add two gems to our Gemfile: lograge and logstash-event. After that, we create a new initializer called lograge.rb, which sets up an appropriate format for our logs and sends them to stdout, where Docker will grab them and pass them to the GELF driver:
Rails.application.configure do
  config.lograge.formatter = Lograge::Formatters::Logstash.new
  config.lograge.logger = ActiveSupport::Logger.new(STDOUT)
end
In order to turn on lograge for an environment (for now we want it in development.rb), just add the following line to the corresponding environment file:
config.lograge.enabled = true
Adding custom fields is very simple. If you need something from the controller context, just override the append_info_to_payload method in ApplicationController and put everything you need into the payload; after that you'll be able to add it to the log. For instance, if you want to send the IP the request came from, your append_info_to_payload will be:
class ApplicationController < ActionController::Base
  def append_info_to_payload(payload)
    super
    payload[:request_ip] = request.ip
  end
end
And lograge.rb should contain:
config.lograge.custom_options = lambda do |event|
  { request_ip: event.payload[:request_ip] }
end
The final step is to modify our logstash.conf file so that it parses the incoming JSON from the message field; just add the following filter section after the input section:
filter {
  json {
    source => "message"
    remove_field => "message"
  }
}
After rebuilding and starting your services you’ll see that all the data coming from the Rails container is structured and indexed by Elasticsearch:
Let's try to search for all the GET requests. Open the Discover tab and type the query "method:GET" into the search field. If everything is fine, you'll see all log entries with the GET HTTP method, with highlights:
To recap, let's revisit our container setup. Users can access our Rails app on port 3000 and Kibana on port 5601. The Rails app sends logs to Logstash via port 12201/udp (the GELF input endpoint), while Kibana and Logstash talk to Elasticsearch via port 9200 (the JSON REST API port).
You can find a sample app on GitHub.
How do you set up your logging? If you have any suggestions or ideas, you're welcome to share them in the comments.