Spring Boot Logs with Elasticsearch, Logstash and Kibana (ELK)

Hiten Pratap Singh
Sep 28, 2018

Logging is an important part of any application, but it becomes even more important in a distributed environment. There are many scenarios where centralized logging becomes a necessity rather than just a choice, such as a microservice architecture or multiple instances running behind a load balancer.

So here we will see how all of this can be achieved with Spring Boot using the fabulous ELK (Elasticsearch, Logstash and Kibana) stack.

To achieve this, we need to put several pieces together:

ElasticSearch

First, we need to make sure that Elasticsearch is installed and running.

To install it, just follow the installation instructions on the Elasticsearch website for your platform.

After following all the steps given there, verify the installation with the command below:

curl -XGET http://localhost:9200

You will get something like below output:

{
  "name" : "R7jpDiu",
  "cluster_name" : "elasticsearch_hitenpratap",
  "cluster_uuid" : "7i4y982XQcCUZWUMB9f2FA",
  "version" : {
    "number" : "6.2.4",
    "build_hash" : "ccec39f",
    "build_date" : "2018-04-12T20:37:28.497551Z",
    "build_snapshot" : false,
    "lucene_version" : "7.2.1",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}

Logstash

Now we have to make sure the next piece, Logstash, is installed and running as well. You can install it by following the instructions on the Logstash website.

Kibana

Now comes the last component of the ELK stack: Kibana. You can install it from its website, just as you did for the previous components.

Point your browser to http://localhost:5601. If the Kibana page shows up, we're good; we'll configure it later.

Once all of this is done, you just have to build a Spring Boot application with one special file, logstash.conf, which configures Logstash to send logs to Elasticsearch. We'll go through all of these steps now.

We need to perform the following actions to get our Spring Boot applications talking to ELK stack.

Configure Spring Boot’s log file

For Logstash to ship logs to Elasticsearch, our application needs to store its logs in a file. There are other ways to achieve the same thing, such as configuring Logback with a TCP appender to send logs to a remote Logstash instance over TCP. Anyhow, let's configure Spring Boot's log file. The simplest way is to set the log file name in application.properties (note that in Spring Boot 2.2 and later the property is logging.file.name). It's enough to add the following line:

logging.file=application.log
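As an aside, the TCP-appender alternative mentioned above could be sketched in a logback-spring.xml like the one below. This is only a sketch: it assumes the logstash-logback-encoder library is on the classpath, and the destination host/port is illustrative.

```
<configuration>
  <!-- Requires the net.logstash.logback:logstash-logback-encoder dependency -->
  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <!-- Illustrative destination: a Logstash instance with a tcp input on port 5000 -->
    <destination>localhost:5000</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="LOGSTASH"/>
  </root>
</configuration>
```

With this approach there is no log file to tail, so the file-based setup described in the rest of this post would not be needed.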

Configure Logstash to Understand Spring Boot’s Log File Format

This is the trickiest and most important part of all, as this step is the bridge between the Spring Boot application and the ELK stack. A Logstash config file consists of three sections: input, filter and output. Each section contains plugins that do the relevant part of the processing (such as the file input plugin, which reads log events from a file, or the elasticsearch output plugin, which sends log events to Elasticsearch).
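As a sketch of what those three sections can look like for this setup (the file path, grok pattern and index name below are illustrative assumptions, not the only valid choices):

```
input {
  file {
    # The log file configured in application.properties
    path => "/path/to/application.log"
    start_position => "beginning"
  }
}

filter {
  # Parse Spring Boot's default log pattern into separate fields
  # (level, class and logmessage are used later in Kibana)
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:level}\s+%{NUMBER:pid}\s+---\s+\[%{DATA:thread}\]\s+%{DATA:class}\s+:\s+%{GREEDYDATA:logmessage}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Daily indices; this is the name pattern to configure in Kibana
    index => "spring-boot-%{+YYYY.MM.dd}"
  }
}
```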

Finally, the three parts (input, filter and output) need to be combined and saved as the logstash.conf config file. Once the config file is in place and Elasticsearch is running, we can run Logstash:

/path/to/logstash/bin/logstash -f logstash.conf
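The grok filter in logstash.conf has to match Spring Boot's default log line format. As a quick sanity check before involving Logstash at all, an equivalent regex can be tested in the shell (the sample log line and the pattern below are illustrative, not taken from a real application):

```shell
# An illustrative log line in Spring Boot's default format
line='2018-09-28 10:15:32.123  INFO 1234 --- [           main] com.example.DemoApplication              : Started DemoApplication'

# Rough regex equivalent of the grok pattern
# %{TIMESTAMP_ISO8601} %{LOGLEVEL} %{NUMBER} --- [%{DATA}] %{DATA} : %{GREEDYDATA}
pattern='^[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{3} +(TRACE|DEBUG|INFO|WARN|ERROR) +[0-9]+ --- \[[^]]*\] +[^ ]+ +: .*'

if echo "$line" | grep -Eq "$pattern"; then
  echo "matched"        # prints: matched
else
  echo "no match"
fi
```

If your application overrides logging.pattern.file, the grok pattern has to be adjusted to match it.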

Configure Kibana

Now it's time to visit the Kibana web UI again. We started it in a previous step, and it should be running at http://localhost:5601.

First, you need to point Kibana to the Elasticsearch index (or indices) of your choice. Logstash creates indices with the index name pattern defined in the logstash.conf file. In Kibana, under Settings → Indices, configure the indices:

  • Index contains time-based events (select this option)
  • Use event times to create index names (select this option)
  • Index pattern interval: Daily
  • Index name or pattern: What’s defined in logstash.conf file.
  • Click on “Create Index”
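Before moving on, you can double-check from the command line that Logstash has actually created an index matching your pattern; the command below simply lists all indices in the cluster:

```
curl -XGET 'http://localhost:9200/_cat/indices?v'
```

If no index shows up, generate some log output from the application first so Logstash has events to ship.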

Now click on the “Discover” tab. Log events should be showing up in the main window. If they're not, double-check the time period filter in the top-right corner of the screen. The table will have two columns by default: Time and _source. To make the listing more useful, we can configure the displayed columns: from the menu on the left, select level, class and logmessage.

Alright! You're now ready to take control of your logs using the ELK stack and start customizing and tweaking your log management configuration. You can download the sample application from the link below, which includes a logstash.conf file. Just adjust the logstash.conf file as per your needs and you are ready to go.

https://github.com/hitenpratap/spring-boot-elk

Questions? Comments? Let me know in the comments!
