Monitoring Spring Application using ELK and AspectJ — Part 1

Ramiz Mehran
Apr 12, 2018 · 3 min read

Hey guys, I’ll get straight to the point. For any application, an organised logging structure is often neglected, yet it is one of the most valuable parts. Here we’ll be doing the following:

  1. Implementing the ELK Stack which includes: Logstash (collecting, parsing and transforming logs), ElasticSearch (restful, distributed analytics and search engine), and Kibana (explore, visualise and discover data).
  2. Building a Spring-based project, which can be attached as a dependency jar to any project and will log metrics for any controller merely by adding an annotation over it.
  3. (Optional) An application to take requests and push them to Logstash, in case you do not wish to change the architecture of existing projects.

I. Implementing ELK Stack

Implementing the ELK stack is quite easy, as Elastic.co provides all the directions one needs to download it and get it working. Still, I will give a step-by-step process here:

ElasticSearch:

  1. Download ElasticSearch from elastic.co, extract the archive, and go inside the extracted folder.
  2. Run bin/elasticsearch (or bin\elasticsearch.bat on Windows).
  3. Your ElasticSearch is now running and listening on port 9200 by default. You can change the port by updating the “http.port:” value in config/elasticsearch.yml (the property is usually commented out; remember to uncomment it after updating).
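The port change in step 3 can be scripted. A minimal sketch using sed, working on a stand-in copy of the config file so the commands are self-contained (adjust the path to your real config/elasticsearch.yml, and note 9300 is just an example port):

```shell
# Stand-in copy of config/elasticsearch.yml with the default commented-out property.
cat > /tmp/elasticsearch.yml <<'EOF'
#http.port: 9200
EOF

# Uncomment the property and set a new port (9300 is an example value).
sed -i 's/^#http.port: 9200/http.port: 9300/' /tmp/elasticsearch.yml

grep '^http.port' /tmp/elasticsearch.yml
```

Remember to restart ElasticSearch after changing the port.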

Logstash:

  1. Download Logstash from elastic.co, extract the archive, and go inside the extracted folder.
  2. Create a pipeline config file, say logstash.conf (the pipeline configuration file defines the basic pipeline for Logstash; I will explain its details below).
  3. Start Logstash by calling bin/logstash -f logstash.conf.
  4. Your Logstash is now running on the port defined in the pipeline config file (if one is defined).

Kibana:

  1. Download Kibana from elastic.co, extract the archive, and go inside the extracted folder.
  2. Check the config file config/kibana.yml once to make sure it has the right “elasticsearch.url:” property value. (Usually it is correct: it points to the port where ElasticSearch is deployed, as long as both run on the same machine.)
  3. Now, run bin/kibana (or bin\kibana.bat on Windows).
  4. Your ELK Stack is up and running. You can visit Kibana at http://localhost:5601 to check out the dashboard.

input { stdin { } }
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

The above configuration will start taking input from your keyboard as soon as you start Logstash. You can use it to make sure everything is running. (Once you exit this in the command line, your Logstash will shut down.)
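The stdin pipeline is only a smoke test. A common next step is a file input, so Logstash tails an application log instead of your keyboard; a sketch of such a pipeline (the log path is an assumption, point it at your application's logs):

```
input {
  file {
    path => "/var/log/myapp/app.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
```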

Up and Running?

Now, to test whether all three services are up and perfectly in sync, do the following:

ElasticSearch: visit http://localhost:9200/ to get the output below.

{
  "name" : "zmNA9Tq",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "fpZqZyQFRvGhEmfT2gruUw",
  "version" : {
    "number" : "6.2.1",
    "build_hash" : "7299dc3",
    "build_date" : "2018-02-07T19:34:26.990113Z",
    "build_snapshot" : false,
    "lucene_version" : "7.2.1",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}
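This check can also be scripted. A sketch that pulls the version number out of the response; the JSON is inlined here so the snippet is self-contained, but in practice you would use the output of `curl -s http://localhost:9200/`:

```shell
# Sample of the root-endpoint response (inlined; normally: curl -s http://localhost:9200/).
response='{ "name" : "zmNA9Tq", "version" : { "number" : "6.2.1" } }'

# Pull out the first "number" field, i.e. the ElasticSearch version.
version=$(echo "$response" | grep -o '"number" : "[^"]*"' | head -1 | cut -d'"' -f4)
echo "ElasticSearch version: $version"   # prints: ElasticSearch version: 6.2.1
```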

Logstash: the startup log should say that a connection to the ElasticSearch instance has been established.

Kibana: just visit http://localhost:5601.

To start the services in the background and pipe their logs into a file, use the commands below (Linux and Mac). Run each command from the home directory of the respective service, i.e. the folders you extracted earlier.

bin/elasticsearch > runningStatus.log 2>&1 &
bin/kibana > runningStatus.log 2>&1 &
bin/logstash -f logstash.conf > runningStatus.log 2>&1 &
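A note on the pattern: `> file 2>&1` sends both stdout and stderr to the log file, and the trailing `&` backgrounds the process. A tiny illustration with a stand-in command (unlike the stand-in, the real services keep running instead of exiting):

```shell
# Stand-in for a service: writes one line to stdout and one to stderr.
(echo "starting"; echo "error sample" >&2) > runningStatus.log 2>&1 &
wait   # the stand-in exits immediately; real services would keep running

cat runningStatus.log   # both lines ended up in the file
```

Each command above writes to a file named runningStatus.log in the current directory, which is why each service is started from its own home folder; use distinct file names if you start them all from one place.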

Conclusion

The above configurations assume that all three applications run on the same machine. If you wish to deploy them on different servers, just replace localhost with the appropriate server IP.
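For example, pointing the Logstash output at a remote ElasticSearch is a one-line substitution; a sketch using a stand-in copy of logstash.conf (the IP 10.0.0.5 is a placeholder):

```shell
# Stand-in copy of logstash.conf (placeholder content).
cat > /tmp/logstash.conf <<'EOF'
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
EOF

# Swap localhost for the remote server's IP (10.0.0.5 is a placeholder).
sed -i 's/localhost:9200/10.0.0.5:9200/' /tmp/logstash.conf
grep 'hosts' /tmp/logstash.conf
```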

Our implementation-specific Logstash configuration will come in later parts of this series.

Special thanks to Abhishek Gupta for guiding me through it.

For the next part of the article, head here — lazypay-techblog.
