Monitoring Spring Application using ELK and AspectJ — Part 1

Ramiz Mehran · Published in lazypay-techblog · 3 min read · Apr 12, 2018

Hey guys, I’ll get straight to the point. For any application, an organised logging structure is often neglected, yet it is one of the most valuable parts. Here we’ll be doing the following:

  1. Implementing the ELK Stack, which comprises Logstash (collects, parses and transforms logs), ElasticSearch (a RESTful, distributed analytics and search engine), and Kibana (explore, visualise and discover your data).
  2. Building a Spring-based project that can be attached as a dependency jar to any project and will log metrics for any controller merely by adding an annotation over it.
  3. (Optional) An application that takes requests and pushes them to Logstash, in case you do not wish to change the architecture of existing projects.

I. Implementing ELK Stack

Implementing the ELK stack is quite easy, as Elastic.co has all the directions one needs to get it downloaded and working. Still, I’ll give a step-by-step process here:

Start by downloading ElasticSearch from here!

  1. Just extract the above file, and go inside the extracted folder.
  2. Run bin/elasticsearch (or bin\elasticsearch.bat on Windows)
  3. Your ElasticSearch is running and listening on port 9200 by default. You can change the port by updating the “http.port:” value in config/elasticsearch.yml (the property is usually commented out; remember to uncomment it after updating).
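As a quick sketch of step 3, this is roughly what the port change looks like. It is demonstrated on a throwaway copy in /tmp; in practice you would edit config/elasticsearch.yml inside the extracted folder, and 9201 is just an example value:

```shell
# Demonstrate on a throwaway copy of the stock config line;
# the real file is config/elasticsearch.yml in the extracted folder.
cat > /tmp/elasticsearch.yml <<'EOF'
#http.port: 9200
EOF

# Uncomment the property and set a custom port (9201 is just an example).
sed -i.bak 's/^#\{0,1\}http\.port:.*/http.port: 9201/' /tmp/elasticsearch.yml

cat /tmp/elasticsearch.yml
```

After the change, restart ElasticSearch for the new port to take effect.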

Moving on to Logstash, download it from here!

  1. Extract the downloaded file, and go inside the extracted folder.
  2. Create a pipeline config file, say logstash.conf. (The pipeline configuration file defines the basic pipeline for Logstash; its details are explained below.)
  3. Start Logstash by calling bin/logstash -f logstash.conf.
  4. Your Logstash is running on the port defined in the pipeline config file (if one is defined).

Finally, Kibana. You can get it from here!

  1. Extract the downloaded file, and go inside the extracted folder.
  2. Check the config file config/kibana.yml once to make sure it has the right “elasticsearch.url:” property value. (Usually it is already correct: it should match the port where ElasticSearch is deployed, as long as both are on the same machine.)
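A minimal sketch of step 2, again demonstrated on a throwaway copy in /tmp rather than the real config/kibana.yml. The stock file ships with the property commented out; uncommenting it makes the ES address explicit:

```shell
# Demonstrate on a throwaway copy; the real file is config/kibana.yml
# inside the extracted Kibana folder.
cat > /tmp/kibana.yml <<'EOF'
#elasticsearch.url: "http://localhost:9200"
EOF

# Uncomment the property so it explicitly points at the ES instance.
sed -i.bak 's|^#\{0,1\}elasticsearch\.url:.*|elasticsearch.url: "http://localhost:9200"|' /tmp/kibana.yml

cat /tmp/kibana.yml
```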
  3. Now, Run bin/kibana (or bin\kibana.bat on Windows).
  4. Your ELK Stack is up and running. You can visit Kibana at http://localhost:5601 to check out the dashboard.

Basic PipeLine Config for Logstash:

input { stdin { } }
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

The above configuration will start taking input from your keyboard as soon as you start Logstash. You can use it as a quick check that everything is running. (Once you exit this in the command line, your Logstash will shut down.)
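When we later wire an application’s logs into Logstash, the stdin input typically gets swapped for a file input. A hedged sketch of what that pipeline could look like (the log path is a hypothetical example, not from this setup):

```
input {
  file {
    # Hypothetical application log path; replace with your own.
    path => "/var/log/myapp/application.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
```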

Up and Running?

Now, to test if all the three services are up and perfectly in sync, do the following:

ElasticSearch: Visit http://localhost:9200/ and you should get output similar to the one below.

{
  "name" : "zmNA9Tq",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "fpZqZyQFRvGhEmfT2gruUw",
  "version" : {
    "number" : "6.2.1",
    "build_hash" : "7299dc3",
    "build_date" : "2018-02-07T19:34:26.990113Z",
    "build_snapshot" : false,
    "lucene_version" : "7.2.1",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}

Logstash: Its startup logs should show that a connection to the ElasticSearch instance was established.

Kibana: Just visit http://localhost:5601

TIP: To start the services in the background and redirect their logs to a file, use the commands below (Linux and Mac). Run each command from the home directory of the respective service, the one you extracted before.

bin/elasticsearch > runningStatus.log 2>&1 &
bin/kibana > runningStatus.log 2>&1 &
bin/logstash -f logstash.conf > runningStatus.log 2>&1 &

Conclusion

The above configurations assume that all three applications are running on the same machine. If you wish to deploy them on different servers, just replace localhost with the appropriate server IP.
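Concretely, if ElasticSearch lived on another server, the Logstash output and the Kibana config would point at that host instead (the IP below is a placeholder, not a real deployment):

```
# logstash.conf — placeholder IP for a remote ElasticSearch
output {
  elasticsearch { hosts => ["10.0.2.15:9200"] }
}

# config/kibana.yml — same placeholder host
# elasticsearch.url: "http://10.0.2.15:9200"
```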

Our implementation specific configurations for Logstash will come in later parts of the article.

Special thanks to Abhishek Gupta for guiding me through it.

For the next part of the article, head here —
