ELK Stack Docker Playground for DevOps

imran baig
Published in tajawal
3 min read · Apr 15, 2018

The ELK Stack (Elasticsearch, Logstash, and Kibana) is the most popular open-source log analysis platform. ELK is quickly overtaking existing proprietary solutions and has become the first choice for companies looking for log analysis and management solutions.

At Tajawal, this proximity to our logs has helped us quickly address and resolve issues and close the gaps.

So What is ELK?

Credit: Howtodoinjava.com

The ELK stack is comprised of three separate yet complementary open-source products. Elasticsearch, probably the best known of the three, is the search engine that powers the stack. Based on Apache Lucene, Elasticsearch is a full-text search engine used to perform full-text and other complex searches.

Logstash processes the data before sending it to Elasticsearch for indexing and storage.

Kibana is the visualization tool with which you can view the log messages and create graphs and visualizations.
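In practice the three tie together through a Logstash pipeline: an input receives raw events, optional filters parse them, and an output ships them to Elasticsearch for indexing. A minimal sketch of such a pipeline (the TCP port and Elasticsearch host here are assumptions that match the playground set up below, not copied from the repo):

```
input {
  tcp {
    port => 5000
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```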

Setting up Local ELK Docker Playground

You will want to set up the ELK Stack locally when you need to run simulations or test different log analysis tools before moving to a development or production environment, and ideally before your boss says: "Let's deploy your test analysis now, after that demo!"

Setting up ELK can be an uphill battle. To make it easy, follow the steps below:

1. Clone this Docker ELK repo
$ git clone https://github.com/caas/docker-elk.git

2. Start the ELK stack container using docker-compose

$ cd docker-elk
$ docker-compose up
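If you are curious what the repo wires up, a docker-compose.yml for an ELK playground typically looks roughly like this (service names, image tags, and paths are illustrative, not copied from the repo):

```
version: '2'

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.2.3
    ports:
      - "9200:9200"
      - "9300:9300"

  logstash:
    image: docker.elastic.co/logstash/logstash:6.2.3
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline
    ports:
      - "5000:5000"
    depends_on:
      - elasticsearch

  kibana:
    image: docker.elastic.co/kibana/kibana:6.2.3
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```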

Verify:

Use the docker ps command to check whether all the containers are running.

Give Kibana a few seconds to initialize, then access the Kibana web UI by hitting http://localhost:5601 with a web browser.

By default, the stack exposes the following ports:

  • 5000: Logstash TCP port
  • 9200: Elasticsearch HTTP
  • 9300: Elasticsearch TCP transport
  • 5601: Kibana
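The ports above can also be smoke-tested from a script. A small Python sketch (check_port is a hypothetical helper, not part of the stack; it needs the containers from step 2 to be running to report "up"):

```python
import socket

def check_port(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Ports exposed by the playground, per the list above.
    for name, port in [("Logstash TCP", 5000),
                       ("Elasticsearch HTTP", 9200),
                       ("Elasticsearch transport", 9300),
                       ("Kibana", 5601)]:
        status = "up" if check_port("localhost", port) else "down"
        print(f"{name} ({port}): {status}")
```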

3. Set up an index pattern. Kibana has a RESTful API, so from the command line you can use curl to talk to it like this:

$ curl -XPOST -D- 'http://localhost:5601/api/saved_objects/index-pattern' \
-H 'Content-Type: application/json' \
-H 'kbn-version: 6.2.3' \
-d '{"attributes":{"title":"logstash-*","timeFieldName":"@timestamp"}}'
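The same call can be issued from Python's standard library. A sketch, assuming the same Kibana 6.2.3 endpoint and headers as the curl command above (build_request is a helper introduced just for illustration):

```python
import json
import urllib.request

KIBANA_URL = "http://localhost:5601/api/saved_objects/index-pattern"

def build_request(title: str, time_field: str = "@timestamp") -> urllib.request.Request:
    """Build the POST request that creates a Kibana index pattern."""
    body = json.dumps({"attributes": {"title": title,
                                      "timeFieldName": time_field}})
    return urllib.request.Request(
        KIBANA_URL,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "kbn-version": "6.2.3"},
        method="POST",
    )

if __name__ == "__main__":
    # Requires the stack from step 2 to be running.
    with urllib.request.urlopen(build_request("logstash-*")) as resp:
        print(resp.status)
```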

Now your playground is all ready to start shipping logs.

4. Add your filter

Under the following path: docker-elk/logstash/pipeline/logstash.conf, you can add your filter in the following format:

https://gist.github.com/b05f87a9a4b664afd4bd38cbec4e1dc9.git
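The gist isn't reproduced here, but a typical filter block follows the same structure: a grok pattern to parse each line, plus a date filter to set @timestamp from the parsed event (the pattern below is illustrative, not taken from the gist):

```
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
```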

Or you can send your log files through the nc command directly to Logstash's TCP input:

cat yourfile.log | nc localhost 5000
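If nc isn't available, the same thing can be done with a few lines of Python's socket module (a sketch; the file name is a placeholder, and frame_lines/ship_lines are helpers introduced here):

```python
import socket

def frame_lines(lines):
    """Join log lines so each event ends with exactly one newline,
    which is how a line-oriented TCP input splits events."""
    return "".join(line.rstrip("\n") + "\n" for line in lines)

def ship_lines(lines, host="localhost", port=5000):
    """Send log lines to Logstash's TCP input (port 5000 above)."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(frame_lines(lines).encode("utf-8"))

if __name__ == "__main__":
    # Equivalent to: cat yourfile.log | nc localhost 5000
    with open("yourfile.log") as f:
        ship_lines(f)
```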

Now you can see those logs arriving in Kibana. Click on Discover to start searching the latest logs.
