Getting ELK up and running on a Linux server

Salohy Miarisoa
3 min read · Jun 11, 2018



ELK stands for Elasticsearch, Logstash and Kibana

Elasticsearch: a highly scalable open-source full-text search and analytics engine. It allows you to store, search, and analyze big volumes of data quickly and in near real time.

Logstash: a tool for managing events and logs. You can use it to collect logs, parse them, and store them for later use (for example, in Elasticsearch for searching).

Kibana: Visualization plugin for Elasticsearch

Install the prerequisite: Java 8

sudo apt-get update && sudo apt-get install default-jre

Install ELK

1. All Elastic packages are signed with the Elasticsearch signing key, so we need to add the public key in order to verify them.

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -

2. On Debian, we also need to install the apt-transport-https package

sudo apt-get install apt-transport-https

3. Save the repository definition (at the time of writing, the latest version is 6.2.4)

echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list

4. Install the Elasticsearch Debian package

sudo apt-get update && sudo apt-get install elasticsearch

5. Install Logstash

sudo apt-get install logstash

6. Install Kibana

sudo apt-get install kibana

Configure ELK

1. Configure Elasticsearch

The configuration file is located at /etc/elasticsearch/elasticsearch.yml

Open it and, under the Network section, specify your network host and the HTTP port.
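
For example, a minimal network section could look like the sketch below (the IP address is a placeholder; replace it with your server's address, or use localhost for local-only access):

```yaml
# /etc/elasticsearch/elasticsearch.yml
# ---------------------------------- Network -----------------------------------
network.host: 192.168.0.1   # replace with your server's IP, or localhost
http.port: 9200             # the default Elasticsearch HTTP port
```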

2. Configure Logstash

The configuration of Logstash is a bit different from that of Elasticsearch and Kibana. Under /etc/logstash/conf.d/, we create our own config files, which describe how Logstash gets log data from other sources. Each config file must follow the Logstash pipeline anatomy:

  • Input (required): the source of the log data that Logstash should collect
  • Filter (optional): transforms the log data into a simpler structure
  • Output (required): defines where the data is sent

There are lots of plugins that help with each of these steps. You can list all available plugins and install new ones as follows:
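
As a sketch, assuming the logstash-plugin tool shipped with the Debian package under /usr/share/logstash/bin:

```shell
# List all plugins currently installed with Logstash
sudo /usr/share/logstash/bin/logstash-plugin list

# Install an additional plugin, e.g. the JSON filter
sudo /usr/share/logstash/bin/logstash-plugin install logstash-filter-json
```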

Below is an example of what a Logstash config file can look like. We get the input data in JSON format from a Kafka topic, filter it with the logstash-filter-json plugin, and send it to Elasticsearch with a specific index.
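
A minimal sketch of such a config, assuming a Kafka broker on localhost and a hypothetical topic name app-logs (adjust both, and the file name, to your setup):

```
# /etc/logstash/conf.d/kafka-to-es.conf  (hypothetical file name)
input {
  kafka {
    bootstrap_servers => "localhost:9092"  # your Kafka broker
    topics => ["app-logs"]                 # the topic to consume
  }
}

filter {
  json {
    source => "message"   # parse the JSON string in the message field
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"     # write to a daily index
  }
}
```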

3. Configure Kibana

The configuration file is located at /etc/kibana/kibana.yml

Open the config file and set the options you need; the most important ones are the server port and host, and the Elasticsearch URL.
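
A minimal sketch for the 6.x config format (replace the host values with your own; note that elasticsearch.url is the 6.x setting name):

```yaml
# /etc/kibana/kibana.yml
server.port: 5601                           # the port Kibana listens on
server.host: "0.0.0.0"                      # listen on all interfaces for remote access
elasticsearch.url: "http://localhost:9200"  # where Kibana finds Elasticsearch
```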

To call Kibana remotely, make sure that port 5601 is open.

List the open ports on your machine:

sudo ufw status verbose

If 5601 is not in the list, open it with:

sudo ufw allow 5601/tcp

4. Start ELK

Start elasticsearch first with

sudo service elasticsearch start
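
To verify that Elasticsearch came up, you can query it over HTTP (assuming it listens on localhost:9200, as configured above):

```shell
# Should return a JSON document with the cluster name and version
curl http://localhost:9200
```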

Start Logstash

sudo service logstash start

Start Kibana

sudo service kibana start

5. Start your search

Open the Kibana URL in your browser: http://your-kibana-ip:5601

6. Running into trouble? Nothing works?

Check the Logstash log data. You should find information there that can help you fix your problem.

sudo tail -f /var/log/logstash/logstash-plain.log
