Auto-load objects to the Kibana-Elasticsearch stack — 2nd edition

Gilad Neiger
Published in Develeap · Aug 1, 2022

Introduction

More than a year ago, I wrote a blog post about importing Kibana objects automatically using scripts in ConfigMaps & Helm Jobs. However, a few things have changed since then, and I also found places where I could improve the examples & scripts in that post.
So here we are — Auto-load objects to Kibana-Elasticsearch stack, the 2nd edition.

If you want to start with the previous version of the blog post, you can find it here: https://medium.com/develeap/kibana-objects-exporting-importing-c427a8eb92e9

The well-known EFK/ELK stack, which includes Elasticsearch, Fluentd (or Logstash) & Kibana, is a standard centralized logging platform nowadays.
If you have deployed this stack on your Kubernetes cluster, at some point you might have asked yourself: “Can I export these dashboards?”, “Can I import ‘built-in’ dashboards & index patterns (data views) into a brand-new Kubernetes cluster?”, or “Can I have a one-click EFK stack installation with all the dashboards configured in place?” Yes, you can.
This is exactly the challenge I faced more than a year ago while deploying this kind of stack to a Kubernetes cluster, and it’s one I still face today.

In this blog post, I’ll show you how to export and import Kibana dashboards and index patterns (data views), both manually using the Kibana API and automatically using ConfigMaps & Helm Jobs, so that when you deploy your applications and your logging tools, you also deploy ‘built-in’ Kibana dashboards within your EFK stack.

UPDATE: index-pattern -> data view

Please note that Elastic has renamed the ‘index-pattern’ object to ‘data view’. However, we can still use the ‘index-pattern’ name when importing it later through the Kibana API.
From now on in this blog post, I’ll use the name ‘data view’.

(https://www.elastic.co/guide/en/kibana/current/index-patterns.html)

Kibana API

You can use Kibana’s API for plenty of purposes, such as importing and exporting data views & dashboards. This is perfect for configuring Kibana in an automated way, as we love to do in DevOps.

Be aware of what Kibana’s documentation says: “Each API is experimental and can include breaking changes in any version of Kibana, or might be entirely removed from Kibana.”

Moreover, the curl commands below contain placeholder variables, such as ${KIBANA_PORT} and ‘user:pwd’; make sure you replace them with your own values.

Export data view using Kibana API

First of all, we need a data view to index our logs. In my use case, I had a data view we were already using, and I wanted to export it so I could import it again later whenever needed.
We can do so using the Kibana API. This is pretty straightforward (make sure you replace ${KIBANA_URL}, ${KIBANA_PORT}, user:pwd, and ${DATA_VIEW_ID}).
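Here’s a minimal sketch of that export call, using Kibana’s saved objects export API (the output file name data-view.ndjson is my own choice):

    # Export a single data view (still typed 'index-pattern' in the API)
    curl -X POST "http://${KIBANA_URL}:${KIBANA_PORT}/api/saved_objects/_export" \
      -u user:pwd \
      -H "kbn-xsrf: true" \
      -H "Content-Type: application/json" \
      -d '{"objects": [{"type": "index-pattern", "id": "'"${DATA_VIEW_ID}"'"}]}' \
      -o data-view.ndjson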

You can get your data view ID by navigating to the data views page in Kibana; you’ll see the ID in the address bar as part of the URI.
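In recent Kibana versions, that URI typically looks something like this (an assumption on my part; the exact path may differ between versions):

    http://${KIBANA_URL}:${KIBANA_PORT}/app/management/kibana/dataViews/dataView/${DATA_VIEW_ID}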

You should get NDJSON output; save it as a file, since you’ll use it later.

Export dashboards using Kibana API

Now that we’ve exported the data view, we’d like to export our dashboards the same way (make sure you replace ${KIBANA_URL}, ${KIBANA_PORT}, user:pwd, and ${DASHBOARD_ID}).
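A sketch using the same saved objects export API; includeReferencesDeep also exports the objects the dashboard references (visualizations, data views), which is usually what you want:

    # Export a dashboard together with everything it references
    curl -X POST "http://${KIBANA_URL}:${KIBANA_PORT}/api/saved_objects/_export" \
      -u user:pwd \
      -H "kbn-xsrf: true" \
      -H "Content-Type: application/json" \
      -d '{"objects": [{"type": "dashboard", "id": "'"${DASHBOARD_ID}"'"}], "includeReferencesDeep": true}' \
      -o dashboard.ndjson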

You can get the ID by navigating to the dashboard page in the Kibana UI; you’ll find the ID in the address bar as part of the URI.
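For example, a dashboard URI typically has this shape (again, an assumption based on recent Kibana versions):

    http://${KIBANA_URL}:${KIBANA_PORT}/app/dashboards#/view/${DASHBOARD_ID}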

Import data view using Kibana API

After exporting your data view with the command above, you can import it back using the Kibana API.
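A minimal sketch of the import call, using the saved objects import API (overwrite=true replaces an existing object with the same ID):

    curl -X POST "http://${KIBANA_URL}:${KIBANA_PORT}/api/saved_objects/_import?overwrite=true" \
      -u user:pwd \
      -H "kbn-xsrf: true" \
      --form file=@${DATA_VIEW_FILE}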

${DATA_VIEW_FILE} is the name of the file you exported the data view into.

Import dashboards using Kibana API

Of course, you’d like to show some data in dashboards, and fortunately, you have the dashboard you exported in the steps above; now you just need to run the import again with that file.
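The call is the same as for the data view; ${DASHBOARD_FILE} is my placeholder for the dashboard NDJSON file you exported earlier:

    curl -X POST "http://${KIBANA_URL}:${KIBANA_PORT}/api/saved_objects/_import?overwrite=true" \
      -u user:pwd \
      -H "kbn-xsrf: true" \
      --form file=@${DASHBOARD_FILE}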

*Optionally, you can run the curl command with -d '<JSON_DATA>' instead of a file path.

Import as a part of your Helm chart

So the exciting part is here: sometimes we want to import data views and dashboards as part of the Helm chart itself, or perhaps as a post-process that runs after the Helm chart installation.
This can be done with Helm Jobs. We’ll also use Kubernetes ConfigMaps to bring our objects’ JSON data into the container.

Dashboards & data views as ConfigMap

Firstly, we’ll create a ConfigMap (a sketch follows the list) which includes:

  1. A shell script that curls the Kibana API (to import the data views & the dashboards)
  2. An NDJSON file of a dashboard
  3. An NDJSON file of a data view
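Here’s a minimal sketch of such a ConfigMap. The names (kibana-objects, import.sh, the /objects mount path) are placeholders of mine, and the two NDJSON entries are where you paste the files you exported earlier:

    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: kibana-objects
    data:
      import.sh: |
        #!/bin/sh
        # Import the data view first, then the dashboard that depends on it
        curl -X POST "http://${KIBANA_URL}:${KIBANA_PORT}/api/saved_objects/_import?overwrite=true" \
          -u user:pwd -H "kbn-xsrf: true" --form file=@/objects/data-view.ndjson
        curl -X POST "http://${KIBANA_URL}:${KIBANA_PORT}/api/saved_objects/_import?overwrite=true" \
          -u user:pwd -H "kbn-xsrf: true" --form file=@/objects/dashboard.ndjson
      data-view.ndjson: |
        <paste the exported data view NDJSON here>
      dashboard.ndjson: |
        <paste the exported dashboard NDJSON here>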

The next step is to mount this ConfigMap as three different files inside the Helm Job’s container. Follow me.

Helm job as your handler!

We’ll now use a Helm Job (sketched after this list) in order to:

  1. Mount this ConfigMap to a container
  2. Run the shell script that sends the API requests to Kibana
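Here’s a sketch of such a Job, assuming the ConfigMap above and the public curlimages/curl image; the helm.sh/hook annotations make Helm run it after install/upgrade (drop them if you apply the Job directly with kubectl):

    apiVersion: batch/v1
    kind: Job
    metadata:
      name: kibana-objects-import
      annotations:
        # Helm hooks: run after the chart is installed or upgraded,
        # and delete the Job once it has succeeded
        "helm.sh/hook": post-install,post-upgrade
        "helm.sh/hook-delete-policy": hook-succeeded
    spec:
      backoffLimit: 3
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: import
              image: curlimages/curl:8.8.0   # any image with curl and a shell will do
              command: ["/bin/sh", "/objects/import.sh"]
              env:
                - name: KIBANA_URL
                  value: "kibana.logging.svc.cluster.local"   # assumption: your Kibana Service DNS
                - name: KIBANA_PORT
                  value: "5601"
              volumeMounts:
                - name: objects
                  mountPath: /objects
          volumes:
            - name: objects
              configMap:
                name: kibana-objects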

Please note:

  • Obviously, you need a ready Kubernetes cluster.
  • You can add the ConfigMap and the Helm Job to your EFK stack Helm chart, or you can apply them directly to your cluster; the choice depends on your needs.

This Job will run a single pod that terminates right after it finishes its task: importing the Kibana data view & dashboards.

Summing up

To sum up, it would be much better to have a native option to import data views and dashboards using Kubernetes ConfigMaps, but as we can’t do so today, I find the solution described here very useful for adding import functionality to your Helm chart, allowing your developers to control their built-in dashboards & data views from the ConfigMap.
On one hand, you give them the control; on the other hand, you do the import process for them, using a Helm Job.
You can also decide how you want to run this Helm Job. In my case, I decided to add the Job to my ECK Helm chart, so when I deployed the EFK stack I already had some built-in dashboards & data views in place; but you can obviously decide how you want to use it.

Gilad Neiger, Develeap
DevOps Group Leader, DevOps professional & Japanese-language student