Building Logging System in Microservice Architecture with ELK Stack and Serilog .NET Core [Part 2]

Thanh Le · Published in Geek Culture · Mar 4, 2020 · 14 min read

In Part 1, we walked through the importance of logging, especially in MSA, and how to implement a meaningful logging system.

In this part, I will show you how to build a logging system in Microservices Architecture with Serilog .NET Core and ELK Stack.

Let’s do it!


But wait! Serilog and ELK Stack, what are they?

You can skip the section below if you already know what they are.

Serilog

Serilog is a logging framework for .NET that launched in 2013. As one of the newer logging frameworks, it takes advantage of more recent .NET features, most notably structured logging. Besides that, the concept of enrichment also makes Serilog stand out compared to many other logging frameworks (for example, log4net).
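
For a quick feel of what that means, here is a minimal sketch (it assumes the Serilog and Serilog.Sinks.Console packages; the property names and values are only illustrative):

    using Serilog;

    Log.Logger = new LoggerConfiguration()
        .Enrich.WithProperty("Application", "BuyService") // enrichment: stamp every event with a fixed property
        .WriteTo.Console()                                // sink: where the events are written
        .CreateLogger();

    // Structured logging: OrderId and UserId are captured as named properties,
    // not just flattened into a text message.
    Log.Information("Order {OrderId} placed by {UserId}", 1234, "thanh.le");

    Log.CloseAndFlush();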

With Serilog, it's very easy to send our logs to different places through simple configuration. Serilog uses what are called sinks to send our logs to a text file, a database, an Azure table, Amazon S3 or many other places. You can see the list of provided sinks here:

ELK Stack

“ELK” is the acronym for three open-source projects: Elasticsearch, Logstash and Kibana.

Elasticsearch

Elasticsearch is an open-source, distributed, RESTful search and analytics engine. Don't think that Elasticsearch only fits large data sets: it allows you to start small and grow together with your business. Elasticsearch is built to scale horizontally.

Elasticsearch is a highly available and distributed search engine. Each index is broken down into shards, and each shard can have one or more replicas. In versions before 7.0, an index was created with 5 shards and 1 replica per shard (5/1) by default; since 7.0 the default is 1 shard with 1 replica. There are many topologies that can be used:

  • 1/10: 1 shard and 10 replicas — it helps to improve search performance.
  • 20/1: 20 shards and 1 replica per shard — it helps to improve indexing performance.

Elasticsearch is API driven. Most actions can be performed using a simple RESTful API using JSON over HTTP.
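
For example, indexing a document and reading it back is just JSON over HTTP (the index name "demo-logs" is made up for illustration):

    PUT /demo-logs/_doc/1
    Content-Type: application/json

    {
      "service": "Buy",
      "level": "Information",
      "message": "Order 1234 placed"
    }

    GET /demo-logs/_doc/1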

You can visit the Elasticsearch GitHub repository to see the full feature list and source code.

Logstash

Logstash is responsible for aggregating data from different sources, processing it, and sending it down the pipeline, usually to be directly indexed in Elasticsearch.

Logstash can pull from almost any data source using input plugins, apply a wide variety of data transformations and enhancements using filter plugins, and ship the data to a large number of destinations using output plugins.
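
Conceptually, every Logstash pipeline is just those three blocks. A toy example (not the one we will use later) reads events from the console, adds a field, and prints the result:

    input  { stdin { } }                                     # input plugin: read events from the console
    filter { mutate { add_field => { "app" => "demo" } } }   # filter plugin: transform/enrich each event
    output { stdout { codec => rubydebug } }                 # output plugin: print the processed event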

You can discover all the plugins for Logstash here:

Kibana

Kibana is a powerful visualisation tool integrated with Elasticsearch; it uses the data stored in your Elasticsearch clusters to create meaningful graphs and charts. Kibana's core features are data querying and analysis.

Furthermore, since Elasticsearch is API driven, you can use Kibana's built-in "Console" tool to work with Elasticsearch directly.

Applying Serilog and ELK in MSA

In this demo, we'll use the Serilog library to write logs in the system. All API requests will be logged at the predefined log level from the config files. ELK is used to manage the centralised logging data.

Step 1: Create Buy Service, Inventory Service and Shipping Service

Let's come back to the case from Part 1. I have already created a solution with 3 services (Buy, Inventory and Shipping).

End-users make a call to the "Buy" service, and the "Buy" service then calls the "Inventory" and "Shipping" services before returning the result to the end-users.

Buy Controller

Shipping Controller

Inventory Controller

Note: I will skip the service-creation steps. You can get the full source code here:

Step 2: Implement Serilog for all services

Install Serilog Packages for all services using NuGet

  • Serilog.AspNetCore: the main driver.
  • Serilog.Settings.Configuration: the provider that reads the Serilog settings from the app configuration (we will use it later).
  • Serilog.Sinks.Async: an async wrapper for other sinks, especially the file sink, that reduces the overhead of logging calls by delegating work to a background thread.
  • Serilog.Sinks.Http: a sink that sends logs via HTTP. We will use it later when we implement ELK; the idea is that we will send logs to Elasticsearch via HTTP.
  • Serilog.Sinks.File: a sink that writes logs to a file. We will use it first to test that Serilog is working.
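
If you prefer the command line to the NuGet Package Manager UI, the same packages can be added with the dotnet CLI:

    dotnet add package Serilog.AspNetCore
    dotnet add package Serilog.Settings.Configuration
    dotnet add package Serilog.Sinks.Async
    dotnet add package Serilog.Sinks.Http
    dotnet add package Serilog.Sinks.File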

Register Serilog in your project

Update Program.cs as below to register Serilog in the project.

As we're going to use the File sink, we need to define the path (file location) where the log data will be stored, for example "F:\\Thanh_let.txt".
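
A minimal sketch of such a Program.cs for a .NET Core 3.x service is shown below (the file path is just an example, and the real code in the repository may differ):

    using Microsoft.AspNetCore.Hosting;
    using Microsoft.Extensions.Hosting;
    using Serilog;

    public class Program
    {
        public static void Main(string[] args)
        {
            CreateHostBuilder(args).Build().Run();
        }

        public static IHostBuilder CreateHostBuilder(string[] args) =>
            Host.CreateDefaultBuilder(args)
                // Plug Serilog in as the logging provider.
                .UseSerilog((context, loggerConfiguration) => loggerConfiguration
                    .ReadFrom.Configuration(context.Configuration)    // Serilog.Settings.Configuration
                    .Enrich.FromLogContext()
                    // File sink wrapped in the async sink for local testing; use your own path.
                    .WriteTo.Async(sink => sink.File("F:\\Thanh_let.txt")))
                .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>());
    }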

Implement Serilog to write log

Note: You can do the same with other services

Register and use Logger library using Dependency Injection (DI)

Call log method in your code logic
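
A sketch of what those two steps look like together in the Buy controller (the class name and endpoint are illustrative; the real controllers are in the linked source code):

    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Extensions.Logging;

    [ApiController]
    [Route("api/[controller]")]
    public class BuyController : ControllerBase
    {
        private readonly ILogger<BuyController> _logger;

        // The logger is injected by the built-in DI container;
        // Serilog sits underneath it thanks to UseSerilog in Program.cs.
        public BuyController(ILogger<BuyController> logger)
        {
            _logger = logger;
        }

        [HttpGet]
        public IActionResult Buy(string productId)
        {
            _logger.LogInformation("Buy request received for product {ProductId}", productId);

            // ... call the Inventory and Shipping services here ...

            _logger.LogInformation("Buy request for product {ProductId} completed", productId);
            return Ok();
        }
    }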

Let’s make a call from Postman and check the result:

Step 3: Install ELK stack

Note: In this article, I will show you how to install and configure the ELK stack on Windows from scratch. You can also choose another way to install and configure ELK, such as using Docker.

Install Open JDK

You can see all versions of Open JDK here:

Choose and download version 11.0.2 (build 11.0.2+9) for Windows 64-bit.

Extract the zip file (openjdk-11.0.2_windows-x64_bin.zip) to your preferred location, for example "C:\Program Files\Java\jdk-11.0.2".

Create a new system variable named "JAVA_HOME" and set its value to that path ("C:\Program Files\Java\jdk-11.0.2").

Click “Environment Variables…” in “Advanced” tab

Click “New..”

Set "Variable name" to "JAVA_HOME" and "Variable value" to the path of your jdk-11.0.2 folder.

Edit “Path” value in “User variables for admin”

Click “New” to create new Path

Restart the computer to apply the changes.

After the restart, open CMD and run "java -version" to verify the OpenJDK installation.

Download and Install ELK stack

  1. Elasticsearch

You can access here to download Elasticsearch

There are a lot of options you can choose from (Linux, Docker…).

The latest version of ELK at the time of writing is 7.6.0; if you want to download older versions, you can find them here:

Note: In this article, I will use version 7.2.0, which is also the version I'm using in my current project.

  2. Kibana

Download link for Kibana 7.2.0:

Make sure that you choose the Windows version.

  3. Logstash

Download link for Logstash 7.2.0:

Make sure that you choose the ZIP version.

Once you finish downloading, you will have 3 folders as below:

Create a folder to store the ELK stack, e.g. "C:\ELK".

Extract the downloaded files above into "C:\ELK".

Rename the folders to shorter names: "kibana", "logstash" and "elasticsearch", as we will use these names later when installing and configuring ELK.

It may take around 15 minutes to extract the downloaded files.

Once extraction is complete, let's install and configure ELK.

Configure and Run Elasticsearch

Open “C:\ELK\elasticsearch\config\elasticsearch.yml”

This file contains settings for your elasticsearch server.

NOTE: Elasticsearch comes with reasonable defaults for most settings. Before you set out to tweak and tune the configuration, make sure you understand what you are trying to accomplish and the consequences. In this article, I'm not going to explain every parameter in the config file; instead, I will use the settings that I'm using in my project.

  • network.host: Set the bind address to a specific IP (make sure you change this value to your own IP).
  • http.port: Set a custom port for HTTP.
  • node.data: Whether this node holds data. Data is created automatically once you insert documents into Elasticsearch even without this setting, but you can't configure a username/password without a data node, which is why we set its value to true.
  • discovery.seed_hosts: Pass an initial list of hosts to perform discovery when this node starts.

Two properties are not present in the default Elasticsearch config file: xpack.security.enabled and xpack.security.transport.ssl.enabled. By default, you don't need a username/password to access Elasticsearch, but if you want your logging system to be more secure, you should add them and set their values to "true".
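
Putting those settings together, the relevant part of elasticsearch.yml might look like this sketch (the IP address and port are example values; use your own):

    network.host: 192.168.2.15
    http.port: 29200
    node.data: true
    discovery.seed_hosts: ["192.168.2.15"]

    # Not present by default; add them to enable basic security (username/password).
    xpack.security.enabled: true
    xpack.security.transport.ssl.enabled: true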

Open PowerShell in Administrator mode and go to the Elasticsearch folder.

Then run the following command to start Elasticsearch on your machine: ".\bin\elasticsearch.bat" (don't close the terminal if you want to keep Elasticsearch running).

Let’s make a sample call to our Elasticsearch server.

OK, it's working! Now we need to set up a username/password for the Elasticsearch server.

Open another Windows PowerShell as Administrator and go to the Elasticsearch folder. Then run the following command to set up your passwords:

“./bin/elasticsearch-setup-passwords.bat interactive”

You will need to set passwords for the built-in users of the ELK stack; the default superuser is "elastic".

Let's make a sample call to Elasticsearch again with the credentials that you've just set up.

Next, let’s configure and run Kibana

Open "C:\ELK\kibana\config\kibana.yml" with a text editor. You can see descriptions of all the settings there, and you can add or modify properties based on your requirements. Again, make sure you know what you are doing!

Here are my settings:

Enter the elastic username and password with the values that you set up in the previous step.

Note that each *.encryptionKey value must be at least 32 characters long.
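
For reference, a kibana.yml following those notes might look like this sketch (the host, port, password and encryption keys are example values; use your own):

    server.port: 25601
    server.host: "192.168.2.15"
    elasticsearch.hosts: ["http://192.168.2.15:29200"]

    # Credentials created with elasticsearch-setup-passwords in the previous step.
    elasticsearch.username: "elastic"
    elasticsearch.password: "Abcd@123$"

    # Each encryption key must be at least 32 characters long.
    xpack.security.encryptionKey: "a_random_string_of_at_least_32_chars"
    xpack.encryptedSavedObjects.encryptionKey: "another_random_string_of_32_chars_x"
    xpack.reporting.encryptionKey: "yet_another_random_32_char_string_y"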

Open another PowerShell window, go to the "C:\ELK\kibana" folder, then run the following command to start Kibana: ".\bin\kibana.bat". It takes 1–2 minutes for Kibana to start.

Open a browser, go to "http://192.168.2.15:25601/", then enter the username/password (elastic/Abcd@123$).

Note: we will come back to Kibana later, once we have some log data :)

Next, we will set up Logstash, the last component of the ELK stack.

Open “C:\ELK\logstash\config\logstash.yml” to configure Logstash

As with Elasticsearch and Kibana, you can find a full description of every setting in the file.

http.port binds the port for the metrics REST endpoint. This option also accepts a range (29600–29700), in which case Logstash picks the first available port. These ports are used by Logstash while it is running and processing data.
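
The logstash.yml itself only needs a few lines (the node name and host are example values):

    node.name: logstash-demo
    http.host: "192.168.2.15"
    # Port (or port range) for Logstash's monitoring/metrics REST endpoint;
    # Logstash binds to the first free port in the range.
    http.port: 29600-29700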

Create the filter for Logstash

Create a new config file named "testlog.conf" in "C:\ELK\logstash" with the content below.
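
Here is a minimal version of that pipeline (the HTTP input port, index name and Elasticsearch credentials are assumptions; align them with your own setup). It accepts the batches that Serilog's HTTP sink will POST, applies the mutate filters, and forwards everything to Elasticsearch:

    input {
      # Serilog.Sinks.Http will POST log batches to this endpoint.
      http {
        port => 8080
      }
    }

    filter {
      # Duplicate the "events" field into "e" ...
      mutate {
        copy => { "events" => "e" }
      }
      # ... then drop the original "events" field and the HTTP "headers" field.
      mutate {
        remove_field => ["events", "headers"]
      }
    }

    output {
      elasticsearch {
        hosts    => ["http://192.168.2.15:29200"]
        index    => "demo-logs-%{+YYYY.MM.dd}"
        user     => "elastic"
        password => "Abcd@123$"
      }
      # Also print the processed events to the console for debugging.
      stdout {
        codec => rubydebug
      }
    }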

Note: You can add more filter rules. With the settings above, I duplicate the field "events" into "e" and then remove the fields "events" and "headers".

You can find more details about setting up Logstash filters here:

Run Logstash with the config we’ve just created (testlog.conf)

Open a PowerShell window and go to the "C:\ELK\logstash" folder.

Then run the following command: ".\bin\logstash.bat -f testlog.conf".

At this point, we have finished installing and configuring the ELK stack. Now, let's go back to the code base we created in the earlier steps and send the log data to the ELK stack.

All services should be updated to use the HTTP sink to send log data to Logstash, as below:
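
One way to do this is via appsettings.json, which Serilog.Settings.Configuration (installed earlier) can read. The sketch below is only an example: the requestUri must match the HTTP input port in testlog.conf, and depending on your Serilog.Sinks.Http version you may need additional arguments.

    {
      "Serilog": {
        "Using": [ "Serilog.Sinks.Http" ],
        "MinimumLevel": "Information",
        "WriteTo": [
          {
            "Name": "Http",
            "Args": {
              "requestUri": "http://192.168.2.15:8080"
            }
          }
        ]
      }
    }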

Let's make a sample call to the "Buy" service and see the results in Logstash and Elasticsearch.

Check the result on Elasticsearch

Check the result in Logstash

You can also search the Elasticsearch data in Kibana.

Setup Index-pattern on Kibana

Finish

Go to the "Discover" tab; you should see the log data based on your index pattern.

Create your new Discover (saved search) and add the columns you want from the log data stored in Elasticsearch.

Note: We will use this saved search when we configure the Kibana dashboard.

You can also see the log details

Wait… The log details don't include a "CorrelationID". If you don't remember what the CorrelationID is and why we need it, go back to Part 1.

Let's implement the CorrelationID for all services. We will need to enrich the log data. (You can also add other properties, like ClientIP…)

The idea is that we will add a CorrelationID to every HTTP request using an HTTP header.

Update Startup.cs file to add new service

And update Configure function

Create a Logging Extension
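
A sketch of what such an enricher can look like (it assumes IHttpContextAccessor has been registered in Startup.cs via services.AddHttpContextAccessor(), and that "CorrelationID" is the header name the client sends):

    using Microsoft.AspNetCore.Http;
    using Serilog.Core;
    using Serilog.Events;

    public class CustomLogEnricher : ILogEventEnricher
    {
        private readonly IHttpContextAccessor _httpContextAccessor;

        public CustomLogEnricher(IHttpContextAccessor httpContextAccessor)
        {
            _httpContextAccessor = httpContextAccessor;
        }

        public void Enrich(LogEvent logEvent, ILogEventPropertyFactory propertyFactory)
        {
            var context = _httpContextAccessor.HttpContext;
            if (context == null) return;

            // Read the CorrelationID header from the incoming request (if present)
            // and attach it to the log event as a structured property.
            if (context.Request.Headers.TryGetValue("CorrelationID", out var correlationId))
            {
                logEvent.AddOrUpdateProperty(
                    propertyFactory.CreateProperty("CorrelationID", correlationId.ToString()));
            }
        }
    }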

Now, update the Buy service to read the "CorrelationID" header and add it to the Ship and Export requests.
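
A rough sketch of that propagation inside the Buy action (the downstream URLs and endpoint names are illustrative; the real calls are in the sample repository, and in a real service you would prefer IHttpClientFactory over new HttpClient instances):

    [HttpGet]
    public async Task<IActionResult> Buy(string productId)
    {
        // Read the CorrelationID from the incoming request ...
        var correlationId = Request.Headers["CorrelationID"].ToString();

        // ... and pass it on to the downstream Shipping ("Ship") and Inventory ("Export") calls.
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("CorrelationID", correlationId);

        var shipResponse = await client.GetAsync("http://localhost:5002/api/shipping/ship");
        var exportResponse = await client.GetAsync("http://localhost:5003/api/inventory/export");

        return Ok();
    }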

Update Program.cs to use CustomLogEnricher

Once you update the log data structure, you should refresh the index pattern in Kibana.

Let’s make a call to Buy service again, make sure you add CorrelationID in the request header.

Check the log data on Kibana

OK, so far so good!

Let's configure a Kibana dashboard using the Discover (saved search) that we created previously.

Here is the result

Save your dashboard

You can try to create new chart visualizations by following this article from elastic.co

Finally, it's done!

Conclusion

Although this is a long article, it was worth it, wasn't it? Through this article, I wanted to give you some ideas for building a meaningful and robust logging system in a Microservices Architecture using the ELK stack and Serilog in .NET Core.

If you have any feedback/concern, feel free to drop me a message and I will try my best to reply.

Thank you for being here!

Source Code


Thanh Le
A Software Technical Architect who codes for food and writes for fun :)