Monitor ASP.NET Core in ELK through Docker and Azure Event Hubs

A good monitoring solution lets you:

  • track what is going on in your system,
  • code complex alert rules,
  • and see at a glance how your application is performing.

Kibana integrates stunning visuals with a powerful and flexible query language. In this article we are going to build a monitoring pipeline out of three building blocks:

  • Serilog in ASP.NET Core as our client-side logger
  • Azure Event Hubs as the delivery infrastructure for log entries
  • A full ELK stack (ElasticSearch, LogStash and Kibana) running in Docker containers

Designing our logging infrastructure

Strictly speaking, Kibana only needs a working ElasticSearch cluster — even just a single server — to function properly. However, getting data into the search engine is anything but trivial.

Azure Event Hubs are a great fit as the delivery layer, for a few reasons:

  • They are supported by LogStash, through a specific input provider built by the Azure team;
  • If you use Serilog in ASP.NET Core, there’s a sink that integrates with Event Hubs;
  • Since they behave like a queue, there’s no need for direct network connectivity between the servers running ASP.NET Core and the ones hosting ELK.

Logging from ASP.NET Core to Event Hubs

The ASP.NET Core Logging API doesn’t support Azure Event Hubs out of the box. However, there are several providers out there that implement the ILoggerFactory interface and can seamlessly fit into the general runtime architecture. One of them is Serilog. So, the first step is adding the Serilog NuGet package, together with its sink for Event Hubs:

Install-Package Serilog.Extensions.Logging
Install-Package Serilog.Sinks.AzureEventHub
Connection strings for the event hub
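With the packages in place, Serilog can be wired up at startup so that log events are shipped to the hub. The sketch below is indicative rather than authoritative: the exact WriteTo.AzureEventHub overloads vary between sink versions, and the connection string values are placeholders you’d replace with a policy that has Send rights on your own hub.

```csharp
using Microsoft.Azure.EventHubs;
using Serilog;

public static class LoggingSetup
{
    public static ILogger CreateEventHubLogger()
    {
        // Placeholder connection string: copy the real one from the
        // "Shared access policies" blade of your Event Hub in the Azure portal.
        var connectionString =
            "Endpoint=sb://<namespace>.servicebus.windows.net/;" +
            "SharedAccessKeyName=<policy>;SharedAccessKey=<key>;" +
            "EntityPath=<hub-name>";

        var client = EventHubClient.CreateFromConnectionString(connectionString);

        // The sink serializes each log event and sends it to the hub,
        // where LogStash will pick it up.
        return new LoggerConfiguration()
            .Enrich.FromLogContext()
            .WriteTo.AzureEventHub(client)
            .CreateLogger();
    }
}
```

The resulting logger can then be plugged into the ASP.NET Core logging pipeline via Serilog.Extensions.Logging.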

Use LogStash with Docker and Azure Event Hubs

If you have a look at the documentation on the Elastic website, there are several ways to install and execute the ELK stack locally. Personally, I’m a total fan of the Docker option, the main reason being that ELK is quite heavyweight, and I love having the chance to spin it up and tear it down with just a couple of commands on the Docker CLI.

FROM docker.elastic.co/logstash/logstash:6.2.3
WORKDIR /plugin-install
COPY ./plugins .
RUN logstash-plugin install logstash-input-azureeventhub
  1. we use the official LogStash image as a base
  2. we create a /plugin-install folder in which we copy the source code of the plugin downloaded from GitHub
  3. we run the logstash-plugin command to install it in LogStash
docker build -t logstash-eh .

Configure the ELK cluster in Docker

As you’ve probably figured out by now, an ELK cluster is basically made of three different containers running together and interacting with each other:

  • LogStash will be our log collector: it will download data from Azure Event Hubs and push it to ElasticSearch
  • ElasticSearch indexes the log data and ensures we have the maximum flexibility and speed when creating our dashboards or querying it
  • Kibana is a gorgeous front end for all of the above.
elastic:
  image: 'docker.elastic.co/elasticsearch/elasticsearch:6.2.3'
  volumes:
    - esdata:/usr/share/elasticsearch/data
  ports:
    - 9200:9200
    - 9300:9300
kibana:
  image: 'docker.elastic.co/kibana/kibana:6.2.3'
  environment:
    ELASTICSEARCH_URL: http://elastic:9200
  ports:
    - 5601:5601
  depends_on:
    - elastic
logstash:
  image: logstash-eh
  environment:
    XPACK_MONITORING_ELASTICSEARCH_URL: http://elastic:9200
  volumes:
    - c:/Users/marco/Desktop/elk-test/logstash-cfg:/usr/share/logstash/pipeline/
  depends_on:
    - elastic
input {
  azureeventhub {
    key => "giS1etkQ..."
    username => "defaultPolicy"
    namespace => "destesthub"
    eventhub => "samplehub"
    partitions => 2
  }
}

output {
  stdout { }
  elasticsearch {
    hosts => ["http://elastic:9200"]
  }
}
The output section declares two destinations:

  • stdout, which logs to the console, useful if we want to make sure that LogStash is effectively receiving messages;
  • elasticsearch, which ships the data to the indexer.
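Once entries start flowing, a quick way to confirm that ElasticSearch is actually indexing them is to query its REST API on port 9200. The helper below only builds the query body; the index pattern and the @timestamp field are assumptions based on LogStash defaults (daily indices named logstash-YYYY.MM.dd).

```python
import json

def build_recent_logs_query(minutes=15, size=20):
    """Build an ElasticSearch query body for log events ingested
    in the last `minutes` minutes, newest first."""
    return {
        "size": size,
        "sort": [{"@timestamp": {"order": "desc"}}],
        "query": {
            "range": {"@timestamp": {"gte": f"now-{minutes}m"}}
        },
    }

# POST this body to http://localhost:9200/logstash-*/_search
body = json.dumps(build_recent_logs_query())
```

If the search returns hits, the whole pipeline — Serilog, Event Hubs, LogStash, ElasticSearch — is working end to end.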

Brilliant, time to see it running!

The full docker-compose.yml file is similar to the one shown earlier in this article.
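With the Dockerfile and docker-compose.yml in place, spinning the stack up and down really is a couple of commands. This assumes Docker and Docker Compose are installed and that you run them from the folder containing both files:

```shell
# Build the custom LogStash image with the Event Hubs input plugin
docker build -t logstash-eh .

# Start ElasticSearch, Kibana and LogStash in the background
docker-compose up -d

# Follow LogStash output to verify messages are arriving (the stdout output)
docker-compose logs -f logstash

# Tear everything down when done
docker-compose down
```

Once the containers are up, Kibana is reachable at http://localhost:5601.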

A very simple dashboard in Kibana, but still stunning!

Conclusions and going to production

During this article we’ve presented a solution to monitor an ASP.NET Core application using the ElasticSearch-LogStash-Kibana stack. The ultimate result is undoubtedly interesting, although there are still a few limitations and steps to take in order to be production ready:

  • the data reaches ELK through Azure Event Hub. This is definitely a scalable solution, although the provider we’ve used has some major limitations. The biggest one is that it doesn’t use the EventProcessorHost, which means that there’s no support for lease management and offset storage. This is a major issue, which prevents you from having many concurrent LogStash instances and might cause messages to be lost should the container restart. I’m probably going to spend some time and write a new provider soon. :)
  • ElasticSearch is memory hungry. It has some tough system requirements, and it’s probably advisable to deploy the ELK system in a stand-alone cluster, perhaps shared by multiple applications. I also recommend carefully reading the documentation before going to production.
  • Data will keep building up over time. We need a system to archive old data on a regular basis. Curator is probably the most widely used tool to achieve this, although it’s out of scope for this article.
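As a rough illustration of what such housekeeping involves (Curator does this properly, with many more safeguards), the sketch below picks out the daily LogStash indices that fall outside a retention window. The index name pattern follows LogStash's default logstash-YYYY.MM.dd convention; the function only selects names, leaving the actual DELETE /&lt;index&gt; call against the ElasticSearch API to the caller.

```python
from datetime import date, datetime, timedelta

def indices_to_delete(index_names, retention_days=30, today=None):
    """Return the daily LogStash indices older than the retention window."""
    today = today or date.today()
    cutoff = today - timedelta(days=retention_days)
    stale = []
    for name in index_names:
        if not name.startswith("logstash-"):
            continue
        try:
            # LogStash's default daily index pattern: logstash-YYYY.MM.dd
            day = datetime.strptime(name, "logstash-%Y.%m.%d").date()
        except ValueError:
            continue  # skip indices that don't match the pattern
        if day < cutoff:
            stale.append(name)
    return stale
```

In production you would run something like this (or, better, Curator itself) on a schedule, so that old indices are archived or dropped before they fill up the disk.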
