Logging gRPC on .NET Using Serilog
A tutorial for logging gRPC on .NET using Serilog and ElasticSearch
In this tutorial, I will show how to log gRPC events on both the client and the server using Serilog, which provides structured event data, enrichment capabilities, and exporters. I will add custom metadata in a gRPC client and pass it to the server within the request/response lifecycle. We will follow these steps:
- Run ElasticSearch and Kibana on Docker
- Create gRPC .NET Client
- Create gRPC ASP.NET Server
- Configure index pattern on Kibana
Prerequisites
To follow along, make sure you have the following installed:
- Docker
- Visual Studio Code
- .NET 6 SDK
Step 1: Run ElasticSearch and Kibana on Docker
- In the VSCode terminal, create the project directory with `mkdir elastic-grpc` and navigate to it with `cd elastic-grpc`. Before we start coding, we need to run the ElasticSearch and Kibana containers. The easiest way is to create a Docker Compose file. Create a new file named `docker-compose.yml`.
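A minimal `docker-compose.yml` could look like the sketch below. The image versions and single-node settings are assumptions for local development; adjust them to your environment.

```yaml
version: "3.8"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    environment:
      - discovery.type=single-node   # single-node dev cluster, no bootstrap checks
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.0
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```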
- Run the containers with `docker-compose up -d`. The first time you run the docker-compose command, it downloads the images for ElasticSearch and Kibana from the Docker registry, so it might take a few minutes depending on your connection speed. To check the containers' status, execute `docker-compose ps`.
- Then, run `curl http://localhost:9200` in the terminal and verify that ElasticSearch is up and running. Navigate to http://localhost:5601 and verify that Kibana is up and running.
Step 2: Create gRPC ASP.NET Core Server
- Generate the gRPC server from the template using `dotnet new grpc -o grpc-server`. Since I am building on macOS, there is a known issue with the Kestrel host making HTTP/2 calls over SSL, so I will configure the WebApplicationBuilder in Program.cs to set up an HTTP/2 endpoint without SSL.
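A minimal sketch of the Kestrel workaround might look like this; the port number 5000 is an assumption, pick any free local port.

```csharp
using Microsoft.AspNetCore.Server.Kestrel.Core;

var builder = WebApplication.CreateBuilder(args);

// macOS lacks ALPN support for Kestrel, so serve gRPC
// over HTTP/2 without TLS on a fixed local port instead.
builder.WebHost.ConfigureKestrel(options =>
{
    options.ListenLocalhost(5000, listenOptions =>
        listenOptions.Protocols = HttpProtocols.Http2);
});
```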
- Add the Serilog NuGet packages to the project using the commands below and add the necessary namespaces in Program.cs. Serilog provides a structured data logger and sends bulk payloads over HTTP to ElasticSearch.
dotnet add package Serilog.AspNetCore
dotnet add package Serilog.Enrichers.Environment
dotnet add package Serilog.Sinks.Debug
dotnet add package Serilog.Sinks.Elasticsearch
Before we proceed further, the terms that Serilog uses must be clarified.
- Sink: A writer that sends logs to its corresponding backend, e.g. a database. A list of available sinks can be found on Serilog's GitHub page.
- LogContext: A property bag that can be used to dynamically add and remove properties at runtime.
- Enricher: A log context extender that collects specific fields from the execution context and pushes them to the log context. A list of available enrichers can be found on Serilog's GitHub page.
The logger has a pre-built enrichment feature that collects properties from the log context. This feature can be enabled using the `.FromLogContext()` method of the `Enrich` configuration object.
- Configure the Kestrel WebHost's logger using the `ConfigureLogging()` method and set Serilog's logger. The enricher needs access to the HTTP context to obtain the identifier. Note that the `WithCorrelationIdHeader()` enricher takes a custom key as input and searches the request header for this key; if the header is absent, it generates a new identifier.
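A sketch of the logger configuration is below. Note that `WithCorrelationIdHeader()` ships in the separate Serilog.Enrichers.CorrelationId package; the index format and ElasticSearch URL are assumptions matching the local setup from Step 1.

```csharp
using Serilog;
using Serilog.Sinks.Elasticsearch;

Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .Enrich.WithMachineName()                         // from Serilog.Enrichers.Environment
    .Enrich.WithCorrelationIdHeader("Correlation-Id") // reads the header, or generates an id
    .WriteTo.Debug()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        IndexFormat = "grpc-server-{0:yyyy.MM.dd}", // matches the grpc-server* index pattern
        AutoRegisterTemplate = true
    })
    .CreateLogger();

builder.Host.UseSerilog();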
You can add any custom key to the header metadata; I will use `Correlation-Id`. Since the identifier is generated on the client side, the logger reads the value from the request header and writes it to the log context. We can also use `appsettings.json` to populate the `LoggerConfiguration` object, which is neater than configuring it through methods.
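As an example, the equivalent `appsettings.json` section might look like this (read via `ReadFrom.Configuration()`, which Serilog.AspNetCore supports out of the box); the argument names follow the sink's configuration conventions:

```json
{
  "Serilog": {
    "Using": [ "Serilog.Sinks.Debug", "Serilog.Sinks.Elasticsearch" ],
    "MinimumLevel": "Information",
    "Enrich": [ "FromLogContext", "WithMachineName" ],
    "WriteTo": [
      { "Name": "Debug" },
      {
        "Name": "Elasticsearch",
        "Args": {
          "nodeUris": "http://localhost:9200",
          "indexFormat": "grpc-server-{0:yyyy.MM.dd}",
          "autoRegisterTemplate": true
        }
      }
    ]
  }
}
```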
- Add the HTTP context accessor to the service collection using the `ConfigureServices()` method. The correlation-id enricher needs it to read the incoming request header.
- The overall server looks like below.
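Putting the pieces together, a minimal Program.cs sketch could be as follows. The port, index name, and the `grpc_server.Services` namespace are assumptions based on the template defaults.

```csharp
using Microsoft.AspNetCore.Server.Kestrel.Core;
using Serilog;
using Serilog.Sinks.Elasticsearch;
using grpc_server.Services;

var builder = WebApplication.CreateBuilder(args);

// HTTP/2 without TLS (macOS ALPN workaround); the port is an example.
builder.WebHost.ConfigureKestrel(options =>
{
    options.ListenLocalhost(5000, o => o.Protocols = HttpProtocols.Http2);
});

Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .Enrich.WithCorrelationIdHeader("Correlation-Id")
    .WriteTo.Debug()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        IndexFormat = "grpc-server-{0:yyyy.MM.dd}"
    })
    .CreateLogger();

builder.Host.UseSerilog();

builder.Services.AddGrpc();
builder.Services.AddHttpContextAccessor(); // required by the correlation-id enricher

var app = builder.Build();
app.MapGrpcService<GreeterService>();
app.Run();
```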
- Build and run the gRPC server using `dotnet run`. Check the console log for the listening port. We will set this address in the client later.
Step 3: Create gRPC .NET Core Client
- Create a new console app from the template using `dotnet new console -o grpc-client` and add the required NuGet packages with the commands below.
dotnet add package Grpc.Net.Client
dotnet add package Google.Protobuf
dotnet add package Grpc.Tools
- Create a folder named `Protos` and copy the `greet.proto` file from grpc-server. Change the namespace inside the proto file to the project's namespace:
option csharp_namespace = "grpc_client";
- Add an ItemGroup including the proto file in the grpc-client.csproj file:
<ItemGroup>
  <Protobuf Include="Protos\greet.proto" GrpcServices="Client" />
</ItemGroup>
- To make an insecure call, I will add the switch to AppContext. According to the official documentation, this switch is only required for .NET Core 3.x; it does nothing in .NET 5 and later and isn't required there.
- Set the endpoint address that we obtained from the server and create the channel.
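A sketch of these two steps, assuming the server listens on port 5000:

```csharp
using Grpc.Net.Client;
using grpc_client;

// Required only on .NET Core 3.x; harmless no-op on .NET 5 and later.
AppContext.SetSwitch("System.Net.Http.SocketsHttpHandler.Http2UnencryptedSupport", true);

// Use the port your server printed at startup.
using var channel = GrpcChannel.ForAddress("http://localhost:5000");
var client = new Greeter.GreeterClient(channel);
```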
- Add Serilog NuGet packages to client project using command below.
dotnet add package Serilog.AspNetCore
dotnet add package Serilog.Enrichers.Environment
dotnet add package Serilog.Sinks.Debug
dotnet add package Serilog.Sinks.Elasticsearch
- To get logs from the gRPC .NET client, we need to set the `GrpcChannelOptions.LoggerFactory` property to an instance of `ILoggerFactory` by initializing a new `SerilogLoggerFactory`. This class is the extension point the third-party logging framework provides to integrate with the ASP.NET Core logging abstractions. You can also use DI to resolve the logger factory.
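For example, wiring the factory into the channel might look like this sketch (the client-side index name mirrors the server's):

```csharp
using Serilog;
using Serilog.Extensions.Logging; // SerilogLoggerFactory lives here
using Serilog.Sinks.Elasticsearch;

Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Debug()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        IndexFormat = "grpc-client-{0:yyyy.MM.dd}" // matches the grpc-client* index pattern
    })
    .CreateLogger();

using var channel = GrpcChannel.ForAddress("http://localhost:5000", new GrpcChannelOptions
{
    LoggerFactory = new SerilogLoggerFactory(Log.Logger)
});
```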
- Add a new `Metadata.Entry` to the client to set the `Correlation-Id` in the request header. I will generate a new GUID just for the demo. If you use this in middleware, you can pass the incoming value through instead.
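Attaching the header to a call could be sketched as follows; the request payload is just an example:

```csharp
using Grpc.Core;

var headers = new Metadata
{
    { "Correlation-Id", Guid.NewGuid().ToString() } // demo value; pass the incoming id in middleware
};

var reply = await client.SayHelloAsync(new HelloRequest { Name = "gRPC" }, headers);
Console.WriteLine(reply.Message);
```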
- The overall client looks like below.
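Assembled from the previous steps, a complete Program.cs sketch for the client might be (the port and message are assumptions):

```csharp
using Grpc.Core;
using Grpc.Net.Client;
using grpc_client;
using Serilog;
using Serilog.Extensions.Logging;
using Serilog.Sinks.Elasticsearch;

// Only needed on .NET Core 3.x for insecure HTTP/2 calls.
AppContext.SetSwitch("System.Net.Http.SocketsHttpHandler.Http2UnencryptedSupport", true);

Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Debug()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        IndexFormat = "grpc-client-{0:yyyy.MM.dd}"
    })
    .CreateLogger();

using var channel = GrpcChannel.ForAddress("http://localhost:5000", new GrpcChannelOptions
{
    LoggerFactory = new SerilogLoggerFactory(Log.Logger)
});

var client = new Greeter.GreeterClient(channel);
var headers = new Metadata { { "Correlation-Id", Guid.NewGuid().ToString() } };

var reply = await client.SayHelloAsync(new HelloRequest { Name = "gRPC" }, headers);
Console.WriteLine(reply.Message);

Log.CloseAndFlush(); // flush buffered events to ElasticSearch before exit
```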
Step 4: Configure index pattern on Kibana
- Now we can configure Kibana and see the results. First, we need to add an index pattern. Open Kibana and navigate to the Create Index Pattern section. Add an index pattern named `grpc-server*` for server events and `grpc-client*` for client events. After creating the index patterns, navigate to the Discover section under Analytics and you will see the list of logs.
- You can see the metadata fields and add columns. Add a CorrelationId column. Logs that contain a correlation id are related to a request/response; the others are server events.
Conclusion
This tutorial should help you understand the basic concept of correlating activities across multiple services. Correlation is one of the distributed tracing techniques that helps with tracing transactions and analyzing root causes. There are also powerful dedicated distributed tracing tools, e.g. Zipkin, OpenTelemetry, and Jaeger.
You can access the source code on my GitHub.
References: