
Logging is a very important tool for a developer, but when building distributed systems it becomes a stone that must be laid right into the foundation of your application; otherwise the complexity of developing microservices will make itself felt very quickly.

.NET Core 3 added the excellent ability to propagate the correlation context in HTTP headers, so if your applications use direct HTTP calls for interservice communication, you can take advantage of this out-of-the-box functionality. However, if the architecture of your backend involves interaction through a message broker (RabbitMQ, Kafka, etc.), then you still need to take care of passing the correlation context through those messages yourself.

In this article we will take a simple web-api application and organize the logging, which will:

  • preserve end-to-end correlation between the logs of independent services, so that you can easily see all the activities triggered by a specific client request
  • provide a single entry point with convenient analysis, so that even support engineers can use the logging tool when questions like “I got an error with such-and-such request ID in my application” come in

First, we need to decide on the logging provider for our application. The main requirement for modern logging is that it be structured, i.e. we should work not with flat text messages but with objects. Thanks to such logs, we can easily slice our messages along different dimensions and run analytics on them.

For our application, we will use the Serilog package, which has excellent support for structured logging and a rich ecosystem of add-ons. I will omit the basic setup steps (you can find a large number of articles on this topic) and make the assumption that:

Serilog is already configured as the default logger for your dependency injection container, and its configuration includes enriching messages with context properties (Enrich.FromLogContext).
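
A minimal sketch of such a baseline setup, assuming the Serilog.AspNetCore package and its UseSerilog host-builder extension (formatter and sinks are illustrative):

```csharp
using Microsoft.Extensions.Hosting;
using Serilog;

public class Program
{
    public static void Main(string[] args) => CreateHostBuilder(args).Build().Run();

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            // UseSerilog (from Serilog.AspNetCore) makes Serilog the provider
            // behind Microsoft.Extensions.Logging.ILogger<T>
            .UseSerilog((context, configuration) => configuration
                .Enrich.FromLogContext()   // pick up properties pushed into the log context
                .WriteTo.Console());
}
```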

The next step is to choose which centralized log collection system Serilog should send messages to. Perhaps the most common open-source option today is the ELK stack (Elasticsearch, Logstash, and Kibana). For this, we will take advantage of the offer from Logz.IO – after registering on the free tier, we get all the power of the Lucene search engine at our disposal.

We only need to add the Serilog.Sinks.Logzio package to our project

And add the corresponding sink to our logger configuration, supplying it with an access token.
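
A sketch of that configuration (the token is a placeholder, and the exact parameters of the LogzIo extension vary by sink version):

```csharp
using Serilog;

// The token comes from your Logz.io account settings; "serilog" here is the
// log type name under which messages are indexed (an assumption on our part).
Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Console()
    .WriteTo.LogzIo("<your-logzio-token>", "serilog")
    .CreateLogger();
```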

By launching the application, we will be able to observe our messages not only in the console, but also in Kibana.


In a service-style application, two main interfaces of interaction with the outside world can be distinguished; let us call them vertical and horizontal. The vertical interface is the web API through which calls from the client application arrive. The horizontal one is the message broker used to exchange data with other internal services.

Consider the stages of implementing correlation on each of these interfaces.

Correlation in HTTP requests

In order to capture as much information as possible, we need to generate the correlation identifier as close as possible to the start of the activity, i.e. on the gateway or directly on the client (mobile or web). Since we are dealing with a backend application today, we will simply require a mandatory “X-Correlation-ID” header on all web API requests.

Add the CorrelationId package, whose job is to pick up the value from the header we need

Add it to the request processing pipeline
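
The registration might look like this (a sketch against the 2.x-era CorrelationId package; option names differ between package versions):

```csharp
// Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddCorrelationId();   // registers ICorrelationContextAccessor
    services.AddControllers();
}

public void Configure(IApplicationBuilder app)
{
    // Must come early so that everything downstream sees the correlation context.
    app.UseCorrelationId(new CorrelationIdOptions
    {
        Header = "X-Correlation-ID",   // the header we require from clients
        IncludeInResponse = true       // echo the id back to the caller
    });

    app.UseRouting();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
}
```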

Now, with its help, we will make a simple action filter:

And add it to the controller

As a result, the controller will return 400 Bad Request to all requests that lack the header with the corresponding identifier.
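
A sketch of such a filter and its use on a controller (the attribute and controller names are ours; assumes ASP.NET Core MVC):

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;

// Rejects requests that arrive without the mandatory correlation header.
public class RequireCorrelationIdAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext context)
    {
        if (!context.HttpContext.Request.Headers.ContainsKey("X-Correlation-ID"))
            context.Result = new BadRequestObjectResult("The X-Correlation-ID header is required.");
    }
}

// Usage on a controller:
[ApiController]
[Route("[controller]")]
[RequireCorrelationId]
public class SomethingController : ControllerBase
{
    [HttpPost]
    public IActionResult Post() => Ok();
}
```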

Now that we receive the identifier from the client, we must add it to the logging context; we will write a middleware for this:

In our application we use the standard ILogger from the Microsoft.Extensions.Logging.Abstractions package, so we will add the value using a simple extension method on it.

Add the middleware to the request processing pipeline and get the desired result.
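
Put together, the extension method and the middleware might look like this (a sketch; all names are ours, and the ICorrelationContextAccessor namespace varies by CorrelationId package version):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using CorrelationId;                 // ICorrelationContextAccessor (v2-era namespace)
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

// BeginScope with a dictionary is how scope properties reach Serilog
// when Enrich.FromLogContext is enabled.
public static class LoggerExtensions
{
    public static IDisposable BeginPropertyScope(this ILogger logger, string name, object value) =>
        logger.BeginScope(new Dictionary<string, object> { [name] = value });
}

// Copies the correlation id into the logging context for the lifetime of the request.
public class CorrelationLoggingMiddleware
{
    private readonly RequestDelegate _next;

    public CorrelationLoggingMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context,
        ICorrelationContextAccessor accessor,
        ILogger<CorrelationLoggingMiddleware> logger)
    {
        using (logger.BeginPropertyScope("CorrelationId", accessor.CorrelationContext.CorrelationId))
        {
            await _next(context);
        }
    }
}

// In Startup.Configure, after UseCorrelationId():
// app.UseMiddleware<CorrelationLoggingMiddleware>();
```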

Now all the activities that are generated by requests to our web-api contain a correlation identifier by which they can be easily linked.

Correlation in broker messages

Next, we need to arrange the transmission and reception of the correlation identifier through a message broker. In our example we will use RabbitMQ, and as a client we will take the MassTransit framework. Again, let us skip the initial configuration of MassTransit and proceed directly to setting up logging.

To begin with, we can enable MassTransit's own logs; for this, we add the MassTransit.SerilogIntegration package to our application

Now, after adding a logger to the MassTransit settings, we can see the logs of the framework.
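
A sketch of that wiring, assuming the configurator extension exposed by the package in the MassTransit 5.x era (host and credentials are placeholders):

```csharp
using System;
using MassTransit;

var busControl = Bus.Factory.CreateUsingRabbitMq(cfg =>
{
    // Routes MassTransit's internal logs to the configured Serilog logger
    // (extension from MassTransit.SerilogIntegration; uses Log.Logger by default).
    cfg.UseSerilog();

    cfg.Host(new Uri("rabbitmq://localhost"), h =>
    {
        h.Username("guest");
        h.Password("guest");
    });
});
```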

Let our application send a SomethingDoneMessage event with the value “done” in response to a POST request. The contract for such a message can be described as follows:
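
Such a contract might be sketched as follows (the property name Value is our assumption; MassTransit contracts are conventionally declared as interfaces):

```csharp
public interface SomethingDoneMessage
{
    string Value { get; }
}

// Publishing it in response to the POST request, via IPublishEndpoint:
// await publishEndpoint.Publish<SomethingDoneMessage>(new { Value = "done" });
```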

MassTransit messages are essentially wrapped in an envelope in which your message is embedded. The envelope looks something like this:
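
An abridged, illustrative example of such an envelope (field values are placeholders):

```json
{
  "messageId": "08d7...",
  "conversationId": "08d7...",
  "sourceAddress": "rabbitmq://localhost/source",
  "destinationAddress": "rabbitmq://localhost/SomethingDoneMessage",
  "messageType": [
    "urn:message:Contracts:SomethingDoneMessage"
  ],
  "message": {
    "value": "done"
  }
}
```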

The envelope contains service fields that the framework itself needs to work, but we also have the ability to add our own properties to it. Moreover, MassTransit has built-in support for some optional fields, of which we are most interested in the correlation identifier, CorrelationId.

Add the CorrelatedBy interface to the message contract:

We implement it and assign the value to the CorrelationId property when creating the message:
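
A sketch of the updated contract and of the publish call (parsing assumes the client sends a GUID-formatted X-Correlation-ID):

```csharp
using System;
using MassTransit;

// The contract now exposes the envelope-level correlation id
// via MassTransit's CorrelatedBy<Guid> interface.
public interface SomethingDoneMessage : CorrelatedBy<Guid>
{
    string Value { get; }
}

// When publishing, assign the id received over HTTP:
// await publishEndpoint.Publish<SomethingDoneMessage>(new
// {
//     CorrelationId = Guid.Parse(accessor.CorrelationContext.CorrelationId),
//     Value = "done"
// });
```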

If we look at the updated message, we will see that the correlation identifier has become part not only of our message but also of the envelope – this identifier will now also be used in all MassTransit logs, which makes it much easier to investigate problems at the message broker level.

It remains to configure the logging of these service properties of the message; for this, we add the Serilog.Enrichers.MassTransitMessage package to the project. The package adds a filter to the MassTransit message processing pipeline that puts the message context onto a thread-safe stack. Serilog reads the context from that stack and adds these additional properties to our log objects.

We insert the filter into the MassTransit configuration

And in the Serilog configuration we add an enricher
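
For illustration, the same idea can also be hand-rolled without the package: a MassTransit consume filter that pushes envelope properties into Serilog's LogContext, where Enrich.FromLogContext picks them up (a sketch for MassTransit 5.x; the filter name is ours):

```csharp
using System.Threading.Tasks;
using GreenPipes;          // IFilter, IPipe, ProbeContext (MassTransit 5.x era)
using MassTransit;
using Serilog.Context;

// Enriches all log events written while a message is being consumed.
public class LogContextFilter<T> : IFilter<ConsumeContext<T>> where T : class
{
    public async Task Send(ConsumeContext<T> context, IPipe<ConsumeContext<T>> next)
    {
        using (LogContext.PushProperty("CorrelationId", context.CorrelationId))
        using (LogContext.PushProperty("MessageId", context.MessageId))
        {
            await next.Send(context);   // continue the consume pipeline
        }
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("logContext");
}

// Registration on a receive endpoint:
// endpointConfigurator.UseFilter(new LogContextFilter<SomethingDoneMessage>());
```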

Since the application that receives the message from the RabbitMQ queue has access to all the properties of the MassTransit envelope, we can use the obtained correlation identifier inside the consuming application and pass it further along the entire call chain.

As a result, our logs now contain a CorrelationId not only within a single service, but also across interactions with other applications.

So, the resulting logging system in .NET applications allows us to easily correlate logs from completely different microservices, even those that communicate through a message broker. And with the help of Elasticsearch we can quickly and conveniently analyze the logs by building the dashboards we need in Kibana (an example is shown in the picture to the post).

Of course, in this form logging will not cover every complex scenario of interaction between your services and various external systems, but putting it in order at the very beginning of a project is one of those things for which you will thank yourself more than once.