Server-Sent Events in .NET

Implementing The Most Elegant HTTP-Based Push Mechanism

Roko Kovač
Nov 18, 2023

Introduction

To achieve near-real-time communication over HTTP, several techniques have emerged, such as Polling, Long Polling, and Webhooks.

However, all of them come with limitations, and none of them maintains a persistent connection with instantaneous server push.

In this article, we will explore the most elegant HTTP-based server push mechanism, Server-Sent Events (SSE), and implement it in .NET 8.

Server-Sent Events vs Long Polling

SSE is often compared to Long Polling and can be seen as an alternative.

While both techniques are HTTP-based and provide seemingly the same end result, there are a few distinct differences.

Similarities

Both SSE and Long Polling:

  • enable near-real-time communication over HTTP/1.1,
  • are one-way (an HTTP/1.1 limitation),
  • require a specific implementation (they cannot be used with existing HTTP endpoints),
  • keep an HTTP connection open.

Differences

Just like Long Polling, SSE maintains an open HTTP/1.1 connection. The main difference is in what happens when new information arrives.

A Long Polling request finishes as soon as new information arrives, acting like a regular, long-lasting request. An SSE request, by contrast, stays open until the client or the server decides to close it, and new information is simply written to the response stream.

This makes SSE suitable for high-frequency updates, an area where Long Polling struggles.

Put simply, SSE is a technique that allows real-time server push over HTTP.

Server-Sent Events Caveats

There are a few caveats to watch out for when using SSE.

Request Timeouts

Most web servers have a request timeout configured. For example, Nginx defaults proxy_read_timeout to 60 seconds. Depending on your needs and restrictions, you might need to increase this or implement client-side retry mechanisms.
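It's worth noting that the browser's EventSource API already reconnects automatically after a dropped connection, and the SSE spec lets the server tune the reconnection delay by sending a retry field (in milliseconds) in the stream. A message that sets a 10-second retry delay would look like this:

```
retry: 10000
data: {"name": "bucket", "Price": "114.50"}
```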

Additional Server Configuration

Depending on the server, you might need to adjust various caching and buffering mechanisms. For example, an app behind an Nginx reverse proxy will not stream events correctly while chunked_transfer_encoding, proxy_buffering, or proxy_cache are enabled.
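As a rough sketch, an Nginx location block for an SSE endpoint might look like this (the upstream address and path are placeholders for your own setup):

```nginx
location /sse {
    proxy_pass http://localhost:5006;   # placeholder upstream
    proxy_http_version 1.1;
    proxy_set_header Connection "";
    proxy_buffering off;                # deliver events immediately
    proxy_cache off;                    # never cache the stream
    proxy_read_timeout 1h;              # allow long-lived connections
}
```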

Browser connection limitations

When connecting over HTTP/1.1, most modern browsers allow at most 6 active connections per domain, and any request beyond that limit stalls. Keep this in mind when opening SSE connections, since stalled requests can slow down your website significantly. (Over HTTP/2, the limit is negotiated with the server and is typically much higher.)

SSE Message Format

A full SSE response format, supported by all modern browsers, will look something like this.

Headers

The only required header is Content-Type: text/event-stream, which tells the client that this is an SSE response.

HTTP/1.1 200 OK
Content-Type: text/event-stream

Body

The body consists of multiple messages, with data lines prefixed by "data: " and event names prefixed by "event: ". You can send named or unnamed messages with one or more lines, but each line must be prefixed by "data: ", and consecutive messages are separated by a blank line.

event: itemCreated
data: {"name": "bucket", "Price": "114.50"}

data: Here's an example of a multi-line message
data: returned piece by piece by a
data: chatbot application.
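To make the framing above concrete, here is a minimal sketch of how a client could parse a single message block. It is not a full implementation of the spec (it ignores id:, retry:, and comment lines starting with ":"), just an illustration of the event/data fields shown above:

```javascript
// Parse one SSE message block (the text between two blank lines).
function parseSseMessage(block) {
  let event = "message"; // default event name per the SSE spec
  const data = [];
  for (const line of block.split("\n")) {
    if (line.startsWith("event: ")) event = line.slice(7);
    else if (line.startsWith("data: ")) data.push(line.slice(6));
  }
  // Multiple data lines are joined with a newline.
  return { event, data: data.join("\n") };
}

const named = parseSseMessage(
  'event: itemCreated\ndata: {"name": "bucket", "Price": "114.50"}'
);
console.log(named.event); // "itemCreated"

const multiLine = parseSseMessage("data: line one\ndata: line two");
console.log(multiLine.data); // "line one\nline two"
```

This is essentially what the browser's EventSource does for you under the hood, which is why you rarely need to write it yourself.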

Implementing Server-Sent Events in .NET

Let’s implement a simple SSE endpoint in .NET 8.

If you just want the code, you can find it on my GitHub.

app.MapGet("/sse", async (HttpContext ctx, ItemService service, CancellationToken ct) =>
{
    ctx.Response.Headers.ContentType = "text/event-stream";

    while (!ct.IsCancellationRequested)
    {
        var item = await service.WaitForNewItem();

        await ctx.Response.WriteAsync("data: ", ct);
        await JsonSerializer.SerializeAsync(ctx.Response.Body, item, cancellationToken: ct);
        await ctx.Response.WriteAsync("\n\n", ct);
        await ctx.Response.Body.FlushAsync(ct);

        service.Reset();
    }
});

The code is pretty straightforward. While the request is not canceled, we wait for a new item, serialize it, and write it directly into the response body in the standard SSE format.

We also make sure to include the Content-Type: text/event-stream header.

The ItemService class is a singleton service that exposes a TaskCompletionSource which can be triggered from a separate process, e.g. an event handler or a call from another service.

After writing a new item message to the buffer, we make sure to reset the TaskCompletionSource so that we can wait for the next one.

Here is a simple ItemService to demonstrate the implementation.

public record Item(string Name, double Price);

public class ItemService
{
    private TaskCompletionSource<Item?> _tcs = new();
    private long _id = 0;

    public void Reset()
    {
        _tcs = new TaskCompletionSource<Item?>();
    }

    public void NotifyNewItemAvailable()
    {
        var id = Interlocked.Increment(ref _id);
        _tcs.TrySetResult(new Item($"New Item {id}", Random.Shared.Next(0, 500)));
    }

    public Task<Item?> WaitForNewItem()
    {
        // Simulate some delay in Item arrival
        Task.Run(async () =>
        {
            await Task.Delay(TimeSpan.FromSeconds(Random.Shared.Next(0, 29)));
            NotifyNewItemAvailable();
        });

        return _tcs.Task;
    }
}

If you want to see it in action alongside other transports, check out this Chat app.

Implementing an SSE Client

All browsers support the standard EventSource API, which makes consuming an SSE endpoint a breeze. Note that onmessage only fires for unnamed messages; named events such as itemCreated are handled with addEventListener instead.

const eventSource = new EventSource('http://localhost:5006/sse');

eventSource.onmessage = function (event) {
    const item = JSON.parse(event.data);
    console.log("New item received:", item);
};

Conclusion

I have shown you how to implement the most elegant server push technique over HTTP: Server-Sent Events.

Although less common than techniques like WebSockets, SSE still has strong, current use cases: ChatGPT uses it to stream chat responses, and SignalR uses it as a fallback when WebSockets aren't available.

In the following articles, we will move beyond HTTP and explore the WebSocket and gRPC protocols to implement two-way real-time communication.

If you’re interested in reading about those, consider subscribing.

Any thoughts? Comments are welcome!
