SignalR is a nice API in the .NET ecosystem for handling realtime scenarios. With the emergence of .NET Core, it becomes even easier and faster thanks to the removal of the jQuery dependency and the simple API definitions on both the client and the server.
If you already wrote ASP.NET applications with SignalR, you generally followed the same pattern: create JavaScript event handlers on the frontend and .NET Tasks on the backend. This is a simple, working model, but it forces you into 1–1 API calls.
Since ASP.NET Core 2.1, we can use SignalR streaming, a feature that allows us to push data to, or receive data from, the client/server in a much more efficient way. The streaming scenario is well suited when you have a long-lived Hub connection on the client.
For those who want to get ahead, here is the official documentation about streaming in .NET Core: https://docs.microsoft.com/en-us/aspnet/core/signalr/streaming
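To give an idea of what consuming a server-to-client stream looks like on the frontend, here is a minimal JavaScript sketch of the subscriber contract used by the SignalR JavaScript client. The `mockCounterStream` helper is a stand-in for the `IStreamResult` you would get from `connection.stream(...)` on a real `HubConnection`, so the snippet runs without a server; the `Counter` method name is purely illustrative.

```javascript
// Stand-in for the IStreamResult returned by connection.stream(...) in
// @microsoft/signalr: it emits a few values synchronously so the snippet
// runs without a server. In a real app you would instead call
// connection.stream("Counter", 3) on a started HubConnection
// ("Counter" being whatever hub method you defined).
function mockCounterStream(count) {
  return {
    subscribe(subscriber) {
      for (let i = 1; i <= count; i++) subscriber.next(i);
      subscriber.complete();
      return { dispose() {} };
    },
  };
}

const received = [];
let completed = false;

// This subscriber shape (next/complete/error) is exactly what the real
// SignalR JavaScript client expects.
mockCounterStream(3).subscribe({
  next: (item) => received.push(item),
  complete: () => { completed = true; },
  error: (err) => console.error(err),
});

console.log(received); // [1, 2, 3]
```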
So, when it comes to asynchronicity, and even more importantly to streaming, I prefer to use Observables whenever I can because they fit naturally into the current world of frontend frameworks. The promise of SignalR streaming is to send and receive events handled as Observables.
So, listening to realtime events gives us the ability to filter them, group them, and apply many more operators depending on the scenario.
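To give an idea of what such operators buy you, here is a dependency-free sketch: a hand-rolled filter applied to a fake stream of stock ticks. In a real app you would push each stream value into an RxJS Subject and use its built-in operators; the tick shape below is invented for the example.

```javascript
// Dependency-free sketch of piping stream events through an operator.
// filterStream wraps a source stream and only forwards matching values,
// the same idea as RxJS's filter operator.
function filterStream(source, predicate) {
  return {
    subscribe(subscriber) {
      return source.subscribe({
        next: (v) => { if (predicate(v)) subscriber.next(v); },
        complete: () => subscriber.complete(),
        error: (e) => subscriber.error(e),
      });
    },
  };
}

// A fake stream of stock ticks (the event shape is hypothetical).
const ticks = {
  subscribe(subscriber) {
    [{ symbol: "A", price: 10 }, { symbol: "B", price: 3 }, { symbol: "A", price: 12 }]
      .forEach((t) => subscriber.next(t));
    subscriber.complete();
  },
};

const expensive = [];
filterStream(ticks, (t) => t.price > 5).subscribe({
  next: (t) => expensive.push(t.symbol),
  complete: () => {},
  error: () => {},
});

console.log(expensive); // ["A", "A"]
```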
And the other side of streaming is sending data to the server from a given Observable. Again, there are a vast number of scenarios, whether you are receiving or sending events.
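On the sending side, the SignalR JavaScript client lets you pass a Subject to the hub and feed it with `next()`/`complete()`. The `MiniSubject` below mimics that contract so the sketch is self-contained; `UploadStream` is a hypothetical hub method name.

```javascript
// Minimal Subject mimicking the one exposed by @microsoft/signalr for
// client-to-server streaming. In a real app you would pass it to
// connection.send("UploadStream", subject) and the hub would receive
// every value pushed into it.
class MiniSubject {
  constructor() { this.observers = []; }
  subscribe(observer) { this.observers.push(observer); }
  next(value) { this.observers.forEach((o) => o.next(value)); }
  complete() { this.observers.forEach((o) => o.complete && o.complete()); }
}

// Stand-in for the server side of the stream.
const uploaded = [];
const subject = new MiniSubject();
subject.subscribe({ next: (v) => uploaded.push(v), complete: () => uploaded.push("done") });

// The client pushes values whenever it wants, e.g. on user input events.
subject.next("a");
subject.next("b");
subject.complete();

console.log(uploaded); // ["a", "b", "done"]
```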
Now that we see the interest on the client side, how does it work on the server? You can handle streams of data on the server in almost the same way as on the client.
It does not look like the server exposes Observables out of the box, though. Well, in the case of a client-to-server stream, you can fill one or many Observables from the incoming IAsyncEnumerable. It seems like the smartest way to do it.
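As a rough JavaScript analogue of that pattern (the real code is C# and would use `for await` over the `IAsyncEnumerable<T>`), a plain iterable stands in for the incoming stream here, and a tiny hand-rolled Subject fans the values out to subscribers:

```javascript
// Pump an incoming stream into a Subject so any number of Observables
// can be derived from it. A plain iterable stands in for the async
// stream to keep the sketch synchronous and self-contained; the C#
// equivalent would be `await foreach` / `for await`.
function pumpIntoSubject(iterable, subject) {
  for (const item of iterable) subject.next(item);
  subject.complete();
}

const log = [];
const subject = {
  listeners: [],
  subscribe(l) { this.listeners.push(l); },
  next(v) { this.listeners.forEach((l) => l.next(v)); },
  complete() { this.listeners.forEach((l) => l.complete()); },
};

// A derived consumer: doubles each value, then marks the end.
subject.subscribe({ next: (v) => log.push(v * 2), complete: () => log.push("end") });
pumpIntoSubject([1, 2, 3], subject);

console.log(log); // [2, 4, 6, "end"]
```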
But for the server-to-client stream, there is a major thing we should discuss first: backpressure.
Backpressure, what’s that?
If you want to consume an
IObservable<T> instead of using a
ChannelReader<T>, you must know that you can be exposed to errors due to backpressure: the consumer can be slower than the producer, which can lead to an inconsistent system.
To prevent this error, you can change the streaming behavior by building a custom
ChannelReader<T> from an
IObservable<T>. There are currently two main scenarios you can encounter:
The ToNewestValueStream method is made for the
ignore all previous values scenario: you will always send/receive the latest value as soon as possible, even when the consumer is slower than the producer. Previous data can be skipped in favor of the latest one.
Use it when you are in an "I don’t care about not-so-fresh data" scenario.
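The behavior can be sketched as a one-slot buffer. This is a simplified model of the strategy, not the library's actual implementation:

```javascript
// Latest-value strategy in miniature: a one-slot buffer where each new
// value overwrites the previous undelivered one, so a slow consumer
// (draining every `consumeEvery` values) only ever sees the freshest data.
function newestValueDelivery(values, consumeEvery) {
  let slot = null;                // single-item buffer
  const delivered = [];
  values.forEach((v, i) => {
    slot = v;                     // producer overwrites stale data
    if ((i + 1) % consumeEvery === 0) { delivered.push(slot); slot = null; }
  });
  return delivered;
}

console.log(newestValueDelivery([1, 2, 3, 4, 5, 6], 3)); // [3, 6]
```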
The ToBufferedStream method is made for the
buffer all previous values scenario: you will always send/receive every value from the producer, even when the consumer is slower. No data will be skipped, but it can take a while before the latest data arrives.
Be aware that it buffers all undelivered data, which can drastically increase the memory footprint.
Use it when you are in an "I care about every single value that was sent" scenario.
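The buffered behavior, again as a simplified model rather than the library's actual code, is an unbounded queue:

```javascript
// Buffered strategy in miniature: every value is queued until the consumer
// gets to it (one delivery every `consumeEvery` values), so nothing is
// lost, but undelivered data accumulates in memory.
function bufferedDelivery(values, consumeEvery) {
  const queue = [];
  const delivered = [];
  values.forEach((v, i) => {
    queue.push(v);                               // nothing is dropped
    if ((i + 1) % consumeEvery === 0) delivered.push(queue.shift());
  });
  return { delivered, stillBuffered: queue.length };
}

console.log(bufferedDelivery([1, 2, 3, 4, 5, 6], 3));
// { delivered: [1, 2], stillBuffered: 4 }
```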
So, if you want to simplify your architectural pattern of realtime streaming, you can use:
IAsyncEnumerable<T> for client-to-server streams
IObservable<T> for server-to-client streams
If you are ready to choose between the ToNewestValueStream and
ToBufferedStream methods, note that they are not offered by default by the SignalR package, but here is the link to the NuGet package:
Extensions to use Reactive observable in SignalR streams
Also, the source code is maintained here on GitHub:
And if you want to see the code for yourself, here is a sample app written in React + .NET Core: