Streaming http responses in AngularJS

Bartosz Polnik
4 min read · Mar 6, 2018


In my last article, I dug into streaming data from a server. By streaming, I mean returning data as soon as possible, without incurring any additional delay. I analysed and implemented three approaches:

  • returning data as a plain list, without streaming; this served as the baseline for performance comparison
  • returning data using Server-Sent Events with streaming
  • returning data by utilizing Jackson streaming

Returning a simple list and streaming using Jackson are equivalent from the consumer's point of view. In both methods, we request data using ajax and process responses similarly. The only difference is in the latency: the time at which we receive specific pieces of information. In the case of a list, we expect to get one huge payload at the end of backend processing. Streaming may deliver the first bytes of data faster due to its incremental nature.

In this article, I will show you how to implement incremental streaming to display the first results earlier and improve the experience for users on slow mobile networks (such as fast 3G).

To do this, I will implement two HTTP clients:

  • HttpClient, utilizing the AngularJS $http service to deliver data as soon as it's fully read and parsed by the browser
  • StreamingHttpClient, returning specific nodes of your JSON response using Oboe.js, without waiting for the final bytes of the response.

Using these clients, I will measure the performance and perceived user experience of fetching 5 000 rows of data with and without network throttling. Let's start with the first part: HttpClient!

HttpClient

AngularJS has a built-in HTTP client, but I'm not happy with it. It went on a journey from success and error callbacks to a Promise-based API. Pure promises don't provide any support for cancellation, so the $http maintainers resorted to passing an additional parameter in the configuration object. This parameter is called timeout, which also reveals its second purpose: limiting the duration of requests. This is not an elegant solution, to say the least.

I'm much more fond of the RxJS API because it's extremely powerful, and in many cases it's the only API you'll ever need. Using it for HTTP is not novel; Angular 2 does the same. So I finally decided to wrap the $http service in an Observable interface. This will be particularly useful later, as a promise-based API doesn't make sense for StreamingHttpClient, where we want to emit multiple values.

Here’s the source code of HttpClient:
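In rough outline, such a client might look like the sketch below. It assumes RxJS 4-style `Rx.Observable.create` and uses the $q-based `timeout` trick described above; the structure is illustrative, not the exact code from the repository.

```javascript
// Sketch of HttpClient: wraps AngularJS's $http in an RxJS 4 Observable,
// using a $q deferred as the `timeout` cancellation token.
function HttpClient($http, $q) {
  this.get = function (url) {
    return Rx.Observable.create(function (observer) {
      // Resolving this deferred's promise aborts the in-flight request.
      var canceler = $q.defer();
      $http.get(url, { timeout: canceler.promise }).then(function (response) {
        // The whole payload arrives at once, so we emit a single value.
        observer.onNext(response.data);
        observer.onCompleted();
      }, function (error) {
        observer.onError(error);
      });
      // Dispose callback: runs when the subscriber unsubscribes.
      return function () {
        canceler.resolve();
      };
    });
  };
}
```

Cancellation now falls out of the Observable contract: unsubscribing resolves the `timeout` promise, which makes $http abort the request.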

Let’s inject our client to a component which will be used for performance testing:
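In outline, that component's controller could look like the sketch below; the client name `httpClient`, the endpoint `/api/rows`, and the property names are assumptions for illustration.

```javascript
// Hypothetical controller sketch: subscribes to the client, accumulates
// results, and records the elapsed time up to the last processed item.
function BenchmarkController(httpClient) {
  var self = this;
  self.items = [];
  self.elapsedMs = null;
  var start = Date.now();
  httpClient.get('/api/rows').subscribe(function (data) {
    // concat handles both cases: a full list (HttpClient) and
    // a single row at a time (StreamingHttpClient).
    self.items = self.items.concat(data);
    self.elapsedMs = Date.now() - start;
  });
}
```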

The common template for all tests shows the time elapsed from invoking the HTTP client to the processing of the last item. This is not the same as the time required to render all results, but it can serve as a lower bound on it.

Apart from displaying the processing time, I'm also presenting the fetched items to show the moment when the user begins to see the first results on the screen.
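For illustration, a minimal version of such a template could look like this (the controller property names are assumptions):

```html
<!-- Illustrative template: elapsedMs and items are assumed property names -->
<div>Processed in {{$ctrl.elapsedMs}} ms</div>
<div ng-repeat="item in $ctrl.items track by $index">
  {{item.name}}
</div>
```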

StreamingHttpClient

This client, similarly to HttpClient, will use the Observable API. However, instead of emitting a list of all rows, it will emit only one row of data at a time, and do it earlier: as soon as the data reaches the browser.

To use StreamingHttpClient effectively, we need to craft a pattern: a simple description of the part of the JSON response that contains the nodes important to us. These nodes will be delivered to us one by one, and our application logic must be able to handle them in that form.
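As a sketch of how this could be wired together with Oboe.js (again assuming RxJS 4; the pattern argument uses Oboe's JSONPath-like syntax, and the names are illustrative):

```javascript
// Sketch of StreamingHttpClient: Oboe.js parses the response incrementally
// and fires a callback for every node matching the pattern.
function StreamingHttpClient() {
  this.get = function (url, pattern) {
    return Rx.Observable.create(function (observer) {
      var request = oboe(url)
        .node(pattern, function (node) {
          // Emit each matched node as soon as its bytes have arrived.
          observer.onNext(node);
          // Returning oboe.drop frees the parsed node, keeping memory flat.
          return oboe.drop;
        })
        .done(function () {
          observer.onCompleted();
        })
        .fail(function (error) {
          observer.onError(error);
        });
      // Dispose callback: abort the underlying request on unsubscribe.
      return function () {
        request.abort();
      };
    });
  };
}
```

For a response shaped like `{"rows": [ ... ]}`, a pattern such as `'rows.*'` would match each element of the array, so subscribers receive one row at a time.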

That's all for the implementation. Let's see its performance in action!

Streaming 5 000 rows with fast 3G

HttpClient

The HTTP client blocked processing of the response until all of it was available. We were able to see results on the screen after about 7.5[s]. Most of the time before that was spent idling, because we had no data to operate on and thus nothing could be rendered.

StreamingHttpClient

Streaming was able to deliver the first rows earlier: after about 1.7[s] we could already analyze the first rows of data on the screen. Total resource usage was higher than with HttpClient, but this allowed for faster rendering and earlier completion of processing, at 6.4[s].

Streaming 5 000 rows without network throttling

HttpClient

Now, without network throttling, the situation is much different. There is no more idling, just pure rendering and scripting. All done in 1.4[s], and that's the time required to see both the earliest and all results on the screen!

StreamingHttpClient

On the throttled network, streaming had an easier time being faster because of the network latency. What if we remove almost all of it?

I would say that streaming is still much faster. The first rows were shown after 0.6[s], and although the last ones were presented at 1.5[s], which is 100[ms] worse than HttpClient, I find it hard to believe that a real human being would wait for the last rows to appear before analyzing the first 5 000 of them.

To conclude, it really pays off to stream!

That's all for today. Thanks for reading, and look out for the next posts!

All resources used in this post are available in my GitHub repository: https://github.com/bartekbp/blog/tree/master/data-streaming/frontend.
