Observations about Web Client Technologies

Thundering Web Requests: Part 6

This is the sixth and final post in a series exploring web service technologies. It documents observations about the custom web clients, implemented in Elixir, Go, and Kotlin, that were used to evaluate the web service implementations.

Observations based on Minimum Time per Request

While analyzing the behavior of the various web service implementations, I observed that the minimum time per request differed starkly across the web clients. So, I plotted each client's minimum time per request against the Actix-Rust service implementation, as it was one of the most performant and reliable service implementations in the previous experiment.

Minimum time per request (ms) against the Actix-Rust service implementation.

In the graph, across all network traffic and concurrent requests configurations, the minimum time per request for the HTTPoison-Elixir client increases from ~200ms to ~650ms as the number of concurrent requests increases, a change of ~450ms from the lowest to the highest number of concurrent requests. In comparison, the minimum time per request for the Vertx-Kotlin client ranges from ~900ms to ~1050ms, and that for the Go client ranges from ~0ms to ~100ms.

With the HTTPoison-Elixir client, the minimum time per request increased steeply with the number of concurrent requests. No similar increase was evident for the Vertx-Kotlin and Go clients. This suggests both Vertx and Go do a better job of juggling concurrent requests than HTTPoison. A likely reason for this behaviour of HTTPoison is that Hackney, the library underlying HTTPoison, uses a pool of sockets to service requests. The default size of this pool is 50, and the evaluated web client implementation used this default. Increasing the size of this socket pool could help the HTTPoison-Elixir client manage concurrent requests better.

Even in the configuration with the fewest concurrent requests and the lowest network traffic, the minimum time per request for the Vertx-Kotlin client was ~900ms. This suggests the web client support in Vertx is significantly slower than that in Go and HTTPoison.
However, the minimum time per request for Vertx-Kotlin did not increase steeply with the number of concurrent requests. This suggests the slowdown likely stems from activity in the Vertx stack both before each request is dispatched and after the corresponding response is received.

Observations based on the Distribution of Time per Request

Out of curiosity, I also plotted histograms of random samples of the time per request for each client against the Actix-Rust service implementation, at the 100 and 2500 concurrent requests and various network traffic configurations. The size of each sample was the same as the number of concurrent requests.

Histogram of time per requests of the Go, HTTPoison-Elixir, and Vertx-Kotlin clients at 100 concurrent requests and the 2 (blue), 6 (orange), and 10 (green) network traffic configurations.
Histogram of time per requests of the Go, HTTPoison-Elixir, and Vertx-Kotlin clients at 2500 concurrent requests and the 2 (blue), 6 (orange), and 10 (green) network traffic configurations.

In the above graphs,

  1. For the Go client, all histograms are right/positively skewed. This suggests the time per request was lower for most requests (unless the server was less busy when serving requests from the Go client). Hence, the Go web client is preferable if lower time per request is desired.
  2. For the HTTPoison-Elixir client, at the lower concurrent requests configuration, the histograms skew from right to left as the network traffic (response payload) increases. This suggests the time per request increases as the network traffic increases. Since network traffic is independent of the extent of concurrency, this suggests HTTPoison is slower at handling large response payloads (unless the server was busy when serving requests from the HTTPoison-Elixir client).
  3. Compared to the Go client, the HTTPoison-Elixir client exhibits lower variation in time per request at the higher concurrent requests configuration: ~700ms to ~2200ms vs ~0ms to ~3500ms for Go. Hence, HTTPoison is preferable if low variation in time per request is desired.
  4. For the Vertx-Kotlin client, at both concurrent requests configurations, the shape of the histograms is unaffected by changes in network traffic. While this is good, it is insufficient reason to prefer Vertx over HTTPoison and Go, as both provide better time per request for most requests.
  5. For the Vertx-Kotlin client, the histograms are left skewed at the higher concurrent requests configuration. This suggests Vertx is not preferable at higher numbers of concurrent requests if lower time per request is desired.
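The skew judgments above can be made quantitative: a sample's skewness statistic is positive when its histogram is right skewed and negative when left skewed. A minimal sketch, using hypothetical latency samples rather than the measured data:

```go
package main

import (
	"fmt"
	"math"
)

// skewness returns the sample skewness of xs (third standardized moment):
// positive for a right-skewed sample (most values low, with a long tail of
// slow requests), negative for a left-skewed one.
func skewness(xs []float64) float64 {
	n := float64(len(xs))
	var mean float64
	for _, x := range xs {
		mean += x
	}
	mean /= n

	var m2, m3 float64
	for _, x := range xs {
		d := x - mean
		m2 += d * d
		m3 += d * d * d
	}
	m2 /= n
	m3 /= n
	return m3 / math.Pow(m2, 1.5)
}

func main() {
	// Hypothetical latencies (ms): most requests fast with a few slow
	// stragglers, i.e., the right-skewed shape seen for the Go client.
	rightSkewed := []float64{10, 12, 11, 13, 12, 11, 14, 90, 120, 15}
	fmt.Printf("right-skewed sample: %+.2f\n", skewness(rightSkewed))

	// Most requests slow with a few fast ones: the left-skewed shape seen
	// for the Vertx-Kotlin client at high concurrency.
	leftSkewed := []float64{900, 910, 905, 915, 920, 908, 912, 100, 80, 895}
	fmt.Printf("left-skewed sample: %+.2f\n", skewness(leftSkewed))
}
```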

Note about Failures: No failures occurred in the executions that led to the above observations. Even so, be sure to factor in the effect of failures when considering these observations.

Summary

Go and HTTPoison with Elixir are good choices to implement performant web clients.

Written by Venkatesh-Prasad Ranganath

Software Craftsman / Researcher. Posts mostly about Software Engineering, Systems, Performance, Scale, and Security.
