Load Testing socket.io Web Applications and Infrastructure
Tutorial for load testing your socket.io WebSockets server
Are you shipping a scalable real-time back end? This article provides a tutorial for load testing your socket.io based WebSocket server, to simulate production load and identify performance bottlenecks or scalability issues.
socket.io is a framework built on top of the WebSocket protocol. A WebSocket connection starts life as an HTTP request carrying an Upgrade: websocket header, which initiates the WebSocket handshake.
This provides a persistent connection between a client and server where both parties can emit messages at any time, without the need for the client to poll the server.
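For reference, the upgrade exchange looks roughly like this. The key/accept values are the sample pair from RFC 6455, and the EIO=4 query parameter assumes a recent socket.io release (Engine.IO protocol v4); older versions use EIO=3.

```http
GET /socket.io/?EIO=4&transport=websocket HTTP/1.1
Host: example.com
Connection: Upgrade
Upgrade: websocket
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13

HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```

After the server responds with 101 Switching Protocols, the TCP connection stays open and both sides exchange WebSocket frames instead of HTTP requests.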
socket.io adds additional functionality on top of this, such as increasing browser and device support of your application by providing an option to fall back to Ajax requests when using a client which does not support WebSockets, or a client on a network which restricts the upgrade HTTP header or protocol changes.
Setup
To follow along with this tutorial, you can either clone and run the demo repository locally (suggested), or you can use the version hosted by me on Heroku.
If you choose to run the app locally, you will need to clone the repository and install the required dependencies using npm. The npm start script will start the app on your configured PORT, or 8080 by default.
First, let’s get familiar with the application. Once running, the application should work as below.
The main features to note are the ability to send realtime messages between users and to see a count of the current participants in the chat room.
Below is an example of two different users sending and receiving messages in realtime — great!
Load Testing
Now that we’ve set up the application, we can start our load testing. For this, we’ll use the Artillery Community Edition tool, which is a free and open-source load testing and functional testing toolkit.
To install the Artillery.io toolkit globally:
$ npm install artillery -g
Next, we need to create a YAML configuration file, which will specify the test configuration and scenarios.
config:
  target: "wss://socketio-loadtest.herokuapp.com"
  socketio:
    transports: ["websocket"]
  phases:
    - duration: 10 # Run the scenario for 10 seconds
      arrivalCount: 20 # Create 20 virtual users, spread over the 10 seconds
scenarios:
  - engine: "socketio"
    flow:
      - emit:
          channel: "add user"
          data: "John Doe"
      - emit:
          channel: "new message"
          data: "Hello! {{ $randomString() }}"
      - think: 5 # do nothing for 5 seconds, then disconnect
In the above configuration file, we specify our target URL and the Socket.IO engine. The phases section specifies the number of virtual users we will emulate and the duration of the test. Initially, these are low numbers.
In the scenarios section, we specify which WebSocket events we would like the Artillery client to emit to the server.
For this simple example, we will add a new user to the chat, using the add user event in the demo application, then post a new message. After five seconds, the user leaves, which disconnects the WebSocket.
To run the above script, we will use the artillery run command.
$ artillery run load-test/simple-test.yaml
If we view our application in a browser (either locally, or at https://socketio-loadtest.herokuapp.com), virtual users can be seen joining the chatroom and leaving messages.
The above simple script can be modified as you wish to increase the number of users in a scenario by tweaking the configuration values. For a more advanced test, you can find more verbose scenario examples below.
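One such variation is repeating steps within a flow. As a sketch (the loop/count syntax and the $loopCount variable are taken from the Artillery documentation; the event names match the demo application), a virtual user could post ten messages one second apart:

```yaml
scenarios:
  - engine: "socketio"
    flow:
      - emit:
          channel: "add user"
          data: "John Doe"
      - loop: # repeat the nested steps "count" times
          - emit:
              channel: "new message"
              data: "Message number {{ $loopCount }}"
          - think: 1
        count: 10
```

This keeps each virtual user connected longer and generates sustained message traffic, rather than a single burst per connection.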
Assertions
Now that we have the tool set up, we can make the task fail based on certain conditions, which is great for a CI environment.
We can choose to set a maximum allowed latency in milliseconds for min, max, median, p95, and p99, or a maximum error rate percentage. These can be configured to test against certain business rules or SLOs.
To configure the above, we add an ensure entry to our configuration. Further examples can be found in the Artillery documentation.
config:
  target: "ws://localhost:8080"
  ensure:
    maxErrorRate: 1 # fail if error rate exceeds 1%
    max: 500 # fail if max response time exceeds 500ms
Advanced Configuration
In the advanced example, dynamic variables are used to simulate a scenario closer to how the application will be used in production (real data). This is done using the faker npm module inside a custom processor, which lets us run our own JavaScript code as part of the test.
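To illustrate the shape of such a processor, here is a dependency-free sketch of custom.js. The demo repository uses faker for realistic data; this version picks from fixed lists instead so it runs without extra packages, and the variable names (name, greeting, goodbye) match those referenced in the scenario below.

```javascript
// custom.js — a sketch of an Artillery processor function.
// Artillery calls it with (context, events, done); anything written to
// context.vars becomes available as {{ name }}, {{ greeting }}, etc.
// in the YAML flow. The demo repo generates these values with faker;
// here we use fixed lists to keep the example self-contained.
"use strict";

const names = ["Ada Lovelace", "Grace Hopper", "Alan Turing"];

function getChatData(context, events, done) {
  const name = names[Math.floor(Math.random() * names.length)];
  context.vars.name = name;
  context.vars.greeting = `Hello, I'm ${name}!`;
  context.vars.goodbye = `Goodbye from ${name}.`;
  return done(); // signal Artillery that the function has finished
}

module.exports = { getChatData };
```

Each virtual user gets its own context, so every simulated participant joins the chat under its own name.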
The load phases are configured to ramp the arrival rate from 10 to 50 new users per second over two minutes, followed by 10 minutes at a sustained 50 new users per second.
config:
  target: "ws://localhost:8080"
  ensure:
    max: 500 # fail if max response time exceeds 500ms
    maxErrorRate: 1 # fail if error rate exceeds 1%
  socketio:
    transports: ["websocket"]
  processor: "./custom.js" # set a processor for dynamic variables
  phases:
    - duration: 120
      arrivalRate: 10
      rampTo: 50
      name: "Warm up phase"
    - duration: 600
      arrivalRate: 50
      name: "Sustained max load"
scenarios:
  - engine: "socketio"
    flow:
      - function: "getChatData" # load variables
      - emit:
          channel: "add user"
          data: "{{ name }}"
      - emit:
          channel: "new message"
          data: "{{ greeting }}"
      - think: 10 # stay connected for 10 seconds
      - emit:
          channel: "new message"
          data: "{{ goodbye }}"
Run Artillery using the advanced-test.yaml file:
$ artillery run load-test/advanced-test.yaml
In the configuration file above, the "Warm up phase" ramps the arrival rate of virtual users from 10 to 50 new users per second over two minutes.
The advanced scenario example can be tweaked in many ways depending on what type of test you want to run, such as how many concurrent connections can be handled, or maximum number of users before performance begins to degrade.
Recap
Load testing our Socket.IO applications is an important step in the application lifecycle, helping us to:
- Learn the number of concurrent WebSocket connections the application can support.
- Understand how the application handles failure.
- Determine whether the current infrastructure is sufficient for the expected load.
- Gain confidence in the system's reliability and performance.
- Ensure that degradation of any component in the stack doesn't lower the state of security.
Next in the Socket.IO blog series will be a tutorial on how to debug memory leaks in Socket.IO applications. If there are similar topics you'd be interested in, feel free to get in touch.