Case Story: Behind the IAM Weekend 18 website by DVTK & Jack Wild (Part 2/2)

In this two-part story we introduce the ideas, process and talented creatives behind the new website and visual identity of IAM Weekend 18, commissioned from London-based studio DVTK and developed in collaboration with web developer Jack Wild and the new IAM designer, Conor Rigby, who visually translated our 2018 theme: The Subversion of Paradoxes.

[Part 1 / Part 2]

DVTK: Kim Boutin & David Broner

Since launching our studio in 2015, we have crossed paths with the Internet Age Media team a few times: first through their close collaborators at UAL Futures, then at a workshop they ran at Tate Britain.

When Andres and Lucy (IAM co-founders) contacted us last summer to help them build a bespoke interface for IAM Weekend 18, we did not hesitate for a moment to accept!

The first stage was for us to take ownership of the annual topic, ‘The Subversion of Paradoxes’, as Andres and Lucy were expecting us to come up with a meaningful digital translation of it. Not a piece of cake, if you ask me!


After a few online discussions and some research, we came upon Sherry Turkle’s concept called ‘Alone Together’. In a nutshell, our understanding of this paradoxical headline is that we, Millennials, are all connected on social media but alone IRL, feeling more comfortable with online interactions than real relationships.

This realisation made us feel like a doomed generation. Indeed, throughout the history of our civilisation, each generation has tended to consider the following one less smart and to feel that, you know, things ain't what they used to be. From our perspective, though, the Internet can also be a rich and unique place where, for instance, we have built relationships with people who live in different countries yet share the same mindset as us. This actually makes us feel together.

Thus, the paradoxical concept ‘Alone Together’ could be subverted if taken positively, when focusing on ‘Together’ instead of ‘Alone’. Somehow, when connected, the Internet brings us together into a virtual space where we can all be at the same time, whatever our location might be.


Moreover, these virtual places sometimes let us collaborate. Maybe scenius could even happen online, buoyed by the Internet? Collectively, yet separately in time and space, people can gather together on the Internet and operate.

We then wanted to represent this participative space and make people feel that they are never alone online. This is how we came up with the idea of showing the cursor position of everyone browsing the page concurrently, with the cursors reacting and interacting in real time.

Jack Wild: developer

Technically, this posed quite a challenge: we were aiming to comfortably accommodate up to 150 simultaneous users, each of whom could potentially move their mouse at 60+ frames per second. It was also important that the experience wasn't disappointing for users on a slow connection, and we had to make sure it would run on our modest server resources!

Having worked with WebSockets in the past, I knew they were the first port of call for such a brief, as they offer bi-directional, low-latency communication and are supported by all modern browsers.

The first attempt was to mirror the mouse movements of each user exactly, so I threw together a quick prototype to check the feasibility of the idea.

I wrote a simple client-side app which emits the user's mouse position to the server over WebSockets. The server was a simple Node.js app running on a $5/month DigitalOcean instance: it receives the mouse position from every connected client, then broadcasts it back out to all the other connected clients, which display those positions on screen.
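The relay logic can be sketched like this (a minimal illustration, not the actual IAM code; the socket objects only need a `send` method, so the same function works with any WebSocket library):

```javascript
// Fan each cursor update out to every connected client except the sender.
// `clients` maps a client id to its socket-like object.
function broadcastCursor(clients, senderId, x, y) {
  const message = JSON.stringify({ id: senderId, x, y });
  for (const [id, socket] of clients) {
    if (id !== senderId) socket.send(message); // skip the sender itself
  }
}
```

On the client side, a `mousemove` handler would send `{x, y}` to the server, and a message handler would update the on-screen avatars of the other visitors.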

Success! It worked. But, being ambitious, we thought we should really aim for a higher concurrent user count than just the three of us (just in case!), and good luck sustaining far into the double digits of users when each one emits to the server 60+ times per second and every event is broadcast back to every connected user (that's ~720 messages every second with just three users, and the number grows quadratically). Some serious throttling and grouping was needed.
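A back-of-the-envelope count makes the problem obvious. One way to arrive at the ~720 figure above is to count inbound and outbound traffic together, assuming the server relays each event to all n connected clients (an assumption about the accounting, not a statement of how the IAM server was wired):

```javascript
// Messages the server handles per second with n clients emitting `rate`
// events/sec each, assuming every event is relayed to all n clients.
function messagesPerSecond(n, rate) {
  const inbound = n * rate;     // each client emits `rate` events per second
  const outbound = inbound * n; // each event is relayed to all n clients
  return inbound + outbound;    // total grows with n², not exponentially
}

// messagesPerSecond(3, 60)  → 720
// messagesPerSecond(50, 60) → 153000
```

Three users are harmless; fifty at the same rate already mean over 150,000 messages a second, which is why throttling had to come first.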

The first easy step was to throttle the mousemove event on the client to 30 times per second, and to batch the broadcast events on the server and send them out 30 times per second to everyone. This didn't really have that much effect on the outcome (to be honest, even at 60FPS it wasn't that smooth in the first place), and it was a slightly pointless exercise anyway, as at 30FPS we still wouldn't be able to accommodate many users.
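A client-side throttle along these lines caps how often mousemove events reach the server (a hand-rolled sketch, not the production code; the injectable clock is only there to make it easy to test):

```javascript
// Time-based throttle: call `fn` at most `hz` times per second, dropping
// the calls that arrive in between. The clock is injectable for testing.
function throttle(fn, hz, now = () => Date.now()) {
  const interval = 1000 / hz;
  let last = -Infinity;
  return (...args) => {
    const t = now();
    if (t - last >= interval) {
      last = t;
      fn(...args);
    }
  };
}
```

Wiring it up as `window.addEventListener('mousemove', throttle(sendPosition, 30))` then emits at most 30 positions per second, however fast the mouse moves; the server-side batching works on the same principle, accumulating positions and flushing them on a 30Hz timer.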

I decided a better approach was to set the throttle to the minimum we could get away with (both client- and server-side), and then see what we could do with that. I semi-arbitrarily settled on 16 times per second (based on the fact that humans can generally perceive film at as low as 16FPS without it appearing to be a slideshow, and on some light research into WebSocket stress-testing).

Oh dear. The results were (expectedly) disappointing; surely this wouldn't stand up to the scrutiny of the tech-savvy visitors to IAM Weekend 18. Thank god for easing 101 (current = current + (target - current) * easing): you can shove that on pretty much anything and it looks good. This produced an acceptable result; it was nice and smooth and didn't hammer the server too much.

However, nothing is perfect, not even easing 101. I conducted a very 'thorough' study of the real mouse movements of real people browsing websites in my studio, and the movements replicated in the prototype sometimes looked slightly unnatural: often the speed wouldn't remain constant, or would flip between very fast and very slow, and the avatar moved in unusually straight lines… if a user moved the mouse in a circle or an arc, the other users would see more of a hexagon pattern! The movements also felt too controlled when coming to a stop.

At this point I decided to stop guessing and do some proper load testing, using an excellent tool with WebSocket support baked in. The test was simple: ramp up from 0 new users per second to 10, then sustain 10 new users per second for as long as possible. Each user would join, roam around the screen for ten seconds, then stop moving and watch. I found that everything remained pretty smooth until about 50 concurrent users were moving around, at which point it got a bit laggy (not too shabby). At around 150 concurrent users, the Node app on our poor little cheapo dev server finally crashed. At least it tried.

We thought we could do better. I decided to use the position of the eased avatar as a target rather than as the final product, and added a new object which uses a physically realistic simulation to seek the eased target.

Now we’re getting somewhere! The target is constantly moving, easing between the actual position of the mouse as values get updated, and the avatar moves naturally towards the target, with realistic arcs and acceleration and deceleration. I played around with the max speed, max steer and ‘acceleration/deceleration zone’ until I got to a result which seemed consistent with the real mouse movements in my ‘study’.
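A seek step along these lines is what produces the arcs and the speed ramps (a sketch in the spirit of Craig Reynolds' classic steering behaviours; the parameter names and the arrival radius are my own illustration, not DVTK's actual values):

```javascript
// Seek steering: accelerate toward the target, with the steering force capped
// at maxSteer and speed clamped to maxSpeed. Inside slowRadius the desired
// speed scales down with distance, so the avatar decelerates into the target.
function seekStep(agent, target, { maxSpeed, maxSteer, slowRadius }) {
  const dx = target.x - agent.x;
  const dy = target.y - agent.y;
  const dist = Math.hypot(dx, dy) || 1e-9;

  // Desired velocity points at the target; slow down when close ("arrival").
  const speed = dist < slowRadius ? maxSpeed * (dist / slowRadius) : maxSpeed;
  const desiredX = (dx / dist) * speed;
  const desiredY = (dy / dist) * speed;

  // Steering force = desired velocity - current velocity, capped at maxSteer.
  let steerX = desiredX - agent.vx;
  let steerY = desiredY - agent.vy;
  const steerMag = Math.hypot(steerX, steerY);
  if (steerMag > maxSteer) {
    steerX = (steerX / steerMag) * maxSteer;
    steerY = (steerY / steerMag) * maxSteer;
  }

  // Integrate: apply the force, clamp the speed, move the avatar.
  agent.vx += steerX;
  agent.vy += steerY;
  const v = Math.hypot(agent.vx, agent.vy);
  if (v > maxSpeed) {
    agent.vx = (agent.vx / v) * maxSpeed;
    agent.vy = (agent.vy / v) * maxSpeed;
  }
  agent.x += agent.vx;
  agent.y += agent.vy;
}
```

Running this every frame against the constantly moving eased target gives curved paths with gradual acceleration and deceleration, instead of the straight hexagon-like segments of pure easing.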

Now I'd done this, I decided to try throttling the events even more; after all, less is more. I halved the rate of the events, down to 8 events per second, without any real noticeable difference in the motion of the avatar. We ran the tests again and (not surprisingly) doubled the number of users we could accommodate to 100 before it got laggy. I'm not sure what the maximum concurrency is, because my MacBook gave in before the server did.

Let’s see if we can get enough visitors to the live site to break it!

Check out the final result at

Words by David Broner & Jack Wild