Touch: An Interactive Installation

Christian Silver
Nov 24, 2015

At the beginning of last year, I was looking for interesting things to create in my visual arts course. I had switched over from music towards the end of the previous year (IB, so a two-year course) and I had finally found myself feeling at home with this new subject. I was ready to start thinking about how I could make things a bit different.

At the time I was getting inspired by what people were doing in data visualisation. A Numberphile video titled Pi is Beautiful was a particular inspiration. It demonstrated the work of Martin Krzywinski, who took the digits of pi and visualised them in some beautiful ways. In the image below you'll see a circular representation of pi, with links showing the sequence of its digits. The video does a good job of explaining this, so do give it a watch if you're interested.

A visualisation of pi. Credit: Martin Krzywinski

I thought it was incredible how you could turn all of these numbers into something that looks so amazing. To be fair, it is a tad arbitrary — the circular representation of pi would look very similar with a sequence of random numbers. Even so, there is a beauty to the fact we can create a distinct, visual representation of something so complex.

Let’s visualise some data

What data is there to be visualised? I could have just plucked some from the web, but I didn't really feel close to any statistic at the time. I could have tried another mathematical constant, but that would have been a tad cliché. Besides, something more dynamic, more real-time, would have been far more interesting, yes?

I came up with a simple concept. A bunch of people have their smart devices out and there is a screen in front of them. Touching your screen will have a response on the viewing screen. I wanted to leave it as a bit of a mystery, so there should be no instructions as to what people should do.

I decided to keep things really simple for the first time around so I could prototype without much work. I wanted to keep it web based so it could be really accessible (nobody wants to download an app for this sort of thing) and because I was more comfortable with the technology.

The code really isn’t worth delving deep into because it was so simple, but here’s a summary of the technologies involved.

  • Node.js as the webserver
  • Express.js as the web framework
  • Socket.io to power websockets for realtime communication
  • Raw canvas to render the touches on the viewing screen

Getting touches from phone to viewing screen was really easy to implement with Socket.io, and it was fast: with an EC2 instance in Sydney, there was no perceptible delay from Auckland between me putting my finger down and something appearing on screen.
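The server-side logic amounts to little more than relaying each touch event from one client to every other client. Here's a minimal sketch of that relay in plain Node.js. The `TouchRelay` name and `clients` map are mine for illustration; in the actual prototype, Socket.io's broadcast handles this:

```javascript
// Sketch of the relay at the heart of the installation: each client's
// touch events are forwarded to every *other* connected client.
// (Illustrative names; Socket.io's broadcast does this in the prototype.)
class TouchRelay {
  constructor() {
    this.clients = new Map(); // sessionId -> send callback
  }

  connect(sessionId, send) {
    this.clients.set(sessionId, send);
  }

  disconnect(sessionId) {
    this.clients.delete(sessionId);
  }

  // Forward a touch event to everyone except its sender.
  touch(sessionId, x, y) {
    for (const [id, send] of this.clients) {
      if (id !== sessionId) send({ session: sessionId, x, y });
    }
  }
}
```

Since the sender never hears its own events echoed back, each phone only ever renders touches locally and lets the viewing screen show the combined picture.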

When it came to rendering stuff on the screen, I wanted to keep it simple, as I had before. I came up with some basic rules for how the data would be represented:

  1. Each session (person) would be identifiable by a colour
  2. Each touch on the screen would be represented by a circle of that person’s colour
  3. Touches close to each other would be connected by a line drawn between them
  4. You could not connect to your own touches
Simple rules manifested on screen
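Rules 3 and 4 boil down to one small function: pair up touches within a threshold distance, skipping any pair from the same session. A sketch, assuming each touch carries a session id and coordinates (the names and the 150px threshold are illustrative, not taken from the prototype):

```javascript
// Rules 3 and 4: connect nearby touches, but never two touches
// belonging to the same session (person).
// Names and the 150px threshold are illustrative assumptions.
const CONNECT_DISTANCE = 150;

function connections(touches, maxDist = CONNECT_DISTANCE) {
  const links = [];
  for (let i = 0; i < touches.length; i++) {
    for (let j = i + 1; j < touches.length; j++) {
      const a = touches[i];
      const b = touches[j];
      if (a.session === b.session) continue; // rule 4: no self-links
      if (Math.hypot(a.x - b.x, a.y - b.y) <= maxDist) {
        links.push([a, b]); // rule 3: draw a line between these two
      }
    }
  }
  return links;
}
```

Each frame, the viewing screen would draw a circle per touch in its session's colour and then a line per link returned by something like this.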

This took a day or so to implement. After that, I decided to test it out in a few of my classes and at a conference I was speaking at. The response was really quite impressive. I did very little explaining beyond giving people a URL to visit on their phones, and there was always a bit of a "what is going on?" moment. At first, people would tap their screens and not notice any update on the viewing screen, and this would continue for a minute or so. Then people would start to work things out:

“Look! There’s something there!”

“Ooh! That’s me, I’m moving!”

“I’m purple… and I’m touching green. Who is green?”

“Wait, we can put more than one finger down?”

After that, everyone would go mad and move their fingers on their phones/tablets/laptops around and then you’d just need to watch it all unfold. The shapes formed with random motion were quite organic (that is, when the connection was good).

When you get lots of people on board, it starts to look interesting

It occurred to me that people spend a great deal of time trying to make things look convincingly natural and organic. I had achieved this effect rather effortlessly by using other people. It's not something I have given much more thought to, but using elements of the real world to power our visualisations could make things look a lot better. Tie this in with some machine learning and you could end up with a real winner. I wouldn't be surprised if a lot of people are already doing something similar.

This hasn't changed much since I first showed it off to an audience. I made a few bug fixes, tweaked some fonts, added some fades, but the core idea has stayed exactly the same. I had no idea that people would catch onto it so easily and so enthusiastically. The room really gets buzzing when they try it out — a group of primary schoolers got so excited they started to use their tongues as pointers so they could get 11 touches on the screen.

The buzz leads to communication, and the communication leads to coordination. The shapes become a team, and touches move in tandem or in recognisable shapes. The first example I saw of this was when I showed it to my design class early in the year — they organised all of the touches at the bottom of the screen and made them go up the screen together and then back down together, oscillating back and forth with gusto.

The heart couldn’t be caught on a screen recording but we did get it on film.

My personal favourite has to be when I got together with a group of friends to do some recording for my submission for the course. They like to impress so they ended up forming a heart. It took a heck of a lot of coordination and some impressive finger work (you couldn’t form two links in the chain) but they got there in the end.

The source code sat private for a long time; back then I wasn't quite aware of how open source worked. But in writing this, I've decided to move it to GitHub. I haven't touched the code, so it's still a bit rough — it never moved past the prototype stage. If anyone wants to have a look or a tinker, just head over to Pinpickle/phone-touch-experiment on GitHub.

Below you’ll see the video that I made when I had some friends over. I think it does a better job of illustrating how this thing works than any words I can come up with.

I still bring it out every now and again for events, but I think it would be awesome to have this running 24/7 across the world. There’s no reason the people interacting have to be in the same room. In fact, having people communicating solely through their touches could lead to an interesting experiment.
