Catching Ghosts

Matt Cooper-Wright
6 min read · Jan 21, 2017

How we used sensors to unearth and capture real behavior for design research

Recently, at IDEO London, we’ve started to think more creatively about how to collect and use data for research during the design process. It’s easy to track user behavior when we work on digital design projects; the tools are built in. But what if we had those capabilities outside the digital space? On a recent project, design researcher Jenny Winfield and interaction designer Miha Feus found a way to collect user data in a physical/digital prototype, by using sensors to evaluate a redesigned hotel workspace for Crowne Plaza hotels. Here’s a little bit about how they did it, and how approaches like this one could change the future of design research.

What were you trying to accomplish in the redesigned space?

Jenny Winfield (JW): Business travelers often have to do their hardest work when they travel — they need to win a pitch or clinch a deal. The stakes are high. So we came up with a concept we call “Ignition,” a new meeting and working space in the hotel lobby that makes it easy to work while on the road. It combines new spaces for group creative thinking or heads-down work, a work-friendly menu of healthy food options, a new staff member to handle business needs, and digital tools providing on-demand services.

Last January, we were ready to test our concept live. As usual, we planned to be there in person for a few weeks to do interviews and gather observations, but we also wanted to measure how our new solution performed when we weren’t there. That’s where the sensors came in.

How did the sensors help you make your case?

JW: The pilot was split into two phases in each location. First, we gathered data for the baseline — testing the lobby in its initial form — then the redesigned space. We wanted to make sure we improved on all fronts: not just sales metrics, but also traffic through the space and the number of people using each type of space.

What kind of equipment did you use to build your sensor setup?

Miha Feus (MF): We used cameras to measure foot traffic and activity across different areas of the space, as well as sound sensors to gauge the volume level and infer the atmospheric buzz. And we also used seat and proximity sensors to measure occupancy and dwell time on different pieces of furniture. The actual sensing components (pressure and proximity sensors, and microphones) were off the shelf, but we built the brains for the sensors and the communication platform to gather the data from around the space wirelessly.

How did you rig up all the sensors, and what software did you use to record that data?

MF: Most of the sensors were held in place with double-sided tape. We taped cameras overhead and sound sensors on the ceiling, and placed seat sensors under seat cushions. All the data from the sensors was sent to an Arduino Yun, which served as a hub, routing the data via Wi-Fi to a Parse database. The Arduino Yun also retained a local copy on an SD card, in case there were any Wi-Fi issues.
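
The write-locally-then-upload pattern described above can be sketched as follows. This is a minimal illustration, not IDEO's actual hub code — the reading format and the `upload` callable (a stand-in for a call to the Parse REST API) are assumptions:

```python
import json
import time
from pathlib import Path

def log_reading(reading: dict, upload, local_log: Path) -> bool:
    """Append a sensor reading to a local log file, then try to upload it.

    `upload` is any callable that sends the reading to the remote database;
    it should raise on failure. Returns True if the upload succeeded,
    False if only the local copy was kept for later retry.
    """
    record = dict(reading, logged_at=time.time())
    # Write the local copy first, mirroring the SD-card backup on the Yun.
    with local_log.open("a") as f:
        f.write(json.dumps(record) + "\n")
    try:
        upload(record)
        return True
    except Exception:
        # Network hiccup: the local copy survives and can be re-sent later.
        return False
```

Persisting before uploading means a Wi-Fi outage costs nothing but latency — the local log remains the source of truth.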

To analyze data from the cameras, we used a software tool called Camlytics. It counted the number of people passing through the space automatically, without storing any video footage. This way, we didn’t infringe on people’s privacy.

The proprietary hardware we created around the sensors was also built with privacy in mind. All the sensors only had transmitters, which meant they couldn’t be hacked remotely. But even if someone had gained access to them, the chips and transmitters we were using were unable to process and transmit sound data at a level where someone could listen in on a conversation. The seat sensors worked similarly: the data we were getting only told us whether they were occupied.
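
One way to achieve that kind of privacy by design is to collapse each window of raw audio into a single coarse loudness level on the sensor itself, so only that level ever leaves the device. The sketch below is an assumption about the general technique, not the actual firmware — the band thresholds are made-up values:

```python
import math

def loudness_level(samples, bands=(0.01, 0.05, 0.2)):
    """Reduce a window of raw audio samples (floats in -1..1) to a coarse
    'buzz' level 0-3. Speech cannot be reconstructed from this single
    number, so transmitting it leaks no conversation content.
    """
    if not samples:
        return 0
    # Root-mean-square amplitude of the window.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Count how many loudness bands the window exceeds.
    return sum(1 for threshold in bands if rms >= threshold)
```

The raw samples never leave the function, let alone the sensor — the irreversible reduction happens before any transmission.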

JW: Ethically, we are committed to protecting our participants, whether they are explicitly taking part in our research, or they happened to wander into it. In any case, our main goal was to capture overall usage levels, so there was no reason to map or track specific users.

What was the hardest part about making this sensor system work?

MF: The most challenging thing was to design something that could run unsupervised for weeks on the other side of the ocean — something we had never done before. But we also had trouble with our “huddle spots” — bespoke pieces of furniture we created for this project. Because they were hard plastic stools without cushions, we couldn’t use pressure sensors to measure their use. Instead, we had to rely on proximity sensors, which are power hungry and not the most reliable.

What useful takeaways did you learn from the sensor data?

JW: The new concept was definitely popular. We had a 79% increase in people coming through the space in our first test site.

Okay, so more people walked through your space but did they stop?

JW: The number of people who sat down in the space went up by 80%, predominantly at times when those areas are usually very quiet (the early afternoon, for example). So not only did we increase use of the space; timestamps in our data showed that we did it at times when it used to be empty. That, combined with data showing that food orders were being placed while the regular restaurant was closed, was a great selling point.
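
The hour-by-hour comparison described above can be sketched with simple bucketing of timestamped events. This is an illustrative analysis, not IDEO's actual tooling, and the event timestamps are invented examples:

```python
from collections import Counter
from datetime import datetime

def hourly_counts(timestamps):
    """Bucket ISO-format event timestamps by hour of day, so quiet
    periods (e.g. early afternoon) can be compared across pilot phases."""
    return Counter(datetime.fromisoformat(t).hour for t in timestamps)

def uplift_by_hour(baseline, redesign):
    """Per-hour change in event counts between the baseline phase and
    the redesigned-space phase."""
    base, new = hourly_counts(baseline), hourly_counts(redesign)
    return {h: new[h] - base[h] for h in range(24) if new[h] or base[h]}
```

Comparing the two phases hour by hour, rather than in aggregate, is what reveals that new usage landed in previously dead periods.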

The sensors also showed us that the average dwell time in the huddle spots and nooks — spaces designed for four to six people to work together — was 20–35 minutes, which confirmed our idea that these pieces should be used for quick perches and catch-ups. It sounds simple, but it was huge. It not only meant that the design of the space itself was right, it also confirmed the importance of having drinks delivered within a 5-minute window, or having technology that empowers users to place an order in less than 1 minute. If visitors only hang around for 30 minutes, everything has to be really fast.
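
Dwell time like this can be derived from a seat sensor's occupancy transitions: each occupied-to-vacant transition closes one sitting session. A minimal sketch, assuming a hypothetical event format of (timestamp in minutes, occupied flag) pairs:

```python
def dwell_times(events):
    """Compute dwell times (in minutes) from a time-ordered list of
    (timestamp_minutes, occupied) seat-sensor events. Each True -> False
    transition closes one sitting session."""
    sessions, start = [], None
    for t, occupied in events:
        if occupied and start is None:
            start = t  # someone sat down
        elif not occupied and start is not None:
            sessions.append(t - start)  # they got up; record the session
            start = None
    return sessions
```

Averaging the resulting session lengths across all seats gives the kind of 20–35 minute figure quoted above.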

What’s the most exciting potential for sensors in future design research?

MF: As data collection methods and analytics tools become easier to use, we’ll use them as prototyping materials, much like we are using foamcore and hot glue at the moment.

JW: The ability for us to learn about user behavior remotely is most exciting to me, given that it’s something we haven’t been able to do before. With sensors, remote research can mean much more than learning how people are using a digital tool that we built. Being able to track patterns and see how people naturally interact with our prototypes (both physical and digital) when we’re not there is huge. Then it’s about looking at the resulting data in creative ways and telling a more nuanced story about the type of impact we expect our ideas to have out in the world.

This article was first published on the IDEO Labs blog. Thanks to Miha, Jenny, and Elise for their work on the original post.
