Hello Owl: How voice technology improved the life of a woman with MS

Smart Design
13 min read · Mar 12, 2019


By John Anderson

66-year-old Susan has been living with multiple sclerosis for twenty years. BBC Two’s Big Life Fix challenged Smart Design to create a voice solution for Susan that would let her continue to live her life independently. After researching different platforms and voice assistants, the team chose Amazon Echo and created a layer around the product — an ‘accessibility jacket’ — that helps her complete tasks on her own.

The Smart Design team went through rounds of iterative experimentation, prototyping and designing to figure out how Susan might navigate voice technology to suit her needs. In just four weeks, the team delivered a personalized voice technology solution that now dramatically improves her daily life, helping Susan complete the everyday tasks that so many of us take for granted, such as listening to music, calling her family, and changing the TV channel. The Amazon Echo Dot unit at the heart of the solution was also given a friendlier appearance, adapted into the shape of an owl, Susan’s favorite animal.

Smart’s Ruby Steel, Jasper Dekker, Will Merrill, and John Anderson talk us through how it happened — including a disastrous first test session.

“Her own devices were working against her.”

The team met for the first time to understand what it’s like living with advanced MS and how Susan’s current products were failing to accommodate her cognitive and physical limitations. Oh, and we had to get used to being on TV.

Ruby Steel, Senior Design Strategist at Smart Design: I was approached by a production company that was developing Big Life Fix for BBC Two. I was lucky enough to fit into whatever kind of role they were looking for because a lot of what I did at the Royal College was very human-focused. I was really interested in design specifically for a good impact on someone’s life, especially vulnerable people, so a lot of my portfolio is in that area.

Jasper Dekker, Senior Interaction Designer at Smart Design: Ruby spent a fair amount of time with Susan to get to know her, her environment, and her patterns. And of course, she has a lot of limitations. So, for all the technology she has in her house, it’s all, in one way or another, completely failing her. What we did was investigate the technology available off the shelf to see what we could bring to Susan.

Ruby: Before I met Susan, I’d been given some footage and a bit of upfront research and context about her… there were loads of things that she was struggling with. One of the things that really struck me was how much the devices in her own home were kind of working against her.

Will Merrill, Senior Industrial Designer at Smart Design: It was a bit weird because there was this TV crew kind of lurking behind us, and it was hard to put that out of your mind while talking to Susan. But it was really interesting to see the technology gap between Susan’s world and what was currently possible.

“I’m sorry, I didn’t get that?”

Voice was clearly a path for her physical needs early on, but how does Alexa stack up conversationally? Well, that’s a different problem…

Ruby: I asked her daughter “have you considered an Amazon Echo?” [Susan] didn’t know what it was. I mean, why would she? She hasn’t been able to use a smartphone and she hasn’t been able to use a computer in many years.

Jasper: Products [like Amazon Echo] available on the market now just didn’t quite do it for her, from a conversation perspective. We felt that we could use available technology and hack our way around a custom experience that we’d be designing from the ground up.

John Anderson, Technology Director at Smart Design: There’s a barrier to entry to voice, even for early adopters. A large challenge was to remove that barrier.

Jasper: It quickly became clear that the smart speakers currently on the market are far from being “human” experiences. She would simply say, “oh, I want to listen to some music,” expecting [Alexa] to come back with options, or some sort of guidance. You know, anything — way more specific than she would normally expect when asking a human being for music. So that was the interaction layer that we saw failing, rather than the technology layer.

John: The bizarre theory we came up with was that in order for us to be flexible to Susan, we needed Alexa to be less flexible from an entry point perspective as well as an answerability one.

Jasper: I believe there is a phenomenon called the Paradox of Choice — that humans have so much choice, which is usually communicated as a very good thing in a world of thousands of options. But actually, that’s when people don’t make a choice at all, because it’s [all] too much.

“How can we humanize the machine?”

Now that the team had chosen a voice path, there were two key next steps: tailor the conversation to Susan and create something physical that Susan could connect to at an emotional level. Fortunately, the latter was staring us right in the face.

Ruby: How could we get her emotionally connected? With something that she can trust and knows is patient? We noticed that she had [ornamental] owls absolutely everywhere and [they were] something that she talks about in an affectionate way… pointing them out to us.

Will: We visited one day and just looking around we saw all these ornaments or models of owls… It wasn’t actually going to be an owl, to begin with. It was going to be a birdhouse!

John: We were originally going to build a birdhouse to house all of the electronics, but it was outvoted in favor of an owl, which she had more of a connection with. It’s super interesting to humanize the machine to the point where she was willing to be patient because, for her, it felt like something she could interact with emotionally.

Ruby: [When] we gave this technology the form of an owl, it was quite extraordinary how much more patient she was.

“Play ’60s music”

Susan wants to perform tasks that we take for granted every day. How could a small personalized menu lead Susan to use Amazon’s core features where so many products couldn’t before? To add more complexity, how could we present everything together in front of a TV crew in three weeks?

Will: Looking at the current products, they all rely on some kind of visible interactive element, or lighting up depending on their mode or function. But it’s really quite subtle and I think with Susan’s eyesight being what it is, we wanted to really amplify that. We got to that through roleplaying; we had some really lo-fi prototypes that we’d made… we just printed out [on] paper that we would hold up, depending on what mode it was.

Jasper: Susan watches TV, she turns lights on, she listens to the radio, she makes phone calls… in theory, that’s all these smart speakers were supposed to help you with. But of course, it has a certain syntax, a certain set of rules that you need to remember because you can’t see them, and they’re all slightly different.

So, we started with an Amazon Echo. You can write Skills for it, but that’s so limited today that we learned early on that it wasn’t going to work. So, what if we have our own voice that just whispers to the Alexa what to do? Simple microphones and speakers placed close to an actual Amazon Echo that then would act on all of the commands that our interface would give to it.
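
To make the “whispering” idea concrete, here is a minimal sketch of that first approach: a script that maps one of Susan’s simplified requests to a fully formed Alexa command and speaks it aloud through a small speaker placed next to the Echo. The phrase list and the choice of the espeak text-to-speech tool are assumptions made for illustration, not the prototype’s actual code.

```python
import subprocess

# Hypothetical mapping from Susan's simplified requests to the exact
# phrasing the Echo expects. Every phrase here is illustrative only.
PHRASE_MAP = {
    "music": "Alexa, play sixties music",
    "radio": "Alexa, play the radio",
    "call": "Alexa, call my daughter",
}


def whisper_to_alexa(request_key: str) -> None:
    """Speak a fully formed Alexa command through a speaker placed
    next to the Echo, using the espeak text-to-speech command line tool."""
    command = PHRASE_MAP.get(request_key)
    if command is None:
        return  # unknown request: stay silent rather than confuse the Echo
    subprocess.run(["espeak", command], check=True)


whisper_to_alexa("music")
```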

John: We then pivoted to Susan working through an Alexa Skill which would then hit a cloud and come back and tell the Raspberry Pi to say it back. It was still a little quirky because there were two voices involved.
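
The pivot John describes can be sketched, again under assumptions, as an Alexa Skill whose intent handler forwards the guided response to a small web service running on the Raspberry Pi, which then speaks it in its own voice. The intent name, the Pi’s address, and the wording below are hypothetical; only the general shape (a Skill in the cloud, the Pi doing the talking) comes from the account above.

```python
import requests
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

# Hypothetical address of a small web service on the Raspberry Pi.
PI_ENDPOINT = "http://raspberrypi.local:5000/speak"


class PlayMusicHandler(AbstractRequestHandler):
    """Handles a hypothetical 'PlayMusicIntent' and asks the Pi to speak
    a short, guided list of options back to Susan."""

    def can_handle(self, handler_input):
        return is_intent_name("PlayMusicIntent")(handler_input)

    def handle(self, handler_input):
        # Tell the Pi what to say in its own voice.
        requests.post(PI_ENDPOINT, json={
            "text": "Would you like sixties music or the radio?"
        })
        # Return an empty response so Alexa herself stays quiet; in the
        # prototype this split is why two voices were involved.
        return handler_input.response_builder.response


sb = SkillBuilder()
sb.add_request_handler(PlayMusicHandler())
lambda_handler = sb.lambda_handler()
```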

Ruby: We were solving problems every day… that was really exhilarating and exhausting at the same time. I’d say we were all emotionally drained by that point because the other thing is that you know when you are designing something for just one person, the aim is focused on that person and just you.

Jasper: I ended up hacking the Echo even more … trying to fetch the status of the actual device because it has a little ring on the top that can show colors… so that’s why we decided to take that light and use it to elevate the feedback and feed it forward into something that was much more clear to her. So, we ended up going for a physical object — an owl in her case, because she loves owls. And the owl can light up to tell her a little bit more about the system, to really augment the status or the functioning of the system. That was done through a light that was just a little bit more than what the Echo does. It breathes… it just goes on very slowly and then it goes off, it’s very soothing and relaxing in a way.
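
The “breathing” light Jasper describes is straightforward to picture in code. Here is a minimal sketch for the Raspberry Pi, assuming the owl’s LED is wired to GPIO pin 17 and driven with the gpiozero library; the pin number is an assumption made for illustration.

```python
from signal import pause

from gpiozero import PWMLED

# Assumed wiring: the owl's LED on GPIO pin 17 of the Raspberry Pi.
owl_light = PWMLED(17)

# "Breathe": fade slowly in and out on a background thread, as a calmer,
# more visible alternative to the Echo's own light ring.
owl_light.pulse(fade_in_time=2.0, fade_out_time=2.0, background=True)

pause()  # keep the script alive so the light keeps breathing
```

gpiozero’s pulse() fades the LED in and out continuously, which is what gives the slow, soothing on-and-off rhythm described above.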

Will: We loved how the owl lighting came out even though we were so short on time…we were excited to show Susan what we had.

Ruby: Susan was so gracious and kind to us. You know, we just wanted to give her this gift. Every time it worked, we were even more elated than usual… I remember the night before we went to see her. It was late, we were tired, but we were so hopeful and excited.

Jasper: We were so thrilled. Like OK, awesome… we’re going to show Susan the big reveal and there’s going to be camera crews… we had a very early start, 5 am Saturday morning, in December. So, it was all dark. And then… things started to break down.

Ruby: The anticipation was built and built, and then we came back at this moment where you present this thing with quite a lot of theatre. Here is this creation that we’ve made… and it just absolutely tanked. Like, none of it worked, and it broke in whole new ways that we weren’t even expecting. Just incredible. I couldn’t believe it…

Will: And it was just so disheartening… I’m sure there’s plenty of footage of Ruby really, really defeated because it just wasn’t working…

Ruby: We wanted to reach the finish line and celebrate that moment. And Susan was so patient with it … but it confused her more… it was very frustrating because I knew it worked the day before … it was just different acoustics, different physical environment, even a different kind of a Wi-Fi network … all those little things that you just cannot anticipate in your bubble of a design studio.

John: I remember where we were supposed to get beers that day…

Ruby: Awful.

Will: I think we all just thought, we can’t end with this. And then I guess we had to pick ourselves up and regroup.

Ruby: We were trying to say [this] should be an enabler technology. It should be better to be living with MS now than it ever has before. So, we went to see a fellow Big Life Fixer who works at Imperial College London. And he was like, ‘Wow, you guys, you’ve done a really great job. This is great.’ But [then] he says, ‘Why is there a Raspberry Pi in there?’ I think it was just one of those moments where someone from outside of the project is able to walk in and say, wait a minute… what if you used two Alexas? That really changed everything.

That was a kind of eureka moment… What it meant was that it became much more of a software challenge than a hardware challenge, and we could just use the very advanced, very good microphones and speakers in the hardware… not try and reinvent the wheel in that sense but actually let Amazon deal with the hardware. And then try and push the software. And then we had to go even further to try and find a way for these two things to talk to each other because it wasn’t what they were designed to do.

Will: And then we kind of came back with our tails between our legs. But it felt way more stable than it did before.

Jasper: Yeah of course it was. It was our final chance… fortunately, it went very well.

Ruby: The second time was actually kind of unbearable because the first time was such a blow… coming back to the same thing with cameras again, with Susan again, was actually even more nerve-racking. And essentially, I have never been so relieved… it actually worked.

Ruby: Bear in mind that before we started this project, for her to change the channel on the TV took her two minutes. That’s ridiculous. And then she very quickly started to play around with it, and that was perfect because she was very confident… she was at ease with it. She trusted the system and she very quickly figured out a way to interact with it. And it was awesome. It was so good to see the footage from the film that day.

“Can you do me a favor and call me whenever you feel like it?”

After we accomplished our goal with Susan’s owl, we took a step back and did a retrospective on what it was like designing for one and how it impacted her life.

Jasper: It was very interesting, and I think good to focus on one particular person… we really had to go back to the foundation of what this technology is about, and what does it do really well?

Will: I was excited to reveal a product that I was quite happy with… I thought, at least I made something that [Susan] engaged with! Kind of trepidation combined with a bit of excitement.

Jasper: It was designing from scratch in an area that was uncharted, really. There were no solutions ready or patterns that we could use… because this whole voice smart speaker thing is so new, and everybody is raving about it, but nobody has really nailed how to make it accessible for everybody or how to use it in the first place.

All the big companies — the Apples, the Googles, the Amazons — they don’t have the answers. They are as experimental as we are… we do it on a very small scale, but they put products on the market not really knowing whether they’ll nail it or not and then wait for feedback, see how people use it… and that’s a really long process… and yet we found ourselves having to go through that process in about four weeks, which was quite challenging!

Will: Designing for one person… really creating this narrative which spoke to needs and desires. In the consultancy world, you wouldn’t have the option to design something like that, because it’s so personal and it’s so unique. For me, that was the really amazing thing about this project.

Ruby: I think that’s [what] I passionately believe, that ‘designing for one’ is a really valuable way of designing… you get to a deeper level of understanding than I’ve ever experienced when understanding a group of people. It’s the journey you go on with that person, I think… something that I don’t want to ever lose in a project — wherever I’ve got the opportunity to spend more time with one person, I will always want to do it.

John: Designing specifically for Susan allowed us to go much further than traditional proofs of concept. I was able to trim a few corners from a coding perspective because the content was specific to her. If this was a pilot with 100 people, we would have spent much more time trying to make the conversation tree dynamic and able to adapt to people’s personal preferences. We remained hyper-focused on our goal of how voice accessibility could help someone like Susan, and in that respect, we felt like it was quite successful.
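
A conversation tree hard-coded for one person, as John describes, can be as simple as a nested lookup table. The sketch below is purely illustrative (the prompts and options are invented, not Susan’s actual menu), but it shows why designing for one let the team skip the machinery needed to adapt dynamically to many different users.

```python
# Illustrative, hard-coded conversation tree for a single user.
CONVERSATION_TREE = {
    "start": {
        "prompt": "Hello Susan. Would you like music, the TV, or a phone call?",
        "options": {"music": "music", "tv": "tv", "call": "call"},
    },
    "music": {
        "prompt": "Shall I play sixties music or the radio?",
        "options": {"sixties": "play_sixties", "radio": "play_radio"},
    },
}


def next_step(current: str, answer: str) -> str:
    """Return the next node for a recognized answer, or stay on the
    current node (so the same prompt can be repeated) otherwise."""
    node = CONVERSATION_TREE[current]
    return node["options"].get(answer.lower().strip(), current)


print(next_step("start", "Music"))    # -> "music"
print(next_step("music", "sixties"))  # -> "play_sixties"
```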

Jasper: Exactly. It’s exciting, but it’s also very much a technology push for these companies that make these products — they’re not answering a need. Our usual design process is to understand people and find insights or unmet needs that we can then answer using the technology available to us.

Ruby: I think one of the really wonderful things was the time I said to Susan, ‘Could you do me a favor — could you call me? Just whenever you feel like it.’ And a few weeks later I was at home on a Saturday, and my Alexa rang, and that was just so wonderful. That was the moment where I really felt like we’d done what we set out to do…we just had a chat about life and general stuff. It’s what it’s all about… I don’t want ever to stop doing that kind of thing because it just matters more than anything else… that’s amazing.

This oral history was captured and produced by John Anderson. The Project Susan team includes Ruby Steel, Jasper Dekker, Will Merrill, and John Anderson from Smart Design. Check out our video recap of the project here.
