Value-Sensitive Design

in the context of interaction with a virtual assistant

Shambhavi Deshpande
Thoughts on Design… and more
5 min read · Oct 14, 2019


BIT for Fitbit

BIT is a virtual assistant concept for Fitbit that I am working on with Stef, Jisoo, and Ekta! I have had a lot of fun thinking about the Fitbit ecosystem and conceptualizing the role of a VA in the lives of its users. We are designing an active, playful companion to be with you on your fitness journey.

Keywords of Value-Sensitive Design:

based on my understanding of the four points of emphasis from Batya Friedman’s ‘Value Sensitive Design’ (pg. 4):

Process: Influencing the design of technology early in and throughout the design process.

Analysis: Bringing critical analyses of human values into design and engineering methodologies.

Adaptability: Embracing a broad spectrum of human values that evolve with respect to context.

Enrichment: Broadening and deepening design methodologies and related fields such as HCI, anthropology, psychology, sociology, and software engineering.

When I reflected upon these points, I was reminded of a brainstorming session very early in the design process of our VA BIT, in which we thought about the inclusion of different global accents (if not languages).

We also thought of inclusion in terms of accessibility towards different modes of communication.

Two Hard Problems

Complexity: Technology and society can’t just wait for value-sensitive design solutions. I was telling someone about the IxD course and how I am designing a voice-based virtual assistant. He said that voice-based virtual assistants were “in” two years ago, and asked whether I was seriously that far behind. I explained how we are focusing on emotions and social context, and somehow managed to defend the worth of my project (and my premium tuition fee!). But it is true that the technology of voice and virtual assistants couldn’t have stood still for two years, waiting for CMU Design graduates to come up with emotion-sensitive VAs that are apt in social contexts. Alexa and Google Assistant have been out there, learning on their own.

Robustness: Accounting for a wide range of parameters like values, contexts, and scale is a never-ending web of thoughts. In the initial stage of our design project, we were able to think of many scenarios. We were dreamy-eyed and wanted to design an angel of a VA. Then we approached a deadline with clearly set expectations, and we filtered our ambitions down to a tiny portion of the interaction of one single user with her VA in a watch.

Way Forward

Thinking about the human experience on all these levels seems like a starting point for learning value-sensitive design. I am happy to explain how, in the IxD project, we have been thinking across the levels of Individuals, Small Groups, and Public Places.

Stakeholders in BIT’s Ecosystem

BIT would directly affect the users of the Fitbit watch. It would also directly affect the users of Fitbit apps on Android/iOS phones and Android TVs. These users would interact with BIT on a day-to-day basis, checking sleep statistics or recording nutrition intake. BIT could help them with workout instructions or record calories burned. In short, our intention is for direct stakeholders to gain an active, friendly companion for their fitness goals.

BIT would indirectly affect people close to these users in their day-to-day lives. Most importantly, BIT could have an indirect effect on real (not virtual) personal fitness trainers. BIT is intended for someone who doesn’t have the luxury of a personal trainer; even so, if a BIT user does have a real personal trainer, BIT’s role in their life might overlap with the trainer’s. However, this is not supposed to be another AI replacing jobs and making people redundant! The way we designed this system is to facilitate interactions between trainers and trainees. For example, we thought of a Zumba instructor’s persona, who could learn the overall energy levels of her class through BIT and decide the workout intensity accordingly!

Conflicting Human Values in BIT’s Design

I read through table 2.1 in Friedman’s text multiple times. Sometimes, all of it feels like common sense. Sometimes, I find it like a summary of ‘Moral Science’ books from school. And sometimes, I feel like something is missing, though I am not sure what; I’ll try to articulate it as we move forward.

In my judgment, our VA concept, like Fitbit as a service, struggles to balance the values of privacy and human welfare. On one hand, Fitbit knows where you go and what you do, 24x7. Your data is stored and encrypted securely, but that might not be enough reassurance for everyone. On the other hand, your data is vital to the service this device provides. For instance, your pulse rate at night is analyzed by this technology so that you can learn how to sleep better.

Two instances mentioned in ch. 1 of the book:

Data privacy: Awareness began in the late 1990s, when people were concerned about cookies; informed consent then became common.

Sustainability: A team proposed solar panels on Dipoli, a heritage building in Finland, for the value of sustainable energy, but due to concerns about changing the roof’s silhouette, they revised the value to “preservation yet modernization”.

In relation to data privacy, I remembered a discussion about the key characteristics of this VA, where “private” was a key attribute of our virtual assistant. Fitbit can collect a lot of personal data about you, which can be worrying. The VA could play an important role in reassuring you about how your information is encrypted and is being used to help you.

Very recently, we were thinking about how the VA could send you little nudges to stay active or eat healthily. However, there is a very fuzzy line between a friendly reminder and an annoying taunt (or a creepy snide remark)! We asked our friends to gauge what kinds of nudges would be acceptable. Here is a snapshot of the responses to one of the questions:

Feel free to share your thoughts too with us (3–4 mins): https://forms.gle/LsH9wKkxWWXy7XAN9
