Understanding the Connected Home: Concepts / Controlling privacy

Peter Bihr
9 min read · Jul 5, 2016

TL;DR: We wrote a book. It’s about connected homes (aka smart homes) and how we can design them in a way that makes them great to live in. It’s for practitioners. It’s available online via Gitbook (for free), and extra nicely Kindle-formatted on the Amazon Kindle store (this is also a way to support the publication). This here is one of several excerpts, with slightly adapted formatting for Medium.

Privacy is complex, contextual, cultural, fuzzy. It is hard to get right, especially in the context of connected things. Really hard.

Our vocabulary has not evolved as quickly as technological and societal change has. We are essentially stuck — for now — with crutches, metaphorically speaking.

We try to describe and solve problems of connected privacy with terminology and metaphors from non-connected privacy. This is as tricky and often misleading as trying to depict a four-dimensional figure in three-dimensional space. It’s possible to a degree, but we simply are not, as a society, fluent in doing it.

Yet controlling your privacy and making informed decisions about the factors that impact it are absolutely essential going forward. How can anyone decide which product/service/network/company to invite into that unique safe space of their home?

It might be helpful to think about privacy-related choices as consent rather than just preferences.

We need better privacy user experiences & a new vocabulary

We see two things we need in order to navigate privacy better:

  1. In the medium term, we need to evolve a new vocabulary — terminology and metaphors — to help us think about the complex emerging issues that make up privacy in the 21st century.
  2. In the short term, we need to design privacy-related interactions to be much, much easier. Controlling your privacy should be as intuitive (or at least simple) as possible. Even if simplifying means losing granularity, that would be better than the alternative of making ill-informed decisions.

Dimmers and other metaphors

One exploration of simplified privacy interfaces was a direct result of our work on the first version of this book: The Privacy Dimmer.

The Privacy Dimmer at The Good Home. Milan, April 2016.

For the Milan installment of The Good Home project, we created a series of physical explorations. The exhibition consisted of a number of speculative designs and conceptual prototypes. Two of them are directly relevant to how we could think about privacy and UX.

Privacy is a spectrum

A light dimmer allows you to control the lighting of a room gradually. In the same way, a Privacy Dimmer would give more gradual control over the amount of sensing/smartness/data processing going on in the home at any given time. Privacy is a spectrum, and it is highly contextual.

Privacy is a spectrum. We need to explore how to best express and refer to this spectrum.

The Privacy Dimmer

The Privacy Dimmer would accommodate this gradual dimming by mapping the various systems and subsystems (sensors, analytics, etc.) that make up a connected home to settings on the privacy spectrum.

If privacy were turned up to the max, no object would sense or process activity in the house — the house would be “dumb” again, in the state all homes were in until about a decade ago. Turned all the way in the other direction, all tracking, sensing and processing would be active, whatever that might mean for any given smart home.

The Privacy Dimmer allows for a gradual dimming of privacy vs sensing/tracking.

For example, if your home had pressure sensors in the floor, movement tracking for the automatic light switches, a facial recognition video system for home security, a smart thermostat that tracks your presence through a smartphone app and an Amazon Echo that listens for your voice commands, you might want all these systems on in the morning when you get ready for work and the kids ready for school. That way the temperature would be just perfect, the lights on, the weather and traffic predictions at the ready. The smart home would do what it’s built to do — make your life easier through automation.

However, in the evening you might want to enjoy a quiet dinner with your partner. Over dinner and a glass of wine you might not want any alerts and notifications, any lights switching on and off, any device listening to you, waiting for orders.

Turning the privacy dimmer would switch off all of these systems one by one. Starting with the Amazon Echo, the security camera, the movement sensors, even the smart thermostat. You’d have full privacy once more.
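As a rough sketch of this one-by-one switching (subsystem names and threshold values are hypothetical, chosen only to illustrate the idea), each subsystem could be mapped to the dimmer position below which it switches off:

```python
# Hypothetical sketch: a dimmer position of 0.0 means full privacy
# (nothing senses), 1.0 means full sensing. Each subsystem stays on
# only while the dimmer is at or above its threshold, so turning the
# dial toward privacy switches systems off one by one, most invasive first.
HOME_SYSTEMS = [
    # (subsystem, minimum dimmer position at which it is active)
    ("voice assistant (always listening)", 0.9),
    ("security camera (facial recognition)", 0.7),
    ("movement sensors (automatic lights)", 0.5),
    ("floor pressure sensors", 0.3),
    ("smart thermostat (presence via app)", 0.1),
]

def active_systems(dimmer: float) -> list:
    """Return the names of the subsystems still sensing at this setting."""
    return [name for name, threshold in HOME_SYSTEMS if dimmer >= threshold]

# Full privacy: the house is "dumb" again.
assert active_systems(0.0) == []
# Morning rush: everything on.
assert len(active_systems(1.0)) == 5
# Quiet dinner: only the thermostat still tracks presence.
assert active_systems(0.2) == ["smart thermostat (presence via app)"]
```

The ordering is the design decision: which system counts as most invasive, and therefore goes dark first, is itself contextual and would need to be configurable.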

The privacy keyfob: Controlling your privacy while out and about

The dimmer is installed in the house or room; it’s immobile. But what happens when you move about? You should still be able to make informed decisions about your privacy and control the level of sensing you participate in.

As an on-the-go extension of the Privacy Dimmer, imagine a keyfob. It’s a little device you carry with you at all times. Rather than switching any systems in its environment on or off (which would be hugely invasive and aggressive), it reads a signal from the connected home (or retail space, or public space…) and compares the amount and types of sensing and data processing against your personal preferences as stored in the keyfob.

How the dimmer and keyfob would work together.

If the environment’s sensing and tracking is compatible with your privacy preferences, the keyfob might give a subtle visual cue that everything’s ok. If more sensing takes place than the wearer is comfortable with, it gives an alarm signal.

Why just a notification and not an active sensing blocker?

We’re convinced that a purely technical solution to a social problem rarely works. In this case, the notification would put the wearer into a position to make an informed decision. In their current context, they might be comfortable sharing more than they would otherwise. They might ask their host to turn up the privacy/turn down the sensing, or they might choose to leave. It’s a social solution to a social problem.

Like the Privacy Dimmer, the keyfob would map preferences to types of services on a spectrum (even if it would likely be a rather crude system). The “no sensing/full privacy” end of the spectrum is self-explanatory, but it might be hard to find any place that matched that description, at least in urban centers.

Turning up sensing agreement (by turning down privacy requirements) on the keyfob’s spectrum, we might first agree to types of sensing and data analysis like registering our presence via our phone’s wifi signal. One notch down on the privacy scale, we would allow for CCTV; another notch down, we would agree to an always-listening device like certain types of smart home hubs or controllers.
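This notch-by-notch agreement could be sketched as a simple ordered comparison (the level names and numbers here are hypothetical, purely to illustrate the mechanism):

```python
# Hypothetical sketch: the keyfob stores the wearer's tolerance as a
# notch on an ordered scale; the environment broadcasts which kinds of
# sensing it performs. Anything above the tolerated notch triggers an alarm.
SENSING_LEVELS = {
    "wifi_presence": 1,     # registering presence via the phone's wifi signal
    "cctv": 2,              # video surveillance
    "always_listening": 3,  # always-on microphone, e.g. a smart home hub
}

def keyfob_signal(environment_sensing: set, tolerance: int) -> str:
    """Return 'ok' if all sensing in the environment is within tolerance."""
    worst = max((SENSING_LEVELS[s] for s in environment_sensing), default=0)
    return "ok" if worst <= tolerance else "alarm"

# A shop that only registers wifi presence, wearer tolerating up to CCTV:
assert keyfob_signal({"wifi_presence"}, tolerance=2) == "ok"
# A home with an always-listening hub, same wearer:
assert keyfob_signal({"always_listening"}, tolerance=2) == "alarm"
```

Note that the alarm is only a prompt: the wearer then decides socially — adjust the tolerance notch, ask the host to turn up the privacy, or leave.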

For example, we might choose to set our default to agree to a minimum of sensing. But in a close friend’s home, we might not mind their smart home hub listening in on the conversation, so we might adjust our privacy settings after a little alarm signal there rather than leave.

On the other hand, if we are in a retail space or in, say, a colleague’s or client’s apartment we might not want to share too much and adjust the keyfob setting accordingly to know when sensing takes place.

Privacy Machines: Exploring privacy through the metaphor of time

At the Mozilla Open IoT design sprint in Berlin, my group worked on Privacy Machines Inc, a fictional company’s privacy products. (Our group consisted of Rachel Uwa, Martin Skelly, Vladan Joler and Peter Bihr. You can find photos and more descriptions of the various fictional privacy machines at thewavingcat.com.)

Privacy Machines at the Mozilla Open IoT design sprint, Berlin April 2016.

One of these machines was the Wayback Machine, a little box that explores how to control privacy in the home through the metaphor of time.

Concretely, it would switch off various media-related technologies one by one — much like the Privacy Dimmer — in reverse chronological order of their introduction. Go back to 2013 and most smart home products would stop working. Turn the dial back to 2004 and you would lose access to YouTube, Facebook, and Twitter. Go back to 1996 and your internet access might be turned off.
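The dial can be sketched as a lookup from year to surviving technologies (the introduction years below are approximate and purely illustrative):

```python
# Hypothetical sketch: each technology is tagged with a rough year of
# introduction; dialling the Wayback Machine to a year disables
# everything introduced after it, i.e. in reverse chronological order.
TECH_INTRODUCED = [
    ("consumer internet access", 1996),
    ("YouTube, Facebook, Twitter", 2005),
    ("most smart home products", 2014),
]

def enabled_in(year: int) -> list:
    """Everything introduced after the dialled year is switched off."""
    return [tech for tech, introduced in TECH_INTRODUCED if introduced <= year]

# Dialled to 2013: smart home products stop working.
assert "most smart home products" not in enabled_in(2013)
# Dialled to 2004: social media is gone, the internet remains.
assert enabled_in(2004) == ["consumer internet access"]
# Dialled to 1995: no internet access at all.
assert enabled_in(1995) == []
```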

Privacy Machines at the Mozilla Open IoT design sprint, Berlin April 2016.

The Wayback Machine plays with two notions:

First, the time angle makes it immensely relatable. While there are many issues with this — it fosters nostalgia, it’s technologically and historically tricky, it doesn’t necessarily make a lot of sense — it does help start great discussions. Because it removes the technological barriers and works with simple metaphors and examples, we found that most people would engage in this kind of debate much more happily than if you approached it from the perspective of privacy, policy, or surveillance.

Second, it underlines that media and communications technologies have evolved from one-way (broadcasting) to two-way (phone) to systems that track users’ behavior through cookies, traffic analysis, metadata, etc.

Media and communications infrastructure since the advent of the modern web has turned from something watched or consumed into a system that stares right back at the user. Connected homes are extending this right into our living rooms.

What’s next?

All of these examples are speculative, non-functional prototypes. Nevertheless, we do believe they might offer valid starting points for real products and services.

As connected homes become a mainstream reality, we need to design and build products that make it easy to make informed decisions. Users should be empowered and in control of their privacy, rather than relying on companies to determine the settings for them.

As these products are built, our policies need to adjust as well. Rather than playing catch-up (and failing to do their job well as they shoot at a moving target) or killing off innovation through over-regulation (which would likely just drive the development of connected home products outside our jurisdictions into less strictly regulated regions), these policies need to be sensible and forward-looking. This is no easy task, and lawmakers will need all of our support.

In the meantime, the brunt of the burden rests on the shoulders of UX designers. As a group, they may be the best-positioned professionals, with both the skill sets and the mandate to ensure users are empowered to control their own privacy.

The authors

Peter Bihr (@peterbihr) explores the impact of emerging technologies. He founded The Waving Cat to apply these insights through consulting, R&D, conferences and publications. As a strategy advisor, he helps organizations large and small to excel in an environment shaped by digitization, connectedness and rapid change. He co-founded many emerging technology conferences including ThingsCon, UIKonf and Cognitive Cities Conference and co-chaired Interaction16. He also co-founded the Good Home Project.

Michelle Thorne (@thornet) leads Mozilla’s exploration of the Internet of Things. She serves a professional learning community seeking to shape IoT with openness and user empowerment. Previously, as Mozilla’s Director of Web Literacy Programs, she supported thousands of professional educators and activists to teach and advocate for the web. Michelle has a dedicated interest in open practices and design, curating the Mozilla Festival, exhibiting with The Good Home, writing for Open Design Now and co-authoring the book An Open Web.

> Go back to the beginning of this series.


