Finding the limits of responsible design

Hidden fault lines and how to spot them

Albert Shum
Artefact Stories
6 min read · Jun 20, 2019


Fault lines blur as modern systems, platforms, products, and experiences proliferate.

The Responsibility of Design is an event series exploring trust, ethics, and the future of design. In partnership with the Common Cause Collective and AIGA Seattle, Artefact hosted Albert Shum, CVP of Design at Microsoft’s Experiences & Devices Group, to join us in the studio for a conversation with the Seattle design community. The spirited discussion centered around the limits of digital design, and what designers can and should do when working at the cutting edge of transformational technologies. Here, Albert Shum expands on the question: Where is the Line?

In classic product design the limits are well-defined. The margin for error is clear, apparent in the breaking point of physical material. You test for this. You set the limit and put your product through the wringer to push it two times, five times, ten times further. This sets the margins to ensure something is safe for anyone to use, even in the most extreme conditions. Everything eventually breaks, but that moment should live within the limits of responsible design.

Think of safety improvements to something like car windshields, from standard glass to laminated, then tempered, and eventually the enhanced version we have today. Knowing that windshields inevitably break, the industry took care to protect passengers as best it could. A century of testing and improving, testing and improving, to keep people safe as automobiles spread through our way of life. In the case of today’s technology — and I mean this as the current state of super-connected devices, platforms, and apps that keep us continuously plugged in — we don’t have a century’s worth of incremental improvements. We barely have a decade. The speed at which digital experiences have become ubiquitous means we’ve had little time to consider the consequences. Only now are we seeing places where we’ve crossed the line, perhaps because we were moving too fast to notice.

Zero to sixty

That line — that point of inflection where things begin to break — has become a cause for debate among modern designers. As more “product design” evolves into the realm of digital experiences, the margin for error is constantly adapting and much less obvious. Again and again, where’s the line? is the question posed to designers and technologists, and it seems the line is always moving. Where’s the line between want and need? Suggestive and declarative? Engaging and addictive? Privacy and freedom? Government and big tech? There has always been a real-world impact to what we create, but the consequences of opacity in product development have become more global, more glaring, and more complex. We’re not just testing in extreme conditions. We’re seeing new extremes emerge.

This is where designers can help to slow things down and consider the human at the other end of the experience. Arguably, the line is very clear. You know where it is. What you do when you come upon it is another thing.

Define the line

There is plenty of design that goes beyond the limit and in most cases, the product creators just weren’t aware of it. This isn’t meant to condemn the industry more than it is to challenge it. If we see now that the future of technology relies on responsible creation, do we need to continue at a breakneck pace? Again: are we creating things people want, or that they need? Are we suggesting certain behaviors or dictating them? Are we giving people the freedom to choose? Are governments and tech companies working together in the interest of the public? Are we deferring to the human, or the tech? Asking these questions not just once but at every stage of product development has arguably become the responsibility of design. If there is a frontline between the tech and the customer, it is defined by our power to spot patterns that could lead to imbalance. We can see the darkness and shine a light on it. There’s a bit of clairvoyance in responsible design: seeing the line way out in the future, and pushing the design beyond its limit. Testing every possible outcome, good and bad, in a safe space.

This is the idea behind shipping minimum viable products (MVPs) in technology. MVPs are incomplete by design, meant as an opportunity to flight ideas, learn, and adapt in real time. When we see a failure, we correct it. When we notice that we’re potentially changing behaviors, we analyze it. When we discover an additional need, we build for it. These tests are carried out in controlled environments where designers and researchers have the explicit permission of our customers to hear their thoughts and learn from their lives. Or, consider scaling this practice using artificial intelligence. Is it better to have millions of simulations find the limits to your designs, or to have millions of people trust you to get the design right?

There’s no easy answer, and these practices aren’t just about finding bugs to fix. It’s a whole new way to find the line, using methodologies that, in part, find us wearing the black hat. We have to push beyond expected behaviors to test the limit of not just the technology, but the limits of the people who use it, and how it affects them.

In practice

Our job is to see the line, cross it, question everything, and build it up again, over and over again until a product can be trusted in the real world. This takes time. Asking “what could go wrong?” should never be a hypothetical question, but a design requirement to answer. Giving teams the time and space to ask these questions and play out myriad scenarios is integral to responsible technology. You have the power to draw the line yourself, rather than waiting for your product or experience to stumble far past it, beyond your control. A disaster happens very quickly, but how you end up there is a slow burn.

If we want to continue as true product designers in the digital age we need to be more intentional. Humans are impacted by design, plain and simple. Know that there is a limit and we are responsible to it. We can’t assume that people will adapt or technology will simply improve itself. In fields like medicine, architecture, and agriculture, consumer safety regulations define the experience. A catastrophic event resulted in harm and strict policies followed. Why are we slow on the uptake in the digital field? Are the consequences more complex, or just harder to define? If technology causes harm, what are we doing about it? As a designer, here is how you can hold yourself accountable and start to create meaningful change:

  1. Learn from the past
    One of the biggest challenges as a designer — and as a human — is to acknowledge when things have gone wrong and learn from those mistakes. We’re conditioned to be perfectionists in a world where nothing is perfect. Moreover, as modern designers, we’re being trained to design with components and libraries that literally repeat history. Are those components sound, from a human perspective? Has anyone questioned the UX flows we keep reusing? Are we going back to our customers and asking what’s changed, rather than relying on historical information? What you can do today is, again, slow things down and think critically through the next experience you create. Don’t just do “what always works” — bring the line from the past into the implications of this new future. Those who do not learn from history are doomed to repeat it.
  2. State the obvious
    We are all afraid of stating the obvious, particularly in a professional setting. Will I sound unintelligent? Do I just not understand the technology? Maybe they already thought through this scenario. In fact, common sense is rarely common practice. People naturally circumvent potential roadblocks. The more obvious a scenario, the more devastating the unintended consequence is likely to be. Don’t think of it as being obvious, but being rational. If you see something in a product review that could be misused or misunderstood by customers, let’s hope your teams would want to know that. It’s much more difficult to regain trust than to simply do right by the customer the first time. The trick here is to voice your insecurities — they’re doing more damage in your head than spoken aloud. Start by saying “this might be obvious, but…” to be clear and let people know where you stand.
  3. No open secrets
    Similar to the conundrum above, there can be a diffusion of responsibility on product teams. The problem is compounded when you’re designing at scale. Surely everyone sees what I see, you think. Someone will say something. But of course, hundreds of people thinking the same thing doesn’t mean hundreds of people saying the same thing. This is the phenomenon of open secrets, and it’s extremely damaging to both culture and product. Assume nothing and opt to speak up. If the audience or the room or the conversation is intimidating, find a friendly ear after the fact over email or one-on-one. It is your responsibility to say what everyone’s thinking, particularly as the voice of the customer.

Albert Shum

CVP of Design at Microsoft. Leads a collaborative team creating the future of cross-platform experiences across work, life, and school. Views are my own.