The Social Responsibilities of Interaction Designers

Design is dangerous. When you strip away the surface layer, design is about changing and managing perceptions, opinions, and behaviors: design is a method of influence. And influencing people, as it turns out, is alarmingly easy.

Justin Secor
Aug 12, 2014 · 6 min read

The science behind interaction design is partially rooted in the study of human behavior, which is a well-researched and documented branch of psychology. Behavioral scientists have charted much of the human brain’s complexities and have armed designers with simple, effective tools for generating influence, and you see the results everywhere. The advertising world uses behavioral levers like scarcity, anchoring, and loss aversion to get you to buy more stuff, political strategists design behavioral levers into campaigns to win congressional seats and presidencies, and interior designers do the same thing to keep gamblers within the walls of casinos (and losing money). As our agendas shift from making things usable to changing human behavior, our influence as interaction designers has never been stronger.

This is a serious problem.

Why? Because if design is influence, then designers are dangerous people. On one hand, design can push culture forward, illuminate paths to prosperity, and eliminate bad habits. On the other hand, design can hinder progress, rob people of wealth and happiness, and foster addictive behavior. Influence is an ethical dilemma, and designers are right in the middle of it, knowingly or otherwise.

Unfortunately, avoiding unethical practices in design—like those documented as dark patterns—is only half of what interaction designers ought to do. People with influence have social responsibilities beyond not being evil. Often, these responsibilities are codified as principles, and in some cases—as it is for politicians—those principles are sworn to be upheld.

When I became a designer, no one asked me to swear to use influence for the right reasons, no one trained me to know what the right reasons were, and I was given no ethical principles to follow. Some smart folks are starting to think about the ethics of software design (see The Ethics of Code by Jonathan Harris) and others are trying to shake us out of a stagnant, narcissistic design culture (see Design Culture is a Frozen Shithole by Cole Peters).

So let’s give it a shot. Below is what I think our ethical principles should be: the four social responsibilities of interaction designers. My list isn’t perfect, and it reflects the biases that I’ve accumulated in my own career, but it’s where I’m going to start.

When Don Draper-esque advertisers started using sex to sell products in the golden era of advertising, maybe they didn’t realize the ramifications, but they were significant (and most likely irreversible). Sexual imagery is used everywhere in America to push products, and the cultural effect is unsettling: researchers have observed a significant negative effect on self-image in adolescents, a strong correlation with the development of eating disorders in women, and an increase in depression and anger among teenaged boys. Advertising designers aren’t the only people to blame for this effect, but they definitely contributed.

Similarly, when the Web entered its “2.0” phase (remember when we used that term?), interaction designers somewhere built commenting systems that fostered fast, anonymous, unmoderated posting by anyone with an opinion to share. For better or for worse, their cultural effect is now obvious: not only are comment threads on sites like YouTube filled with vitriol and slur-filled bickering, but people have grown accustomed to feeling digitally invincible, going so far as to post fake bomb threats on Twitter. Some of us are learning the truth about “invincibility” and anonymity the hard way, and the unfortunate status quo of Internet discourse is best summarized with a warning: “don’t feed the trolls.”

How will the interactions you design affect the future? How will new user interface patterns change cultural perceptions? You won’t be able to predict the future—or all of the complexities of your cultural effect—but you have a responsibility to plan for it to the best of your ability.

We know a lot about our own users: we know what devices they use, the browsers they prefer, whether they have JavaScript enabled. Knowing these specifics helps us build better interactions, but it can also lead to exclusivity: we design for a specific user, passive users become disinterested, and the homogeneity of our user base increases. What if we’re intentionally designing for a specific, small group of people? Exclusivity isn’t necessarily bad, but it definitely is when we exclude people who want to use our product yet cannot: people with disabilities, for example.

If Google Analytics told me that zero visitors to my app were using a screen reader, it would be easy to rationalize ignoring accessibility for blind users. Most designers wouldn’t, because we know the importance of accessibility and are often legally mandated to address it. But consider the other types of users you may be ignoring by looking only at current user analytics: people who can’t afford fast devices, people with cognitive disabilities, people who aren’t digital natives. Socially responsible design balances the benefits of exclusivity and inclusivity to the best of its ability, and when research offers little or no user insight, it errs on the side of inclusivity.

One unfortunate fact about the Web today is that it consumes a lot of resources. That’s an issue that feeds bigger problems, like fossil fuel depletion, air pollution, and climate change, so every kilowatt-hour spent should be put to good use with as little waste as possible. The world’s data centers alone consumed roughly 76 billion kilowatt-hours in 2012, more than twice the energy used by the city of Paris, and that’s just to keep the Internet’s engines idling.

Design doesn’t just impact the environment at a large scale; it has a micro effect on your users in a lot of ways. Interface design, for example, has an independent effect on device battery life, influences perceptions about energy use, and adds its own contribution to your users’ utility bills. A bloated interface is an energy-inefficient one: not only are you sending more digital assets over the wire (at an estimated energy cost of 4.3 microjoules per bit of data), but its cognitive overhead may demand a bigger time commitment from your users, which translates into pixels staying illuminated longer, CPUs and GPUs working harder, and fans spinning faster to keep things cool.
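To make that per-bit figure concrete, here’s a back-of-the-envelope sketch in Python. Only the 4.3 microjoules-per-bit estimate comes from the text above; the payload sizes and the `transfer_energy_joules` helper are my own illustrative assumptions, not measurements.

```python
# Rough transfer-energy estimate, using the ~4.3 microjoule-per-bit
# figure cited above. Payload sizes are hypothetical examples.
JOULES_PER_BIT = 4.3e-6  # estimated energy cost per bit transferred

def transfer_energy_joules(payload_bytes):
    """Estimated energy to move a payload over the network."""
    return payload_bytes * 8 * JOULES_PER_BIT

lean_page = 500 * 1024      # a lean ~500 KB page
bloated_page = 3 * 1024**2  # a bloated ~3 MB page

print(f"lean page:    ~{transfer_energy_joules(lean_page):.0f} J")
print(f"bloated page: ~{transfer_energy_joules(bloated_page):.0f} J")
```

By this estimate, a 3 MB page costs roughly six times the transfer energy of a 500 KB one; multiplied across millions of page loads, trimming payload weight adds up quickly.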

Other products—take kitchen appliances, for example—are designed to be energy efficient, and in some cases, efficiency can be the prime selling point. Interfaces and interactions should be no different! Design them to be light and be aware of the energy footprint: these characteristics make them better user experiences and better for the planet. I’ve written and spoken about this in greater detail, if you’re interested in techniques for building energy-efficient interfaces.

I saved the most important principle — and sometimes the hardest to uphold — for last. No doubt you apply the Golden Rule to your own life in many ways:

“Never impose on others what you would not choose for yourself.” —Confucius (551–479 BC)

This rule is the codification of empathy, persisting across cultures and transcending bias, history, and worldview. It is the most elemental force that guides our ethical compass: the magnetic pole of morality that pulls the needle and governs our relationships. It cannot — must not — be ignored in interaction design.

You’re a designer. It’s your job to move users through the completion of tasks, which establishes a human relationship with your user. Who has the upper hand in that relationship? You do, because the user knows nothing about you; but their own biases, preferences, and limitations are completely exposed to any designer well-studied in behavioral science. That quiver of behavioral influences you can draw from? Make sure you’re comfortable aiming any of them at yourself. And wherever possible, choose to correct harmful biases instead of manipulating them, inform your user instead of cultivating ignorance, and remedy bad habits instead of exploiting them.

This is tough work. You may have to compromise on other goals, set yourself apart from your colleagues, and walk away from jobs that lead you down the wrong path. Using your influence, in the right way, is absolutely the hard road. Research will be your way-finding ally, and empathy will point you north.

Opower UX

A collection of articles written by the Opower UX team.
