Ethics

Designing for dichotomies

Josh LeFevre
Dichotomy
6 min read · Dec 10, 2018


This article explores the dichotomy between ethical design and value design. It is intended as a short thought piece to spark future discussion.

“Ethics” — Digital collage created by Josh LeFevre

Working Ethics Framework

I explore the idea of ethics as a philosophy that focuses on right and wrong behaviors for individuals and within communities. Based on my reading and the etymology of the word ethics, I see ethics as containing three components.

  • Beliefs/Ideas: Beliefs are the truths or principles that an individual or a society holds and strives for. These principles typically stem from religious, philosophical, or scientific perceptions of the world.
  • Morals: Morals are, as defined by ethicists, right or wrong behaviors based on agreed-upon beliefs. Being moral begins as a commitment to oneself to live and act in a certain way regardless of situation or location, and it is often what defines one’s character.
  • Values: Values are one’s personal assessment of a belief’s merit or importance. One belief may lead to various individual value decisions. In some ways, laws try to assign a particular value system or persuade people to follow one.

These components make up the framework for any discussion of ethics. By focusing exclusively on any one of them, individuals are not really discussing ethics. Not understanding this framework leads to the dichotomy in design discussions: critiquing values rather than the holistic ethical framework of ideas or products at scale. I believe it is important for designers to consider the context and ethics of those for whom they are designing, and to design for ethics, not only values.

(left) Brief etymology of the word ethics; (center) Ethics word use in printed books over time; (right) Use of the word ethics online in the past 5 years

Intra- and Interpersonal (Personal) Relationships

The design dichotomy here is that designers help build technology intended to help individuals connect and build a stronger network of morals, but instead the technology isolates individuals through artificial empathy and alone connectedness, which weakens the network. This focus pits organizational and individual values against each other without thought for the overarching ethical considerations, thereby ignoring the morals and beliefs of those for whom they are designing.

Intrapersonal and interpersonal communication studies, not media communications, explore the relationships and interactions one has within one’s own mind/self and with small groups (fewer than seven) of people or objects at any one time. These daily interactions develop one’s sense of beliefs and morals about what is right and wrong. Due to reciprocity theory, humans tend to surround themselves with individuals who think, act, and talk as they do, thus reinforcing their worldview. These relationships tend to create more stable ethical behavior because of a shared belief and value set. As Sherry Turkle discusses, when technology is placed as a mediator between these individuals, whether a text message or an AI assistant, the connection is lost and the individual’s relationship with what is right or wrong becomes distorted through alone connectedness.

Questions to consider:

  • What impact will artificial empathy have on a person’s life? A generation’s life?
  • How will a new product intentionally or unintentionally isolate individuals?
  • When are an individual’s ethics superseded by organizational need?

Social/Community Relationships

The dichotomy here stems from an organization’s need for committed users to support its stakeholders, while users want to be in, and not just feel like they are in, complete control of their world. The organization thinks less about the community and more about its own values. Because organizations have found success applying decision-making psychology, users looking for community must navigate a landscape of decision landmines.

This brings up the sticky idea of building technology for communities of people. B.J. Fogg from Stanford’s Persuasive Technology Lab has been studying human decision making and behavior in relation to value and choice for many years. Building on Fogg’s research and work from the decision sciences, many organizations use this research not to nudge behavior through libertarian paternalism or character development; instead, they tend to short-circuit a community’s natural beliefs, values, and morals by appealing to what Kahneman calls our “system one brain” or “lizard brain” way of making decisions, offering choices that present the least resistance. Our system one brain responds to quick and easy tasks, while system two is more analytical and purposeful.

To entice system one to take the lead, technology uses distractions, notifications, large “add a comment” boxes on social media, and auto-play videos, to name only a few, that temporarily distract us and keep us living in the moment and doing what’s easy rather than choosing what we really want (or need) to do, as Tristan Harris critiques. Thus, companies and organizations are using our psychology to build addiction to a platform, just as many books posit, including Don’t Make Me Think. As the addiction grows, the new lawmaker or owner of the platform begins to retrain communities about appropriate morals or beliefs. This type of approach is often labeled “dark patterns” or “frictionless decision making,” and is often cited as one aspect of the growing victim culture present in many countries around the world.

Questions to consider:

  • How does this platform design impact current and future decisions?
  • Is what we are making based on our values or the values of the users?
  • Is interrupting a user’s decision-making process adding value to their lives or only to ours?

Global Relationships

Thinking of ethics at a global level. Picture credits

At a global level, design decisions become even more difficult. Designing across communities presents designers with the dichotomy of designing for organizational values, local values, or a holistic view of ethics. This is the scale at which most designers tend to begin critiquing ethics through tweets, blogs, snide comments, and transition design. However, it is easier to debate individual ideas about how something should be done than the morality or beliefs surrounding a principle or concept. One example is the Zuckerberg quote below. Facebook is not the only company facing this dilemma, but it may be the most public in recent months.

“However, that may be good for the world but it’s not good for us unless people also share back to Facebook and that content increases the value of our network. So ultimately, I think the purpose of platform — even the read side is to increase sharing back into Facebook.” — Mark Zuckerberg

Looking forward, designers must be willing to consider how their designs shape or impact the daily ethical decisions of individuals they may never see or meet, whose values may be completely different and distinct from their own. Instead of griping about the unfairness, look beyond personal values and design for global ethics.

Questions to consider:

  • Why are we designing around the emotionally triggered values of a society instead of its beliefs and morals?
  • What is the intended action or goal of a particular design, and how could it be misused?
  • How do designers align morals and beliefs while maintaining individual values?

My takeaway

In short, I believe that an ethical design only exists if the complete ethical package of beliefs, morals, and values inform the goals of the persuader/designer and are aligned with the goals of the individual being persuaded.

“Do we design for the world we have or do we design for the world we want?” — David Danks

I welcome your thoughts and comments on this topic.


I am a human who grew up loving science and who realized that the bloom of design brings life and context to humanity while making science approachable.