The conviction that technologies and technologists must be ethical is not new. It is, however, newly urgent. Ever-improving digital infrastructure, increasingly useful products and services, and intentionally persuasive interfaces mean technologies are widely distributed and almost universally adopted. Wide adoption, in turn, funds ever more persuasive design, which drives still wider adoption: a vicious cycle. And so it goes that with great persuasive power comes even greater responsibility.
The Persuader’s Dilemma
To realize its intended outcome, a technology may use a combination of structure and information to force, persuade, and seduce a user. In their influential 1999 article “Toward an Ethics of Persuasive Technology,” Daniel Berdichevsky and Erik Neuenschwander introduce a framework for analyzing persuasive technologies and principles for designing them. According to their framework, a persuader (i.e., the technology designer) designs a technology that employs a combination of persuasive methods to change the attitudes or behaviours of the persuaded person (i.e., the user). In this scenario, the persuader subjects the user to their motivations and methods through the technology and, ultimately, bears the inherent responsibility for its outcomes. In this way, the means employed determine the nature of the ends.
The motives of the designer are the first challenge. Like anyone in a position of power, technology designers may take actions to maximize their own rewards (after all, self-interest is the ‘Invisible Hand’). To this end, companies set ambitious and quantifiable performance metrics (e.g., conversion rates, daily and weekly active users, time in app) for each product they develop. Then, a variety of incentives are introduced to motivate designers and their teams to achieve these goals. Typically, these incentives include financial rewards and career advancement. However, they are also accompanied by ambient rewards like social acceptance for adhering to cultural norms, the pure novelty of bringing emerging technologies to market, or the more primal satisfaction of competing with a rival company. In combination, these incentives can be nearly irresistible.
Of course, a technology designer may also be motivated by a sense of duty to beneficial outcomes. Deontology is the ethical theory that an action is morally right if it conforms to a set of rules or principles (it has been said that deontologists ‘can because they must’). To this end, a profession or company may enhance moral clarity for their practitioners with values statements, codes of ethics, or policies. The technology designer will also have their own convictions that further inform their process. Together, these guides may deepen a designer’s sense of empathy and understanding such that, as a matter of fiduciary duty, they align the methods and outcomes of the technology to the values of the user (see Value Sensitive Design). It is this messy combination of incentives and duties that introduces the primary ethical tension encountered by technology designers in any project.
Design for Good
To guide technology development towards beneficial outcomes, a designer may employ a Human-Centred Design methodology. That is, they engage in primary and secondary research activities to learn about the target users, including rounds of feedback and validation to improve aspects of the product until it is deemed acceptable and market-ready. This approach has caused a sea change within modern business culture such that entire companies are organized around customer-first and user-centred principles. Further, it is a method that, if followed earnestly, can produce meaningful benefits for users, designers, and companies.
To increase precision regarding human values in design processes, Batya Friedman and Peter Kahn introduced Value Sensitive Design: a method that complements human-centred practices so that outcomes are, to the degree possible, especially values-aligned. Throughout the design process, a broad group of stakeholders are consulted and their values become key process inputs and acceptance criteria. Value Sensitive Design has also been influential in the formation of the Privacy By Design framework developed by Dr. Ann Cavoukian. (Friedman and her University of Washington colleague David G. Hendry recently released Value Sensitive Design: Shaping Technology with Moral Imagination with MIT Press.)
Perhaps surprisingly, there is evidence that Facebook is using elements of Value Sensitive Design to combat misinformation. To generate awareness about the reliability of an article that is shared on its platform, Facebook engaged users in a research process to discover values of autonomy and expression (among what must be an otherwise expansive set). The resulting design solution is a button that provides important context about an article, its source, and the person who shared it (see Navigating ethical design in tech). However, human-centred and value-sensitive methods do not occur in isolation. They can be mixed with other methods that may not be so well intended.
Design for Persuasion
At the direction of the designer, a particular technology may employ any number of persuasive methods to ensure maximum success. Entrepreneur and author Nir Eyal decoded the growth secrets of leading technology products with the Hooked Model. This engagement-inducing sequence includes Triggers to prompt the user to attend to the product, Actions to engage users in behaviours with the product, Variable Rewards to provide the user desirable outcomes, and Investments to build value back into the product. For example, it is the Hooked Model that compels a user to engage with a push notification, ‘like’ or comment on a friend’s post, check ‘likes’ and comments for their own posts, and subsequently post new content in hopes of more ‘likes’ and comments. In this way, technology users become effectively hooked.
Within a system like the Hooked Model may also lie Dark Patterns, a term coined by London-based user experience designer Harry Brignull. Dark Patterns are intentionally deceptive interfaces that force, persuade, or seduce a user to behave in a way that solely benefits the company. Dark Patterns include, but are not limited to, the Roach Motel (where a user gets into a situation but cannot get out), Privacy Zuckering (where a user is persuaded to overshare), and Friend Spam (where a product requests permission to access a user’s friend list for one reason, and contacts them for another). While Dark Patterns are morally depraved, they can, in the short-term, be good for business (even if, in the case of LinkedIn, that means settling a class-action lawsuit for $13 million).
Finally, and drawing from the field of behavioural economics, Nudge Theory utilizes design principles to architect choices that alter user behaviour for the user’s own benefit. Nudge Theory, popularized by professors Richard Thaler and Cass Sunstein, is described as ‘libertarian paternalism’. It rejects the idea that design (and, for that matter, technology) is neutral. Instead, Nudge Theory dictates that all options remain available to users, but those options are presented in such a way as to steer users toward the best or most beneficial choice. In our previous example, Facebook nudges users to engage with its article-reliability button by animating it. However, in alignment with Nudge Theory and Value Sensitive Design, engagement with the button remains optional to adhere to the value of autonomy.
Jobs to be Mediated
Regardless of the method, a technology must, first and foremost, offer instrumental value to the user. That is, it must be an effective means to an end. Determining a technology’s instrumentality may, at one time, have been obvious. However, in the midst of a total digital revolution and a proliferation of innovations (ranging from the truly beneficial to the benign), designers require new clarifying frameworks. Jobs to be Done (JTBD) Theory unwittingly interprets Mediation Theory, as introduced by Don Ihde, delineating both pragmatic and hermeneutic technological mediations for practitioners to apply. Similarly, a Job to be Done identifies the user’s desired progress (or mediation) for a particular circumstance. It accounts for varying degrees of functional, emotional, social, and intellectual dimensions that must inform the design solution. In total, technologies mediate the very real goals, actions, and multidimensional experiences of a user. What ought a particular technology persuade a user toward? What ought the moral content of those outcomes be? One ethical theory is particularly instructive.
Virtue ethics emphasizes habits and character traits that reflect attributes like freedom, peace, justice, love, happiness, and unity. According to virtue ethics, an action is morally right because it is what a virtuous person would do. In the case of technology design, the designer ought to adopt virtuous habits and character and intend the same outcome for the user. To this end, the virtuous designer may employ a combination of human-centred, value-sensitive, and persuasive means. If adopted as a maxim, the motivation to design for virtuous outcomes may help technology companies and designers generate both the moral imagination and specificity required to challenge the imposing forces of speed and growth that otherwise dictate the motives, methods, and outcomes of persuasive technology.
Indeed, technologies are not neutral. They are necessarily designed to persuade users towards some combination of functional, emotional, social, and intellectual ends. Companies, designers, and users must learn to share the responsibility and hold one another accountable towards virtuous ends. Companies, for their part, can introduce incentives to motivate a values-aligned culture and adherence to codes and policies. Designers can themselves determine deontological principles to ensure their methods force, persuade, and seduce users toward values-aligned and values-sensitive outcomes. Similarly, users can cultivate a practice of assessing technologies for persuasive methods like the Hooked Model, Dark Patterns, and Nudge Theory to, if necessary, source alternatives for more beneficial outcomes. Truly, this is a virtuous circle.