Do Dark Patterns Have Politics? The Politics of Dark Patterns

Claire Florence Weizenegger
Dec 17, 2021

--

Introduction

Design is a powerful resource. It subconsciously influences our everyday behavior, thinking, and decision making. And with any power comes the potential for misuse and abuse. Thanks to technological progress, accessing websites and applications is effortless. Yet some of them leave the user with happiness and joy, while others leave the user with confusion and frustration. Hence, within the design community and beyond, there is growing interest in the User Experience (UX) practice called dark patterns. Harry Brignull, who coined the term, defines it as follows: “dark patterns are user interfaces that have been carefully crafted to trick users into doing things they didn’t mean to do.” Often, these are not mistakes. They are intentionally designed to steer the user’s behavior toward certain decisions — mostly related to shopping or privacy.

A famous example is the cookie disclaimer pop-up that appears when entering a website. It is meant to inform the user about data collection and, where required, to obtain consent. However, these disclaimers are often designed in such a manner that the user is confused about how to engage with them. Through design elements such as color, hierarchy, and scale, the user is placed in a conflict between accessing the website and protecting their privacy. Should they click the button they see first, the one most tempting for getting rid of the pop-up, or should they look for the options that respect their privacy and security (Fig. 01)? The second option usually leads to frustration due to endless clicks through the settings. In light of this, the question arises: Are dark patterns digital artifacts that display power dynamics in order to disadvantage the user relative to the company?

Figure 01: Pop-up Window (Source: Adapted from vice.com/de)

Many scholars, especially from the field of Science and Technology Studies (STS), have critically questioned how and why objects matter politically (Shaw & Meehan, 2013). One of the most influential scholars in this regard is Langdon Winner. In his paper “Do Artifacts Have Politics?” (1980), he argues that as soon as technologies do something or embody any action, politics arises. Winner identifies two ways in which artifacts can have politics. The first involves technical arrangements that require a particular sociological system. The second centers on the idea that artifacts have politics if they are strongly compatible with a particular sociological system. As technology progresses, artifacts continue to shape modern society. At the same time, politics continues to be present in technological artifacts, which keeps Winner’s theory very relevant 40 years later.

One can easily assume that dark patterns fall into the categories Winner describes. The question I address throughout this paper is therefore: What are the politics of dark patterns? The emphasis lies on exploring the relationship between design, dark patterns, and sociopolitical power dynamics. Coming from a design background, I want to shed light on the politics of dark patterns to improve our ability to design equitable digital user interfaces. Ultimately, this enables us — designers — to apply a human-centered design (HCD) approach based on the user’s needs and capabilities. Special thought is given to the impact on the people engaging with these patterns. After all, good design should always begin with the user in mind (Norman, 2002).

Background — Framing Dark Patterns

In design, dark patterns are also referred to as deceptive design patterns. The term deceptive originates from the Latin decipere, meaning to ensnare, take in, fool, cheat. English adopted the term in the 1610s from the French deceptif (late 14th century), which refers to “tending to mislead or give false impression” (Etymonline, 2021). Given the term’s origin, it becomes clear that it does not refer to something admired by users or the design community as a whole. Human-computer interaction (HCI) scholars consider dark patterns unethical design practices because they are developed with a solid understanding of human psychology but without the users’ best interests in mind (Brignull, 2010; Gray et al., 2018; Gordon, 2019; Mathur et al., 2021). In contrast, a product that earnestly applies human psychology would benefit the user.

Even though the broader public criticizes deceptive practices, many businesses use them. The implementation of dark patterns has gained increasing popularity among venture capitalists, tech companies, subscription apps, and e-commerce businesses across the globe. It is tempting for businesses because dark patterns are a quick route to results; however, they are not sustainable. Dark patterns are harmful not only to the design community and the users but also to the businesses themselves (Gray et al., 2018). Although they might be an easy route to boost revenue, this comes at the expense of brand image, reputation, and user experience. A company might cheat its customers once by selling an empty box, but the strategy fails in the long run. Sooner or later, users realize that they have been misled and abandon the product out of distrust and frustration. Additionally, most companies are not aware of the negative long-term effects of dark patterns, mainly because there are no comprehensive published studies yet. This is largely because dark patterns are considered a relatively new phenomenon.

Dark Patterns and Their History
The use of design tools to alter customers’ or users’ behavior is not a new concept; it predates the digital age. The retail industry, for example, has a long history of deceptive and manipulative practices that range on a spectrum from normalized to illegal. Some of these techniques, such as psychological pricing, are normalized. Psychological pricing entails setting a price slightly below a round number so that consumers underestimate the cost (Fig. 02). This is perfectly legal and combines theories from behavioral economics, cognitive science, and psychology.

Figure 02: Pricing Retail (Source: Adapted from thegrocer.com)
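To show how mechanical this technique is, the charm-pricing rule above can be sketched in a couple of lines of code (a minimal illustration; the function name and the one-cent offset are my own choices, not taken from any cited source):

```python
def charm_price(round_price: int) -> float:
    """Given the round-number price a seller has in mind (e.g., $5),
    return the 'charm' price one cent below it (e.g., $4.99).

    Shoppers tend to anchor on the leftmost digit, so 4.99 reads as
    'four-something' and the cost is underestimated.
    """
    return round(round_price - 0.01, 2)

print(charm_price(5))   # 4.99
print(charm_price(20))  # 19.99
```

The trick carries no information at all: the buyer saves one cent, yet the perceived difference is a full dollar.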

Dark patterns also operate on a thin line between legal and illegal. As a result, they have attracted serious attention beyond the design community. Recently, regulatory and legislative forces in the U.S. and the EU have become eager to take action against dark patterns. California just passed the California Privacy Rights Act (CPRA) to regulate dark patterns. Even though this is a step in the right direction, some experts argue that policymakers are still struggling to counter them effectively (Mathur et al., 2021). The main argument is that dark patterns are under-researched and thereby under-conceptualized. Consequently, dark patterns are hard to grasp, and actions at the governmental level have remained limited in their effectiveness.

Behavioral Design — Altering User Behavior Through Technology
The use of psychological and social science theories to mislead users is also a well-known practice outside the design discipline. In economics, for example, it is referred to as “digital nudging,” defined as an approach based on insights from behavioral economics to change users’ decisions in digital environments (Schneider et al., 2018). Interestingly, influencing user behavior through digital interfaces is portrayed positively in most of the business and economics journals reviewed for this article. They frame it as value creation and claim that people actually desire not to be in control (Thaler & Sunstein, 2006; Mele et al., 2021).

Framing Politics

In order to draw parallels with Winner’s framework, some clarifications on the definition of politics are necessary. In “Do Artifacts Have Politics?”, Winner stresses the need to look at objects also in terms of their political structures. He claims that objects can embody some forms of authority and power while suppressing others. He illustrates this point with the overpass bridge Robert Moses built on Long Island, New York. Moses designed the bridge so that buses could not pass under it, which meant that people from poorer social classes, mostly people of color and other minorities at the time, could not access his public park. In other words, this artifact displays power relations that discriminate against the poor and Black population of New York. The politics of the artifact are quite literally set in stone.

In light of this, politics is understood here as the structures that shape freedom, power, authority, order, equality, and inequality. Within this paper, politics does not refer to policy and lawmaking. When political relations are incorporated into systems, some groups end up systematically disadvantaged because those systems were designed to do exactly that (D’Ignazio & Klein, 2020; Shaw & Meehan, 2013).

Technology and Ethics
With great power comes great responsibility, and hence the role of ethics becomes relevant. To understand the relationship between dark patterns, morality, and their impact on our society, we must first understand the human relationship with technology at a fundamental level. In his book Living with Complexity (2011), Don Norman refers to technology as the application of knowledge to the practical aims of human life or to the change and manipulation of the human environment (p. 5). This suggests that technology can be used to influence people’s behavior in favorable as well as unfavorable ways. Dark patterns fall into the second category because of their manipulative component. Why might this be problematic? Over decades, scholars from various fields have studied what it means to manipulate someone. In design especially, there is a thin line between persuading and deceiving the user. Scholars conclude that manipulation occurs when there is a discrepancy between one’s intent and one’s action (Kasten, 1980). When this discrepancy appears, the user has been misled. Thus, dark patterns shape UX practice in a deeply troubling and unethical way.

“There are professions more harmful than industrial design, but only a very few of them. Advertising design, in persuading people to buy things they don’t need, with money they don’t have, to impress others who don’t care, is probably the phoniest in existence today. Industrial design, by concocting the tawdry idiocies hawked by advertisers, comes a close second.” — Victor Papanek (1972)

Methodology

The paper is grounded as an analytical essay based on academic literature, such as scholarly published and peer-reviewed articles. The topic is examined from an academic point of view to gain a broader understanding of how design scholars are already addressing the issue and how it relates to the politics embedded in UX practices. This perspective may offer interesting insights, as many scholars have pointed out that dark patterns are a phenomenon that first appeared in industry, and academia is lagging behind (Gray et al., 2018; Mathur et al., 2021).

The strategy to identify the politics of dark patterns is based on an analysis of how academic literature describes dark patterns and compares the findings to anchor points adapted from Winner’s theory. The theoretical approach is divided into three layers:

Layer 1: Types of dark patterns adapted from darkpatterns.org
Layer 2: Academic description allocated to type of dark patterns
Layer 3: Langdon Winner’s theory “Do Artifacts Have Politics” (1980)

By mirroring the layers described above, the politics of dark patterns can be derived. This will be further illustrated in the form of a case study based on sharing personal data while accessing a website. Finally, I will suggest ways toward more ethical and sustainable UX design practices.

Limitations

Within the scope of this paper, the actual intention of the designer who crafted a dark pattern is not being considered.

Findings

Yes, dark patterns do have politics. These deceptive user interfaces can be seen as digital artifacts that embody inherently political forms of order, authority, and power while limiting users’ freedom of choice. Misleading users into making unintended and possibly harmful choices is deeply rooted in political structures that systematically reinforce power dynamics. However, this answer alone is not fully satisfying. Within this post, I have decided to further illustrate the politics of dark patterns with the example of Privacy Zuckering. Privacy Zuckering, named after Facebook CEO Mark Zuckerberg, is especially dishonest: it misleads you into publicly sharing more information about yourself than you really intended to. It aims to trick the user into accepting privacy settings that grant access to the user’s private information when entering a website or digital service. This scenario is particularly complex because it takes place at the intersection of design, privacy, security, law, and policy.

Privacy Zuckering
As society spends more time online, dark patterns become more present, and individuals should therefore become increasingly aware and conscious of internet privacy (Cranor et al., 2006). However, most individuals have little expertise or knowledge about the real risk of sharing personal information in a digital environment. One reason might be that most people find learning about privacy or reading a website’s privacy policy time-consuming, boring, or too complex. Besides the lack of knowledge, there are other reasons why users get trapped into disclosing more personal information than needed. For example, users might have the illusion of being in control of the digital interface. In that case, deceptive design patterns are most likely not questioned or even noticed in the moment. After being deceived, users might blame themselves for not paying enough attention to the interface or not having read the text properly (Willis, 2020). Moreover, speed is considered a crucial factor in why people get trapped by dark patterns (Schüll, 2014).

Figure 03: Privacy Zuckering (Source: Adapted from darkpatterns.org)

In academia, the design techniques used to craft Privacy Zuckering are described as “Nagging” (Gordon, 2019). Nagging includes attempts to mislead the user into an action that is deemed important for the business but not for the user. Nagging patterns cannot be dismissed or turned off, often making them a repeated nuisance until the user opts in or commits to the action desired by the business. Another important term for this pattern is “Forced Action.” It describes situations where users are required to perform certain actions to access basic functionality that should normally be accessible at any stage. In the case of privacy, this includes, for example, providing more data than the user intends or needs to. The acceptance button is designed in such a way that the user feels an urgency to accept in order to access the website. Through color, hierarchy, scale, and complex wording that create confusion, it appears easier or even necessary to simply press the accept button when entering a website. However, most people do not know that many websites can be used without accepting these privacy settings. As explained earlier, this is a serious issue discussed not only in the design community but also in other disciplines.
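To make the imbalance concrete, the Forced Action pattern can be modeled as a simple interaction-cost comparison (a hypothetical sketch; the labels, styling notes, and click counts are illustrative assumptions, not measurements of any real website):

```python
from dataclasses import dataclass

@dataclass
class ConsentOption:
    label: str
    visual_weight: str    # how prominently the button is styled
    clicks_required: int  # interaction cost to complete this path

# The business-preferred path: one prominent, high-contrast click.
accept_all = ConsentOption("Accept all", "large, high-contrast", 1)

# The privacy-preserving path: visually muted and buried in sub-menus.
# Six clicks is an illustrative guess at a typical settings flow.
reject_all = ConsentOption("Manage settings", "small, low-contrast", 6)

def friction_ratio(preferred: ConsentOption, deterred: ConsentOption) -> float:
    """How many times more effort the deterred choice costs than the preferred one."""
    return deterred.clicks_required / preferred.clicks_required

print(friction_ratio(accept_all, reject_all))  # 6.0
```

Under these assumed numbers, refusing costs six times the effort of accepting; it is this built-in asymmetry, rather than any single visual trick, that makes Forced Action effective.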

Many companies employ dark patterns to obtain data and use it themselves or sell it to third parties. Companies taking advantage of their customers’ lack of awareness and knowledge to obtain private information for their own profit is incredibly insidious and dishonest, and it displays politics in many ways, especially when private data goes beyond its original use and is sold to other stakeholders (e.g., advertisers). The impact of dark patterns is thus the result of an infrastructure of politically motivated consequences. In other words, they are purposefully designed to achieve a specific effect: in this case, to mislead the user into accepting privacy settings in order to access their data.

In reference to Winner’s theory, dark patterns embody politics in two ways. First, they are inherently autocratic, clearly benefiting from an oppressive power dynamic: they intentionally mislead users and hide significant information from them at the cost of their privacy and security. Second, the deliberate intention behind dark patterns supports political structures by disregarding the impact on the social system engaging with them. This is especially present when a user does not even recognize that they are engaging with a dark pattern. In this case, an information imbalance is exploited, which in turn reinforces a power dynamic. Finally, some dark patterns are particularly dark and untruthful in restricting the user’s freedom by not offering a “no” option.

Figure 04: Privacy Zuckering (Source: Adapted from darkpatterns.org)

Discussion

In order to answer the overarching research question, multiple themes have been derived. However, the following three were the most prominent in defining what the politics of dark patterns are.

First, dark patterns are designed to achieve a specific social effect. They represent the business perspective, which most likely works to the advantage of a capitalist system rather than serving the needs of the people interacting with them. More specifically, dark patterns put the benefit of a business or other stakeholders over the people directly affected by them. In doing so, they are short-term oriented, with monetary gain as the main driver.

Second, it seems that dark patterns are intentionally designed to have a worse effect on users with low internet skills than on those with high internet skills. For example, uninformed internet users are likely to be more vulnerable to privacy threats. In addition, people from lower socioeconomic groups are potentially more disadvantaged than people with higher levels of education. Companies thus exploit users’ lack of knowledge to create an information imbalance. This can result in serious consequences for the people who interact with these artifacts; for example, they are enticed to spend money or disclose personal information.

Considering this, my fundamental critique of dark patterns concerns not only the design or construction of the patterns themselves but also the impact those interfaces have on people. This aspect seems to receive too little attention compared to the business intentions behind employing dark patterns. When users are purposefully deceived into certain actions, they can lose time, money, and privacy.

Design Education
As a designer, I believe we have a certain responsibility to restore the balance that has been upset by digital technology. Thus, I propose actions that are heavily grounded in educational interventions, because in order to respond to current cultural, technological, and economic changes in the wake of global change, the next generation has to learn design practices that foster equality rather than inequality. In particular, designers themselves have to become more reflective about the impact of their actions on users and the broader social system. In addition, designers rarely work alone; they mostly work for a client or an employer. The designer’s role as a responsible executor might therefore have the power to foster ethical design choices as opposed to dark patterns. Design programs should provide the necessary knowledge of educational theories and methods for ethical design practices (Frascara & Noel, 2012; Meyer & Norman, 2020).

To conclude, I truly believe that dark patterns must become part of the curriculum in design programs. However, it will not be enough to only raise awareness of their existence. Students should learn how to apply specific frameworks or methods in order to generate reflective and socially sensitive design interventions. This aligns with other design scholars who have suggested that promoting more sensitive design practice in the context of technology and its relationship to society may hold promise (Albrechtslund, 2006; Gray, 2018). One example of how this can be achieved is the integration of morally sensitive values throughout the design process — from beginning to end.

Values
Often we refer to ethics as a rational system for determining whether something is right or wrong. However, this sometimes remains unclear, especially when multiple stakeholders are involved (e.g., company, user, advertiser). According to Fansher et al. (2018, p. 2), one of the most ambitious and promising approaches to ethical design practice is Value Sensitive Design (VSD). VSD is described as “a theoretically grounded approach to the design of technology that accounts for human values in a principled and comprehensive manner throughout the design process” (Friedman et al., 2001, p. 1). Even though VSD might be a promising way forward, in my opinion it fails to emphasize the relationship between the designer’s intention and the eventual use of the technology. Hence, the following values are partly adapted from VSD and partly my own.

Short-Term vs. Long-Term Focus: Does your design choice aim for a sustainable relationship with the user?

Reflect upon your design choices: How present are your own assumptions in the choices you made? What leads us into unethical design choices is often our own biases and assumptions. Especially when there is no real contact with the user, it is easy to assume how an artifact might be used.

Consider the impact for all the involved stakeholders: Who does your design benefit?

Satisfaction: How pleasant is it for the user to engage with the design?

Transparency: Can the user understand information in a clear and transparent manner?

Repeat this process.

Nevertheless, it might remain a challenge to achieve ethical design in HCI holistically, as it combines principles from both theory and practice. This is a major barrier: the gap between research and practice is considered a serious problem due to insufficient mobilization of knowledge. For example, in 2011 only 7% of HCI papers were published to support design practice; most of the literature centers on research (Colusso et al., 2019). This perspective underscores the need to emphasize design education even more: a design education that teaches the next generation of designers such values as part of their training — and not by publishing yet another theoretical paper.

Conclusion

This paper aimed to build on Langdon Winner’s theory in “Do Artifacts Have Politics?” and describe the politics of dark patterns. Dark patterns have been analyzed as digital artifacts with inherent political structures of power, authority, and order that intentionally disadvantage some social groups over others. The power of design has been misused to restrict the user’s self-efficacy, which is particularly unethical. Dark patterns were designed neither with the best interests of the people engaging with them in mind nor with consideration of the consequences.

As technologies continue to shape individuals’ behavior and social systems more broadly, it is sadly fascinating how relevant Winner’s article still is. All in all, we should pay close attention to the technical objects surrounding us and to their meaning. Finally, I believe that digital artifacts can also be used to empower technology and the people engaging with it equally, rather than letting one exert “power over” the other.

Bibliography


Albrechtslund, A. (2006). Ethics and technology design. Ethics and Information Technology, 9(1), 63–72. https://doi.org/10.1007/s10676-006-9129-8

Berdichevsky, D., & Neuenschwander, E. (1999). Toward an ethics of persuasive technology. Communications of the ACM, 42(5), 51–58. https://doi.org/10.1145/301353.301410

Büchi, M., Just, N., & Latzer, M. (2016). Caring is not enough: The importance of internet skills for online privacy protection. Information, Communication & Society, 20(8), 1261–1278. https://doi.org/10.1080/1369118x.2016.1229001

Colusso, L., Jones, R., Munson, S. A., & Hsieh, G. (2019). A translational science model for HCI. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3290605.3300231

Cranor, L. F., Guduru, P., & Arjula, M. (2006). User interfaces for privacy agents. ACM Transactions on Computer-Human Interaction, 13(2), 135–178. https://doi.org/10.1145/1165734.1165735

Deceptive (adj.). (n.d.). In Online Etymology Dictionary. Retrieved December 1, 2021, from https://www.etymonline.com/word/deceptive

D’Ignazio, C., & Klein, L. F. (2020). Data feminism. The MIT Press.

Fansher, M., Chivukula, S. S., & Gray, C. M. (2018). #darkpatterns. Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3170427.3188553

Frascara, J., & Noel, G. (2012). What is missing in design education today? Visible Language, 46(1–2), 36–37.

Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018). The dark (patterns) side of UX design. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3173574.3174108

Kasten, V. (1980). Manipulation and teaching. Journal of Philosophy of Education, 14(1), 53–62. https://doi.org/10.1111/j.1467-9752.1980.tb00539.x

Lukoff, K., Hiniker, A., Gray, C. M., Mathur, A., & Chivukula, S. S. (2021). What can chi do about dark patterns? Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3411763.3441360

Mathur, A., Kshirsagar, M., & Mayer, J. (2021). What makes a dark pattern… dark? Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3411764.3445610

Meyer, M. W., & Norman, D. (2020). Changing Design Education for the 21st Century. She Ji: The Journal of Design, Economics, and Innovation, 6(1), 13–49. https://doi.org/10.1016/j.sheji.2019.12.002

Mirsch, T., Jung, R., Rieder, A., & Lehrer, C. (2018). Mit digital nudging Nutzererlebnisse Verbessern und den Unternehmenserfolg Steigern. Controlling, 30(5), 12–18. https://doi.org/10.15358/0935-0381-2018-5-12

Norman, D. A. (2011). Living with complexity. MIT Press.

Schüll, N. (2014). Addiction by design: Machine Gambling in Las Vegas. Princeton University Press.

Schüll, N. D. (2016). Data for Life: Wearable Technology and the design of self-care. BioSocieties, 11(3), 317–333. https://doi.org/10.1057/biosoc.2015.47

Shaw, I. G., & Meehan, K. (2013). Force-full: Power, politics and object-oriented philosophy. Area, 45(2), 216–222. https://doi.org/10.1111/area.12023

Sunstein, C. R., & Thaler, R. H. (2006). Preferences, paternalism, and liberty. Preferences and Well-Being, 233–264. https://doi.org/10.1017/cbo9780511599743.011

Willis, L. E. (2020, September 21). Deception by design. SSRN. Retrieved December 1, 2021, from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3694575

Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121–136. http://www.jstor.org/stable/20024652
