How might we evaluate dark patterns?

Katie McInnis
3 min read · Oct 15, 2019


Dark patterns, present across platforms and devices, work to undermine consumer choice and autonomy, but we currently have no framework for evaluating them. How might we evaluate these deceptive interfaces to better support consumer empowerment?

Photo courtesy of Baldiri (CC-NC-BY-SA)

Dark patterns are design tools that online products and services use to nudge, manipulate, or push consumers toward actions that benefit the company. Design elements such as hard-to-find buttons and confusing menus are classified as “dark” when they manipulate consumers in ways that seem unfair.

A company could employ dark patterns to encourage consumers to spend more money, share more data, or engage more with the platform, or to discourage them from deleting their accounts or data. Harry Brignull, the designer credited with coining the term, maintains a “Hall of Shame” where people have contributed screenshots from organizations as diverse as PayPal, National Geographic, Quora, and a company that sells first-aid kits.

Retail sites in particular commonly use dark patterns to encourage consumers to spend more. For example, hotel and travel sites use false scarcity claims, and ticket sites use countdown clocks to pressure consumers into deciding quickly, leaving many to overlook, or feel forced to accept, surprise charges added to the ticket price in the final stages of purchase. In addition, a recent investigation from ProPublica found that Intuit, the maker of TurboTax, used deceptive design and misleading advertising to trick lower-income Americans into paying to file their taxes, even though the IRS Free File program protects their right to file for free. As a result, individuals who qualified for free filing may have been charged anywhere from $60 to $120.

But these design patterns are so pervasive that it is hard to find a service that doesn’t nudge consumers to do something that advances the interests of the business more than their own. For instance, Spotify makes all of its users’ listening sessions public by default. If a consumer takes the time to switch to a private session, the setting is non-sticky, meaning so impermanent that the private session lasts only six hours, after which listening becomes public once more.

All of these dark and deceptive patterns make it even harder for groups like Consumer Reports to evaluate products for privacy and security. For example, when we examined smart TVs at the beginning of 2018, several of the TVs allowed consumers to control what kind of information is shared with the manufacturer. However, we had no way of evaluating how difficult it was for consumers to navigate to those choices. The menus were so complicated that we wrote an explainer to help consumers turn off the privacy-invasive features of their TVs.

It is also worth noting that these patterns carry the modifier “dark” for a reason: design nudges are not inherently manipulative. Companies could use nudges and other tools to lead users toward options that are more beneficial to the user and less beneficial to the company, or they could use neutral design patterns that let users choose what they most prefer from the options presented.

Do you have ideas about how Consumer Reports should evaluate dark patterns as part of our privacy and security testing? Please get in touch or comment on our dark patterns evaluation framework on our GitHub. We’ll be sharing more thoughts on dark patterns soon.

> Check out http://lab.cr.org or get in touch with @katielmcinnis to learn more.


Katie McInnis serves as Senior Public Policy Manager US at DuckDuckGo, an internet privacy company.