Written by Bhavik Nagda and Max Langenkamp
Just prior to Facebook’s nadir in early 2018 with the Cambridge Analytica scandal, former executive Sean Parker lambasted the company for its ethical perversion. Facebook, he explained, was designed to exploit “a vulnerability in human psychology.” Features such as the “like” button were built to give users “a little dopamine hit” to keep their eyes glued to the screen. “The inventors [and] creators understood this consciously, and we did it anyway.”
Facebook, now mired in scandal and calumny, is perhaps the largest of the many online players that play dirty on the dynamic world wide web. The economic incentives are such that many of these companies will go to great lengths to draw users into spending excessive time on their platforms. Facebook does this in many ways; the Messenger app, as of mid-2019, will prompt you repeatedly to turn on your notifications.
So, what’s wrong with this page?
This demand to notify may be so common that it’s difficult to immediately observe the perversion taking place. Consider the user who does not wish to have a banner pop up every time one of her contacts messages her. It is impossible for her to escape this nagging: as of early 2019, the ‘Remind me later’ option leaves one quarter of the app’s screen permanently dedicated to demanding that she turn her notifications on.
These subtle and pernicious design choices have the effect of nudging you toward acting in Facebook’s best interest by spending more of your time on the platform (the sad symptom of what has aptly been termed ‘the attention economy’), whether your second-order self, the one who wants to spend less time on Facebook, desires it or not.
This happens so often that in 2010, UX designer Harry Brignull built a website and coined a term to describe it: dark patterns.
Dark patterns are sneaky online interfaces that are crafted to trick users into taking actions that — directly or indirectly — generate more revenue. These practices, originally a weapon of marketers and advertisers, have been widely adopted in the tech industry as well.
Dark patterns encompass a whole host of deceptive online practices, ranging from forced continuity to friend spamming. When you sign up for a free trial with Amazon Prime and they charge you the moment the trial finishes, that’s a dark pattern. When LinkedIn sends invitations to all your friends as soon as you sign up, they’re guilty of a dark pattern too. ProPublica has done excellent journalism exposing how TurboTax hid its free-filing page from search engines, a particularly malicious instance of a dark pattern.
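The ‘forced continuity’ pattern can be made concrete with a small sketch. The class and function names below are hypothetical, invented purely to illustrate the difference between a billing flow that silently converts a free trial into a paid plan and one that asks for fresh consent first; no real service’s code is implied.

```python
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class Subscription:
    trial_start: date
    trial_days: int = 30
    consented_to_billing: bool = False  # explicit opt-in given after a reminder

    def trial_expired(self, today: date) -> bool:
        return today >= self.trial_start + timedelta(days=self.trial_days)


# Forced continuity (the dark pattern): charge the card the moment the
# trial lapses, with no reminder and no fresh consent.
def dark_should_charge(sub: Subscription, today: date) -> bool:
    return sub.trial_expired(today)


# A fairer alternative: only charge if the user explicitly re-consented
# after being told the trial was ending.
def fair_should_charge(sub: Subscription, today: date) -> bool:
    return sub.trial_expired(today) and sub.consented_to_billing


sub = Subscription(trial_start=date(2019, 1, 1))
day_after_trial = date(2019, 2, 1)
print(dark_should_charge(sub, day_after_trial))  # True: billed without fresh consent
print(fair_should_charge(sub, day_after_trial))  # False: waits for an opt-in
```

The two functions differ by a single boolean, which is the point: the dark pattern is not a technical feat but a deliberate omission of a consent check.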
Of course, the DETOUR Act won’t address all of these situations, but it certainly sends a signal that the government can prosecute and fine companies for such transgressions.
It’s bipartisan legislation? Are pigs flying?
It’s clear that dark patterns are a detriment to society, and the government is indeed taking action. In early April, Senators Mark Warner (D-VA) and Deb Fischer (R-NE), in an act bridging the two parties, co-sponsored the DETOUR Act to curb the online practice of deceptive user interfaces, known as “dark patterns,” that trick users into revealing personal data. Among other stipulations, the bill creates a governance and standards body within the FTC to address issues of user interface design and privacy for web service companies. It would also prohibit interface design that leads to compulsive usage by those under the age of thirteen. As of the writing of this article, the sponsors have drummed up considerable support for the DETOUR Act from both the tech and policy communities, including Consumer Reports, Common Sense, and Mozilla.
Critics of the act argue that it gives sweeping power to the FTC and would perhaps “make nearly all large web sites presumptively illegal.” Such undue powers may threaten the system of checks and balances in place, further exacerbate regulatory capture and the revolving door, and strain the FTC’s existing resources.
At the core of the bill lies the following statement: “It shall be unlawful for any large online operator to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.” While certainly comprehensive, the abstract wording raises a number of questions: how does one assess purpose in creating a user interface? A user interface is, in a sense, an inherently manipulated design medium; how does a regulatory agency assess the “substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice”? What counts as substantial, and what is allowed?
The act also uses abstract language to define ‘large web sites,’ and fails to define terms such as ‘authenticated user.’ This ambiguity leaves the FTC and, ultimately, the courts to decide whether the conditions for litigation are met. However, as we have observed, judges aren’t often the best technologists.
Each of these nitpicky problems with DETOUR involves the implementation of the act, and solutions for such issues may exist. In a recent post, we explained OpenAI’s tech governance model, which introduces a private market for technology regulators; implementing the DETOUR Act may well be an excellent sandbox for OpenAI’s model. The DETOUR Act indeed seems to spell out a series of regulatory goals without implementation details. Perhaps private regulators could fill the void, bridging the federal government and the tech world. Check out Jack Clark and Gillian Hadfield’s paper here.
Indeed, the act is a harbinger of regulation to come, a sign that certain members of Congress — despite their apparent division — are willing to regulate big tech in the public interest. DETOUR has been introduced at a precarious time for the industry, a time when presidential candidates are proposing policies to “Break up Big Tech,” when government agencies are collaborating to pursue antitrust action against tech companies, and when tech CEOs are being summoned by the dozens to Congress from their thrones atop Mt. Silicon Valley.
A report by the law firm Davis Wright Tremaine perhaps puts it most succinctly: “[DETOUR] is significant because it … would directly require industry and regulators — notably, the FTC — to look beneath the surface of ‘consent’ by users…” DETOUR hasn’t been passed and likely won’t be, but it sends yet another important signal to the tech world and to the American constituency that change is coming. “The implications for the information-based, advertising-based online ecosystem,” the report concludes, “could be profound.”