Consent in the Age of Ambient Computing

Hessie Jones
Published in beacontrustnetwork
6 min read · Sep 7, 2021

Author: Becca Vanneman

Today’s technology was made to extract as much value from users as possible. How can we start to build a virtual environment and economy where users’ needs are placed first?

It’s Monday morning. Your diabetes test results just came in from the doctor. You’re nervous, so you don’t immediately open the email. As you inhale your first cup of coffee, you see the notification — an ad for insulin. Seems strange, so you ignore it and start reading the news. Your timeline is filled with articles about keeping your blood sugar in check, new treatments for type 2 diabetes, how to lose weight. How did advertisers know before you even opened the email? Who else might know? Could this raise your insurance costs? What else could happen?

Before the rise of Google and other tech giants, the principles guiding our economy were generally aligned with the consumer: identify a need, create a product to fill that need, sell said product. The rise of the internet and the discovery of data as the “next oil” upended this more symbiotic order in favor of a gold rush to extract as much data as possible from consumers — data to be sold to advertisers and political candidates, or fed into algorithms that predict and modify future decisions. Now the internet has become one giant surveillance tool sold to us under the guise of enhanced convenience. As Shoshana Zuboff writes in The Age of Surveillance Capitalism, “Users [are] no longer ends in themselves but rather became the means to others’ ends.” [1]

What Is The Price of Free?
There’s no doubt technology has made our lives easier in many ways: boosting productivity, connecting us with people around the world, spreading knowledge. It has become integral to participation in today’s society. But innovation that violates our privacy takes one step forward and two steps back; technology can progress without forcing us to give up our freedoms.

Google Glass had the potential to be a useful technology that augmented reality for tech-forward users, but it ultimately turned out to be a surveillance tool used to gather data from users in even the most intimate settings. As soon as consumers recognized this, the product was discarded. Despite this breach of consumer trust, Google then partnered with Levi’s to create an “interactive denim” jacket that tracks your movements and other biological data in order to do things like control music and headphones. Facebook’s new wristband, designed to tap into your nervous system, foretells the further collection of biological data. Even your car could be spying on you, and that data includes texts, photos, calls, social media, and more. These are all signs of an economic order that does not have us, regular people, at its core.

Why is the collection of this data relevant to the functioning of these products? How does this type of invasive technology improve our lives? Though it has the potential, we are not reaping the full benefits of these technologies, because they weren’t designed with our needs at the center. We’re given free tools, resources, and information that add immediate, tangible benefits to our daily lives, but we don’t understand the backend: the data we generate using these products is a resource that can be used to manipulate our decision-making, nudging us toward choices we wouldn’t make ourselves. This can be as innocent as buying some tchotchke online or as malicious as being convinced to vote for a candidate who doesn’t share your interests. We become tools, means to an end that seeks to control, cajole, and categorize with ever more ease. The effects are dire: increased bias, feedback loops that penalize the most marginalized, and a future that looks all too Orwellian — all for the sake of higher profit margins.

Bullied Into Agreement

Right now the tech world is modeled on a Hobbesian approach to governance: users relinquish certain rights and freedoms to a benevolent sovereign who, in turn, provides services to them. The American Revolution, however, was based on the principles of John Locke, who argued that government should rest on the “consent of the governed.” In his Second Treatise of Government, Locke asserted that government is legitimate only when it satisfies the fundamental needs of the community; a government that violates the trust of its people loses their consent. Big tech breaches our trust by exploiting our need for connection and access to information, our trust in their systems, and our psychology to get us addicted. It’s time we hold them to account.

The fundamental problem with this market is that ignorance and uninformed “consent” are at the heart of the data economy’s new logic. Though we click “agree” at the bottom of privacy policies and other agreements, legal experts call these “contracts of adhesion,” because they impose take-it-or-leave-it conditions that stick to us with no opportunity to negotiate. Once we agree to the terms of service, we can’t go back [2]. Worse, on most sites, simply browsing constitutes acceptance of whatever terms they decide to apply. Further, these terms can be altered at any time, without our knowledge or consent. Tech companies are bullying us into submission without regard for our wants or needs.

How We Get to a Future of Transparency and Informed Consent

These days it’s impossible to live without Google, Facebook, and many others, which hold monopolies over our data and our lives. Companies and schools require you to use the Google ecosystem. Interest groups, companies looking to hire, and others leverage Facebook to communicate. Like it or not, we have become citizens of the internet — or, as Rebecca MacKinnon puts it in her 2012 book Consent of the Networked, “netizens.” Accordingly, we should know who is violating our rights.

The first step is transparency: if we can understand how companies use our data, we can make a more meaningful decision about whether to use a product. Reading every privacy policy you encounter in a year would take 76 work days (roughly 25 full 24-hour days), and a company often gives your data to multiple third parties, each with separate policies governing their use of it [3]. Once you enter the Nest ecosystem of connected devices and apps, understanding how Google Nest uses your data requires reading nearly one thousand contracts; if you opt out, the product declares its functionality and security “deeply compromised” and no longer lets you install updates, leaving you vulnerable to cyberattack or frozen pipes [4]. Wouldn’t it be nice if there were another way?

By refocusing on people and creating human-centric products, we can realign companies with our values and begin to get on a better track. So, how do we do that? One example is BEACON’s new product Pulse, which helps people find companies with overlapping values by showing end-users, in an easily readable format, where their data goes and how it is used. This puts the consumer and the company back into conversation: both understand their relationship, and the user can make informed decisions.

Progress and technology can exist without the permissionless collection of our data. If we can imagine a world where privacy is a given, where we can use technology without sacrificing our right to confidentiality, we can build it. Tech companies cannot be left to govern themselves; through their constant violations of our freedoms and rights, they continue to demonstrate that their first priority is profit, not people. It starts with transparency, meaningful consent, and human-centric design.

We deserve better. We deserve to be treated as ends not means. Humans, after all, compose society, not technologies.

  1. Shoshana Zuboff, The Age of Surveillance Capitalism (New York: Profile Books, 2019), 88.
  2. Shoshana Zuboff, The Age of Surveillance Capitalism (New York: Profile Books, 2019), 48–49.
  3. Alexis C. Madrigal, “Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days,” The Atlantic, March 1, 2012, https://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-in-a-year-would-take-76-work-days/253851/.
  4. Shoshana Zuboff, “Surveillance Capitalism and the Challenge of Collective Action,” New Labor Forum 28, no. 1 (January 2019): 10–29, https://doi.org/10.1177/1095796018819461.

This article is part of BEACON Newsletter. If you would like to subscribe, please Sign up.


Advocating for human-centred & fair distribution of #AI and #DataPrivacy — Author/Writer, Forbes, Co-Founder MyData Canada, PIISA.org, Women in AI Ethics