The missing ingredient: designing a tech future we’d actually want to live in

Elizabeth M. Renieris
Published in The Startup
6 min read · Nov 29, 2018



Upon reflection, 2018 felt like the year that law and regulation took center stage in the public discourse about technology. With the GDPR taking full effect in Europe and governments around the world scrambling to reconcile their legal frameworks with emerging technologies — including cryptographic tokens, artificial intelligence, and decentralized technologies like blockchain — never have so many non-lawyers been able to cite case law (like the “Howey Test”) or quote regulations (like the GDPR). The prominence of law marked a watershed year in the evolution of our most pervasive and global technology — the Internet.

The foundations of the Internet emerged in the late 1960s and 1970s, with its progenitor network, ARPANET, coming online in 1969 and the TCP/IP protocol suite emerging in 1978 to connect networks of networks. The Internet was (and to this day, is) little more than a technology layer — a global network of “nodes” connected by physical submarine and subterranean cables and wires, routers, and computers (i.e. hardware consisting of microprocessors, RAM, hard drives, etc.). For decades, its functionality was limited to moving packets of data around this network layer.

It wasn’t until the 1990s that Tim Berners-Lee would create an application layer for the Internet (on top of this network layer) in the form of what we now call the “World Wide Web” or “the Web.” The Web ushered in the dot-com bubble and the commercialization of the Internet, as myriad businesses launched websites to sell their conventional products and services to customers online (e.g. Jeff Bezos launched Amazon to sell books in 1994). Thus, “e-commerce” emerged as a first-generation business layer on top of the network technologies. Alongside e-commerce, we also got a communications layer in the form of e-mail, Internet telephony (e.g. Voice-over-IP, or VoIP), file sharing, and (eventually) streaming.

In the 2000s, the business layer of the Web expanded from e-commerce and communications to the commercialization of our online activity and behavior itself, with the emergence of online behavioral advertising and a digital economy fueled by digital advertising (i.e. the “ad tech model” of the Internet). This changing business model led to the emergence of new laws and regulations relating to the Internet, ushering in an era of “cyberlaw” and “Internet technologies law” (or “IT law”), adding a legal layer to the stack. These laws sought to regulate businesses based on an underlying technology that was never designed for such commercial purposes in the first place.

So here we are in 2018 with a tri-fold approach to navigating our common technology stack — the business, legal, and technology layers of the Internet — often referred to as the “BLT approach” (like the sandwich).


But cyberlaw, IT law, and e-commerce-related laws and regulations are proving inadequate for the challenges we face in the modern digital era. This is due to a confluence of factors but, for purposes of this discussion, I’ll zero in on one — namely, that the lines between our offline and online worlds are blurring, which means that manipulating our activity online is increasingly a way of manipulating our lives offline.

We now leave a detailed digital footprint not just when we “surf the Web” online but also through each purchase that we make with our credit and debit cards (or Apple Pay), through Bluetooth- and GPS-enabled smartphones and smart devices, as well as through retail beacons, surveillance cameras, and other devices that track us as we go about our daily lives in the “real world.” Every second of our lives becomes a data point with the potential for concrete and significant real-world implications for, e.g., our ability to get a job, our mortgage or insurance rates, and even our political system and the quality of our democratic institutions. This increasing digitization of our offline selves in the form of “big data” risks extending the broken ad tech model of the Internet to our lives at large.

Even the heightened attention to data protection and privacy laws is inadequate in the face of “big data” — a threat that goes beyond any one individual’s privacy or security. Given the sheer volume and nature of data now available to large corporations motivated by commercial exploitation and shareholder value, we have entered an era of what social scientist Shoshana Zuboff calls “surveillance capitalism,” which consists of four core features:

  1. The drive toward more and more data extraction and analysis.
  2. The development of new contractual forms using computer-monitoring and automation.
  3. The desire to personalize and customize the services offered to users of digital platforms.
  4. The use of the technological infrastructure to carry out continual experiments on its users and consumers.

In the world of traditional research on human subjects, these kinds of activities would be subject to something like an institutional review board (an “IRB”) or an ethical review board (an “ERB”) to ensure that the proposed research methods are ethical. IRBs and ERBs undertake risk-based analyses to protect the rights and welfare of human participants in a given research study and to shield them from physical and psychological harm. They are also often subject to comprehensive and transparent international standards, governance, and oversight.

The problem we face is that despite widespread research and experimentation on human subjects by surveillance capitalists like Google, Amazon, and Facebook, there are no ethical rules or standards that apply to protect us. Ample time and resources are spent perfecting the business models, legal frameworks, and underlying technology solutions, but no one is looking out for the potentially harmful impacts on us as human beings first. This is where the BLT approach falls short. We need something more. We need what I call the “BLT-E approach” that incorporates a fourth dimension of ethics into the way we think about our most impactful technologies.


The risks do not end at commodification and commercial exploitation. At a recent international conference focused on privacy and data protection in Brussels, Apple CEO Tim Cook cautioned that “our own information, from the everyday to the deeply personal, is being weaponized against us with military efficiency.” While Cook called for GDPR-style legislation in the United States to regulate the use of personal data, his language suggests a conversation that goes far beyond questions of law and policy to questions about the ethics surrounding the use of our personal data. The language of weaponization and militarization harkens back to the laws of war and the rules of ethical conflict.

Lawyers have long had a role to play in this conversation, most prominently as part of the Hague and Geneva Conventions, which aim to establish international standards for humanitarian treatment in wartime. I believe lawyers have a similarly important role to play in the conversation around the ethics of big data. We have to go beyond compliance with the law (and even beyond things like privacy and data protection by design and default) — we need to shape the development of these technologies from an “ethics by design and default” point of view.

As Dr. Ian Malcolm cautioned in Jurassic Park — “Your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.” When faced with an emerging technology or a new use of an existing technology, we must go beyond whether we can (technically or legally) and ask whether we should from an ethical point of view. This ethical dimension is the secret ingredient to designing a technology future that we’d actually want to live in.
