Quitting Facebook & Google: Why Exit Option Democracy is the Worst Kind of Democracy

Janet Vertesi on what it means to have a principled rejection of technology

U.S. Navy ejection seat test, China Lake, California, 1967

Just how easy is it to opt out of corporate data collection? Are you just a luddite if you choose to opt out? And what are the costs?

Speaking Tuesday at Princeton University’s Center for Information Technology Policy was Janet Vertesi (@cyberlyra), who specializes in the sociology of science, knowledge, and technology. Janet has spent the past seven years studying several NASA spacecraft teams as an ethnographer. At Princeton, she teaches classes on the Sociology of Technology; Sociology of Science; Work, Technology and Organizations; and Human-Computer Interaction.

Exit Option Democracy and Voting with your Feet

The basic assumption of markets is that people have choices. This idea that “you can just vote with your feet” is called “exit option democracy” in organizational sociology (Weeks, 2004). Opt-out democracy is not really much of a democracy, says Janet. She should know: she’s been opting out of tech products for years.

Today, Janet tells us about four personal experiments to opt out of technology companies: (a) platform avoidance, (b) infrastructural avoidance, (c) hardware experiments, and (d) the idea of digital homesteading.

Janet reminds us that she doesn’t hate tech companies. She first encountered computers as a six-year-old, ran a BBS for her high school, and publishes computer science research, some of which she did while at Yahoo Research. Her perspective on technology is shaped by her work in science and technology studies, which looks at how technologies come about and the invisible values they carry with them.

As a scholar, Janet uses methods of “infrastructural inversion” to reveal invisible, sunken infrastructures and their categorical assumptions, bringing them into view for analysis (Bowker 1994; Bowker and Star 1999). Janet also uses methods of Critical Technical Practice, which deploy critical theory, analysis, and reflection on the unconscious assumptions in the process of design and of building alternative systems (Agre, 1992; Sengers et al 2005). Throughout this work, when asking questions about the values of technology, Janet tries to ask, “Could it have been otherwise?” (Pinch & Bijker 1984).

Janet describes her opt-out experiments as infrastructural inversions that show how hard it is for anyone to opt out.

Janet Vertesi describes the values behind her opt-out project

Opt Out #1: Going Google-Free

What does it take to get off a platform? In 2012, Google decided to aggregate and correlate data across its platform sites, including YouTube, Maps, and Mail. It used this data to pigeonhole users for the purposes of advertising sales (“a better user experience”).

As a sociologist, Janet was incensed. First, we know from sociology that people need to have the right to behave differently in different settings (fundamental to Goffman and 60 years of sociology). For Google to create a single identity was also risky. At the time, Arvind Narayanan, Ed Felten, and others were showing that combining datasets like this could de-anonymize people’s identities.

At the time, Janet was in a long-term relationship with Google. She was one of the first people to sign up for Gmail. The company knew she was engaged before her colleagues did. All of her coordinating was on Google; all of her relationships were on Google.

How do you break up with a tech company? The first step is to get your stuff back. So Janet downloaded all her data. Just like someone in a relationship, the service tried to emotionally manipulate her to stay: “Are you sure you want to do this?” “Our servers are feeling unloved!” It was all emotional blackmail. Janet recently went off Facebook, and they use the same kind of emotional messages – breaking up is hard to do.

What can you do once you’ve broken up? Janet needed to find ways to stay out of the relationship.

To keep her data away from Google, Janet practices “data balkanization,” spreading her traces across multiple systems. She’s used DuckDuckGo, sandstorm.io, ResilioSync, and youtube-dl to access key services. She’s used other services occasionally and non-exclusively, alternating with open source alternatives like Etherpad and OpenStreetMap. It’s also important to pay attention to who is talking to whom and sharing data with whom: data balkanization relies on knowing which companies hate each other and who’s about to get in bed with whom.
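One way to picture data balkanization is as a deliberate mapping of service categories to unaffiliated providers, with a check that no parent company ends up covering more than one category. A minimal sketch, assuming an illustrative ownership map (the provider names come from the talk; the helper and the `OWNER` table are hypothetical):

```python
# Hypothetical sketch of "data balkanization": assign each service
# category to a different, unaffiliated provider so no single company
# can correlate traces across categories.

# Example assignments drawn from the talk.
SERVICES = {
    "search": "DuckDuckGo",
    "maps": "OpenStreetMap",
    "docs": "Etherpad",
    "file_sync": "ResilioSync",
}

# Knowing "who is about to get in bed with whom" means tracking
# corporate ownership; this table is purely illustrative.
OWNER = {
    "DuckDuckGo": "DuckDuckGo Inc.",
    "OpenStreetMap": "OSM Foundation",
    "Etherpad": "Etherpad Foundation",
    "ResilioSync": "Resilio Inc.",
}

def is_balkanized(services, owner):
    """Return True if no parent company serves two categories."""
    owners = [owner[provider] for provider in services.values()]
    return len(owners) == len(set(owners))

print(is_balkanized(SERVICES, OWNER))  # True: every category has its own owner
```

If two providers in the map were acquired by the same parent, the check would fail, which is exactly the signal that a category needs to be re-homed.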

Janet also adjusted her life for living without social network software: she created a list of her friends, made sure to call them, made a list of hobbies, and sent her friends a calendar reminder with her birthday.

Finally, Janet decided on the minimum systems she needed and limited her use to just those. For example, she uses Twitter to connect with astronomers, and only for that.

Overall, going Google-free was straightforward, says Janet. You can create a program and stick to it. But platforms will still have your data, and many things become harder, including relationships with family and friends. You will probably be blamed, again and again, for the social inconveniences.

Janet Vertesi describes techniques for avoiding data sharing by stores and websites

Opt out #2: Data Infrastructure

What does it take to avoid the personal data dragnet? Now that tech companies share more and more data with each other, we rarely have the choice to opt out.

Janet decided to prevent the internet from knowing about her children

Around the time Janet became pregnant with her first child, she learned that data from online and offline purchases and browsing were being combined to facilitate targeted marketing. Pregnant women are especially vulnerable: if companies can identify pregnant women early, they can be the first to influence long-term purchasing decisions. So Janet decided to prevent the internet from knowing about her children.

How did Janet try to prevent companies from knowing about her two children? The first step was to browse the internet using Tor, making it harder for companies to track her browsing behavior. Janet and her husband also used cash, gift cards, and Amazon gift cards to make purchases, combining them with Amazon lockers that were disconnected from their identities.

Trying to de-link your identity from data storage has consequences. For example, when Janet and her husband tried to use cash for their purchases, they faced risks of being reported to the authorities for fraud, even though their actions were legal.

Social capital and network effects keep people conscripted online

How can you check that your attempts to stay out of the data infrastructure are working? To find out, Janet’s family created inversion techniques to monitor corporate data collection through the ads they received in the mail.

First, they created special email addresses linked to a series of mailing addresses, with purpose specific credit card numbers from privacy.com, and released those emails only to certain sources. They received only three baby product catalogs during this experiment. In one case, a friend had added their child to an online service when sending an invitation. In another case, an Australian company that sold parenting products moved to San Francisco and started selling their data.
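The inversion technique above amounts to giving each vendor a unique, tagged address and then tracing any unsolicited contact back to whoever leaked or sold it. A minimal sketch using plus-addressing (the vendor names, domain, and helpers are hypothetical, not details from the talk):

```python
# Hypothetical sketch of the family's inversion technique: mint one
# purpose-specific email address per vendor, then trace unexpected
# mail back to the vendor that shared the address.

def tagged_address(vendor, domain="example.org"):
    """Mint a purpose-specific address for a single vendor."""
    return f"shop+{vendor.lower()}@{domain}"

def who_leaked(incoming_address, vendors, domain="example.org"):
    """Given an address that received unsolicited mail, name the vendor."""
    for vendor in vendors:
        if incoming_address == tagged_address(vendor, domain):
            return vendor
    return None  # address wasn't one we handed out

vendors = ["BabyCatalog", "ParentingCo"]
leak_source = who_leaked(tagged_address("ParentingCo"), vendors)
print(leak_source)  # ParentingCo
```

The same idea extends to the purpose-specific credit card numbers from privacy.com: one credential per counterparty means every leak is attributable.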

Opting out of data infrastructure for five years has involved many social consequences. One day, a family member sent a private message on Facebook congratulating her on her upcoming child. Janet immediately deleted all the messages and unfriended the family member. The family member didn’t realize that Facebook reads private messages. Janet has also lost substantial social capital from friendship networks by not exchanging baby news. Janet says that her experiences reveal how social capital and network effects keep people conscripted online.

How easy is it to opt out of data sharing? Several friends have tried this project, and no one has kept it up for more than four or five months.

Opt Out #3: DIY Hardware: Building Your Own Smartphone

Janet tells us that phones are also a leaky source of personal information and data. Phone numbers and other identifiers allow companies and governments to trace calls and locations. Apps and OSes require access to our data. And only a small number of companies control all of this data.

The obvious answer is to build your own smartphone, so Janet used parts from electronics kits to make her own 2G phone. After making the phone, Janet quickly realized that even a privacy-protecting phone can’t connect to the network without identifying its user to companies through the network itself.

Since then, Janet has experimented with phones and OSes that offer more security features or that are simply no longer monetized by companies. For example, Janet started to use Nokia’s MeeGo after the project became defunct. She has also used Sailfish OS on the Sony Xperia X, which has an open hardware design.

Even a privacy-protecting phone can’t connect to the network without identifying the user to companies through the network itself

When working with open hardware phones, Janet has used a range of inversion techniques to reveal the nature of data tracking by trying to avoid it:

  • avoid the two major mobile phone makers
  • install apps via open OSes, using command-line installation or no app store
  • use obsolete devices that companies are no longer monitoring and monetizing
  • accept lower-grade networks in exchange for more options (2G, 3G, non-LTE)
  • keep duplicates and triplicates, because old phones break
  • recognize that avoiding the IMEI/SIM problem is hard; Janet is thinking about launching her own satellite

Opting out in this way also has social consequences. Sometimes the network doesn’t work well, and people can’t always reach you. You also have less access to apps, and degraded service requires adjusting expectations. Janet tells us that no luddite could ever opt out this way; only a dedicated hobbyist could.

Opt Out #4: Digital Homesteading

Is it possible to have a smart home without giving companies direct access to your home? To answer this question, Janet has turned to the history of homesteading to think about a life where people create their own digital infrastructures, independent of companies.

The life of a digital homesteader is in a continuous state of disrepair

What is life on the digital homestead like? Janet’s family has their own media library, a calendar on a server they run, and a baby cam they maintain themselves. All of this infrastructure runs on their own home network, served from a Raspberry Pi that they set up. Janet tells us the techniques of digital homesteading:

  • keep personal data on hardware or services you own or control
  • purchase hardware that can be re-appropriated, programmed, and locally controlled
  • deploy open source equivalents
  • de-consolidate user accounts, and use them to track where your data is going
  • run your own DNS
  • dual or multi-boot computers to access a range of software as needed
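The first technique above, keeping personal data on hardware you control, can be as simple as serving a media folder from a Raspberry Pi to the home LAN. A minimal sketch using only Python’s standard library (the directory path and port are illustrative placeholders, not details from the talk):

```python
# Minimal sketch of a digital-homestead service: share a local media
# directory over the home network using only the standard library.
# The directory and port below are illustrative placeholders.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

def serve_media(directory, port=8000):
    """Serve `directory` read-only over HTTP on the local network."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    server = ThreadingHTTPServer(("0.0.0.0", port), handler)
    return server  # call server.serve_forever() to run

# e.g., on the Pi: serve_media("/home/pi/media").serve_forever()
```

Plain HTTP on a trusted LAN keeps the sketch short; as the talk notes, homesteading like this trades convenience and some information security for control, so anything exposed beyond the home network would need authentication and TLS on top.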

The life of a digital homesteader is in a continuous state of disrepair. If you’re a tinkerer, this is a lot of fun; for essential services, it can also be dangerous. Janet’s family also lost a lot of information security through digital homesteading.

Exit Option Democracy is Not an Option, and Not Democracy

Janet says that these experiments worked because they reveal how companies use data and infrastructure and because they show how hard it is to opt out. As Janet found, opting out introduces constant trade-offs, showing that exit-option democracy is not only not an option, but not democracy.

Janet concludes with further thoughts on the implications for research and design:

  • Security through scale versus security by obscurity is a terrible trade: it leaves out data ownership concerns
  • Studies of the Internet of Things, platforms, and online systems assume we use corporate infrastructures, not DIY options.
  • We need more attention to individualized security and system choice within heterogeneous networks in the context of constant repair: what Janet calls “in the box” security
  • Privacy is not just a set of “who can see what” controls: can we permit users to dictate where and how their data is stored and accessed, not just by users but by machines, algorithms and companies?