It was recently revealed that even if you turn off Google’s Location History option in your phone settings, the company still collects your location data “through services like Google Maps, weather updates, and browser searches.” This comes amid rising public awareness of mass data collection programs like Facebook’s, which enabled the firm Cambridge Analytica to unethically use the data of unsuspecting platform users. In short: lots of companies are collecting, analyzing, and selling our data pretty much all the time.
Amidst a heap of journalistic pieces on modern privacy and data collection, an important point remains largely unstated: the “issue” isn’t necessarily with data tracking itself, but with data tracking by default.
Human evolution has produced a number of mental shortcuts, termed cognitive biases, that allow us to make decisions faster than we otherwise would. Generally speaking, these shortcuts have helped our species survive: acting instinctively likely helped us (and still helps us) avoid a range of threats. But that doesn’t mean cognitive biases come without drawbacks.
As Richard Thaler and Cass Sunstein examined in their groundbreaking work “Nudge,” defaults (the preselected choice in any set of options) pose a particular vulnerability in our cognitive decision-making. Because humans are cognitively lazy, we’re unlikely to put in the extra effort to change a selection unless it matters a great deal. Thus, everyone from companies to politicians can set default options that implicitly or explicitly guide our actions. In other words, others can exploit our cognitive biases to steer us, more often than not, toward a desired outcome.
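The power of defaults can be made concrete with a back-of-the-envelope sketch. The numbers below are hypothetical, as is the `tracked_share` helper; the point is only that when a small, fixed fraction of users ever changes a setting, the default alone determines the outcome for everyone else:

```python
# Hypothetical illustration: how a default setting dominates outcomes
# when only a small fraction of users ever bother to change it.

def tracked_share(default_on: bool, change_rate: float, users: int = 1_000_000) -> float:
    """Return the fraction of users whose data ends up tracked.

    default_on:  whether tracking is enabled by default.
    change_rate: fraction of users who flip the setting away from the default.
    """
    changed = int(users * change_rate)      # users who opt out (or opt in)
    unchanged = users - changed             # users who keep the default
    tracked = unchanged if default_on else changed
    return tracked / users

# With the same 5% of users willing to touch the setting, flipping
# the default swings the tracked population from 95% to 5%.
print(tracked_share(default_on=True, change_rate=0.05))   # 0.95
print(tracked_share(default_on=False, change_rate=0.05))  # 0.05
```

Nothing about the users changes between the two calls; only the default does, which is precisely the leverage Thaler and Sunstein describe.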
This is one of the most pressing yet overlooked issues at the heart of mass data collection: almost all modern data tracking, from Google’s collection of real-time geolocations to Netflix’s monitoring of our watch history to an operating system’s collection of usage statistics, is enabled by default. This means we’re more likely to leave data tracking activated, as we so often do on our smartphones and laptops, because changing it takes time and effort. And if we don’t know the tracking is there, as many of us do not, we won’t navigate to the settings in the first place. Data tracking continues, and the data trackers achieve their goal.
A 2008 piece in Harvard Business Review is one of the few articles that directly connects defaults to mass data collection, in this case documenting Facebook’s 2007 program that displayed users’ purchases by default. “Unless they actively opted out,” the authors wrote, each customer’s behavior would be “posted where all their friends could see them.” The company was using defaults, exploiting flaws in human decision-making, to engender a desired response. Recent events show that such practices only continue today, as companies treat our clicking of “Agree” as informed consent to their data policies.
To call data collection “complex” would hardly capture the scale of the activity and the nuance of its challenges. In a world run by machine learning and other automated decision-making tools, which dictate everything from home purchases to prison sentences, data tracking is like kerosene on the fire of a company’s profits. The very incentive structure of many corporations means that drives for growth will not permit the scaling back of data tracking — at least not without external influences.
Amidst conversations about Google, Facebook, and other firms, we must recognize the broader trend of data tracking by default. Little will change until this default is switched, and little will change until consumers are aware of these practices — and their often default nature — in the first place. Data tracking is enough of an issue without companies quasi-hacking our decision-making processes too, so it’s time we recognized this problem.