Does Anybody Actually Care About Privacy?
Our lives are now open source.
The privacy press is exploding again today. If you installed Pokémon Go on your iOS device, Niantic Labs probably has access to your entire Google account — emails, searches, contacts, etc.
So what?
No really, so what?
Effectively, privacy is a red herring. People will cede nearly all of their data and all of their expectations of privacy for a discount, a service, or really any perceived benefit at all.
Consider that:
- Facebook shrugged off massive privacy concerns around its Messenger app by deferring to the default permissions settings on Android and iOS — as of April 2016, the service had more than 900 million users.
- In a global poll, Intel Security found that fully 70% of people would trade the personal data generated by their smart home appliances for a coupon or discount.
- IBM found that the share of consumers willing to share GPS data with retailers nearly doubled from 2013 to 2014, reaching 36%. Fully 38% of consumers would give retailers their mobile number in order to receive text messages, and 32% would share their social media handles.
- Uber’s privacy policy allows the company to record the location of customers’ devices even when the app is not in use.
- In an unprecedented consumer-focused move, the massive data broker Acxiom launched a website in 2013 that lets people review and correct the data stored about them, in a bid for more relevant personalized advertising — of course, just to sign in you have to enter part of your Social Security number. (No matter, they already had it.) After most people who tried the service discovered their profiles were way off, critics charged that this much-lauded move toward “transparency” was really a way for Acxiom to crowdsource a free data-cleaning service.
- The Future of Privacy Forum offers DoNotTrack.us, an opt-in service that theoretically asks websites and advertisers: “Don’t collect and store any information about me without my explicit permission.” But the two associations that control 90% of online advertisers interpret “do not track” as “do not deliver targeted ads” — they still collect, store, and monetize your data.
This isn’t to say we don’t all have vices, habits, and historical tidbits we’d prefer to keep to ourselves, but most of us also instinctively recognize that:
more data made public = less room for stigmatization of our secrets
more data made public = more necessity for machine analysis (and therefore less human judgment)
more data made public = stronger algorithms for better services
more data made public = more “free services”
more data made public = decreased visibility for any one individual’s data
Ben Franklin famously cautioned that those who would trade liberty for security deserve neither. But that’s not the only trade on offer today. Security is a slippery thing (privacy even more so), but convenience, personalization, and friction-free, “costless” services are tangible and concrete.
As the principled Wikileaks/Anonymous/libertarian/Snowden crowd knows, it can be dangerous and countercultural to pursue and protect privacy. And while Sweden has a church and a political party dedicated to piracy, a whole generation in the US has been scarred by horror stories of unwitting grandmothers sued because a friend needed to download Oochie Wallie Wallie. If a life observed is not a free life, then liberty has been eroded indeed. In a complex world, the path of least resistance is simply not to care.
But the reality is more nuanced than mere apathy, though there is some of that: the definition of “privacy” itself is shifting. Open source becomes a more valuable metaphor when you consider other adages: hiding in plain sight; vulnerability is strength; safety through transparency. In a society where classic privacy seems inaccessible short of a bunker and a tinfoil hat, why not try radical openness? Could there be a sort of anonymity — or at least security — that comes with the conscious construction of community, as already seen in some fringes of our digital society?
Pragmatically, most of us know that the usernames, passwords, and even the encryption we rely upon to provide a comforting facade of privacy are useless in the face of Chinese supercomputers and black hat hackers. Confronted with a specific breach of our own personal data, we suddenly care about privacy a whole lot. But we also know we’re helpless.
Sure, phrase your survey correctly and you could get 90% of people to disagree that “it’s fair for companies to collect information about me without my knowledge in exchange for a discount.” But when it comes down to it, we are willing to share most of our information. The same survey found that 84% of people strongly or somewhat agreed that they wanted to have control over what marketers could learn about them, but 65% agreed that they had come to accept that they had little control over it. And the evidence of that resignation is everywhere.
According to Stanford’s Paul Saffo as quoted in Pew’s 2014 report on The Future of Privacy: “While Americans claim to care about privacy, they care even more about convenience. Americans have happily sacrificed their privacy over the last several decades, and will continue to do so, even as they complain. Privacy has already shifted from being a right to a good that is purchased.”
But privacy is not for sale — at least not yet. What is on offer in the near-term is a smarter world: a world where our machines know things about us and can make suggestions to optimize our lives. A world where more openness does sometimes mean more security. A world where we can collaborate to provide better services, knowing that in all of our individual uniqueness, we are not alone. A world where the launch of a mobile game brings people together, privacy be damned.
