Forget elections — the data sharing that keeps me up at night is much more fundamental
Several weeks on from the initial story breaking, we are still feeling the fallout of revelations that data gathered from unwitting Facebook users was used to build a system allegedly designed to target individual American voters. As well as taking billions of dollars from Facebook’s stock price and perhaps a few thousand users from its network, the revelations seem to have awoken public consciousness to the extraordinary power wielded by those in Menlo Park, Mountain View and elsewhere on the Californian-Cascadian coastline. Then again, if public recollection of the similarly radioactive disclosures from Edward Snowden in 2013 is any guide, the half-life of this fallout may be depressingly short.
But regardless of how long it lasts, it is worth delving into what has made this most recent scandal so toxic in the short term — and chief amongst these factors, I think, is its proximity to politics. The ongoing investigation into Russian interference in the presidential election has provided a consistent drumbeat of revelations into quite how far the Russians seem to have reached into American heartlands — not, mind you, into voting machines, which are too unevenly dispersed to be effectively hacked, but instead into the minds of those pulling the levers, the voters themselves.
Democratic elections are rightly still held to be a sacred thing, and various laws seek to preclude undue influence over how people make up their minds, from Britain’s purdah, to Italy’s par condicio, to the dying embers of campaign finance law in the US. Perhaps the most important outcome of the Cambridge Analytica scandal and the past eighteen months more broadly is the forceful restatement of the principle that understanding of how people vote and why should be left to the psephologists.
Whether and how this principle can be remade in practice for our interconnected world is still an open question, and we need to get to work on answering it. But at the same time, the public spotlight currently trained on this scandal offers an opportunity to make hay while it shines, by showcasing some of the myriad other incursions into personal privacy that sustain our information-industrial complex.
Three revelations in the weeks following the Cambridge Analytica story serve to highlight this broader threat. As it happens, each one relates to behaviours even more fundamental to human life than free and fair elections. These are eating, sleeping, and having sex — each of which is considered a “physiological” need and sits at the foundational base of Abraham Maslow’s famous hierarchy of human needs. It transpires from these revelations that intimate details about our personal characteristics, in even these most fundamental of realms, have been captured, but not contained, potentially exposing sensitive private information.
Take sex. Earlier this week Grindr, a dating app popular in the gay community, confirmed reports that it has routinely been sharing its users’ HIV statuses with two analytics companies to optimise its product. Though the specific data concerning users’ statuses and their last test date was anonymised when shared, research has suggested that because identifiable information such as a user’s location and email address was also shared, sometimes in plain-text form, HIV statuses could be inferred by malicious actors.
Or take our basic need for food. Just yesterday, news broke that customers of Panera Bread, the American bakery-cafe chain, may have had their personal data including names, email addresses, physical addresses and even the last four digits of their credit card numbers exposed online. Customers belonging to the chain’s loyalty program and those who ordered food through its website were among those potentially affected. Though the information leaked was likely less sensitive than Grindr users’ health data, it was much more easily findable, with initial analysis suggesting that “records could be indexed and crawled by automated tools with very little effort”.
But these aren’t the only basic physiological needs currently being captured as data and potentially exposed online. Data about our sleep, too, is seemingly at risk. I’m a loyal user of the popular sleep-tracking app Sleep Cycle, which uses your phone’s built-in microphone to assess your sleep patterns and wake you up in your “lightest sleep phase”. When activating the app recently I was surprised to see the name of my housemate flash up on the screen next to a “link” icon. Contrary to the app’s help pages, which suggest that the feature is designed only for those who sleep in the same room, in practice the app seems to consider sharing a WiFi network a proxy for sharing a bed. (For the record, our rooms are at either end and on different floors of our two-storey flat.)
This “feature” has no off switch, meaning the only way out is to switch off WiFi on your device or not use the app. But given that sleep is by definition an unconscious activity, it’s certainly possible that many users — especially those who happen to go to bed early — have no idea that the app will by design broadcast the fact that they are asleep to people in their vicinity.
At first glance this feature may seem entirely innocuous, and in their response to my emailed complaint, Sleep Cycle’s team didn’t see any particular urgency in addressing this concern. (It’s been 30 days since I reported the issue, and the feature remains unchanged.) But delve a little deeper and it becomes clear that this sort of feature is ripe for real abuse. It’s not common practice, for example, to change your WiFi password when someone moves out of a shared house — or a co-habiting relationship, for that matter — but given that anyone who has access to your WiFi may now have access to live data about when you’re asleep, perhaps it should be.
These are the sorts of security and privacy risks that, well, keep me up at night. One doesn’t need to doubt the sincerity of Grindr’s desire to optimise its app or Sleep Cycle’s interest in experimenting with new features to improve their users’ experience. But even if their motives are pure, these apps are responsible — legally and ethically — for the data they collect, and for the way they use and share it. The Cambridge Analytica saga has rightly woken us up to the danger of our elections being sabotaged. But sensitive information about human behaviours as fundamental as eating, sleeping and having sex is also at risk of exposure; Facebook’s facepalm may be just the start.