This is a seven-part miniseries I’m doing on privacy and unauthorized surveillance in the digital age. Check out links to the other posts at the bottom.
So far in this series, we’ve learned that TV manufacturers can track everything you look at on the screen and that Wi-Fi systems in your home can read your lips and monitor your heart rate. One thing they can’t do is read your emotional state. But if you use virtual reality, clever coders will be able to do that soon enough. And they won’t stop there. Armed with a data set of billions of human interaction cases to build from, they will fine-tune VR experiences to influence and manipulate our feelings and judgments like nothing we’ve ever seen in history.
I read this fascinating article by Joshua Kopstein, The Dark Side of VR, in The Intercept. I’m going to quote from it extensively. Of course, people have been hyping virtual reality for decades, and the hype was empty. But now Facebook has invested billions into the Oculus VR system, and Google is making VR available to the masses with Cardboard. We may be reaching a tipping point. VR will allow for all kinds of new experiences for entertainment and education. It will also be used by companies and governments to learn more about us. Kopstein writes:
As the tech industry continues to build VR’s social future, the very systems that enable immersive experiences are already establishing new forms of shockingly intimate surveillance. Once they are in place, researchers warn, the psychological aspects of digital embodiment — combined with the troves of data that consumer VR products can freely mine from our bodies, like head movements and facial expressions — will give corporations and governments unprecedented insight and power over our emotions and physical behavior.
“The power with VR,” writes Kopstein, “is that sensors pick up all these other signals from users — from where they look, to how they turn their head, to how they squint their eyes.” You might have a VR headset on to look at what’s on your screen, but the headset could also be looking at you. Are your pupils dilated? What were you looking at when that happened? Did you jerk your head back all of a sudden in surprise? It can all be known.
To give one example, “Yotta Technologies, a VR company based in Baton Rouge, Louisiana, claims its platform can detect a user’s emotional state using an array of sensors mounted to a VR headset, reading microexpressions by tracking eye and muscle movements in the face.”
The power of big data is clear in advertising. It allows marketers to analyze which messages provoke the most response with consumers. They can A/B test different landing pages and Google Adwords until they find which precise message most frequently leads users to respond. Programmers can use this data not just to give you what you want, but to make you more likely to feel a certain way.
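To make the A/B-testing idea concrete, here is a minimal sketch of how a marketer might decide which of two landing pages “wins.” The numbers and function name are hypothetical; this is just the standard two-proportion z-test, not any particular ad platform’s implementation:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?

    conv_a / conv_b: number of users who responded (e.g. clicked)
    n_a / n_b: number of users shown each variant
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pool the two samples to estimate the shared conversion rate
    # under the null hypothesis that A and B perform identically.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical campaign: 2,400 impressions per variant.
z = ab_test_z(120, 2400, 165, 2400)
# A |z| above about 1.96 is significant at the 95% level,
# so variant B would be declared the better-performing message.
```

Run at scale across thousands of message variants, this kind of comparison is how marketers converge on the precise wording that most reliably moves people.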
Then there’s Facebook, whose data scientists in 2012 conducted an infamous study titled “Experimental evidence of massive-scale emotional contagion through social networks,” in which they secretly modified users’ news feeds to include more positive or negative content and thereby shifted the emotional tone of those users’ own posts.
Nudging into belief
As much as we’d like to think we are completely rational beings, it isn’t so. Behavioral economists like Richard Thaler and psychologists like Amos Tversky and Daniel Kahneman have proven otherwise. They’ve found that our brains make systematic errors of judgment by making decisions not based on logic or statistical evidence, but based on a variety of biases, heuristics, and environmental factors. In other words, we’re far more susceptible than we might believe to being influenced by things other than “the facts.”
In a 2014 paper, Dublin City University researchers proposed that AI-controlled avatars could be used to manipulate people’s impressions and judgments. By designing the VR avatars in certain ways, they might “nudge” users into accepting certain ideas or views. Imagine the subtle implications: “An avatar might respond with a smile if asked about one political or religious idea, and frown when discussing another.”
Malcolm Gladwell shared a chilling example of our susceptibility to believe things for illogical physiological reasons in The Tipping Point.
“Here is another example of the subtleties of persuasion. A large group of students were recruited for what they were told was a market research study by a company making high-tech headphones. They were each given a headset and told that the company wanted to test to see how well they worked when the listener was in motion — dancing up and down, say, or moving his or her head. All of the students listened to songs by Linda Ronstadt and the Eagles, and then heard a radio editorial arguing that tuition at their university should be raised from its present level of $587 to $750. A third were told that while they listened to the taped radio editorial they should nod their heads vigorously up and down. The next third were told to shake their heads from side to side. The final third were the control group. They were told to keep their heads still. When they were finished, all the students were given a short questionnaire, asking them questions about the quality of the songs and the effect of the shaking. Slipped in at the end was the question the experimenters really wanted an answer to: ‘What do you feel would be an appropriate dollar amount for undergraduate tuition per year?’
The answers to that question are just as difficult to believe…The students who kept their heads still were unmoved by the editorial. The tuition amount that they guessed was appropriate was $582 — or just about where tuition was already. Those who shook their heads from side to side as they listened to the editorial — even though they thought they were simply testing headset quality — disagreed strongly with the proposed increase. They wanted tuition to fall on average to $467 a year. Those who were told to nod their heads up and down, meanwhile, found the editorial very persuasive. They wanted tuition to rise, on average, to $646. The simple act of moving their heads up and down, ostensibly for another reason entirely — was sufficient to cause them to recommend a policy that would take money out of their own pockets. Somehow nodding, in the end, mattered…” — Gladwell, The Tipping Point
Imagine viewing a political advertisement in VR that caused you to look up and down in order to see it, in a sort of nodding movement. I don’t want to believe I’d be that easy to manipulate, but the research in these areas is pretty compelling. Maybe I won’t be as susceptible, now that I’m aware of it. On the other hand, maybe influence takes less action than I think. Maybe, through analyzing billions of data points, some data scientist will find that the degree of nodding needed to affect my thinking is minuscule. Maybe it will be so slight that I’ll do it without even noticing.
All of this has implications that are good, bad, and ugly. The good is that with all this data, creators will be able to fine-tune entertainment and education for us. They’ll maximize our attention and interest like we’ve never experienced before, because they’ll be able to monitor our every physical reaction and design experiences to hit the high points. The bad is that we might be manipulated for our consumer dollars using the exact same methods. The ugly is that a powerful governmental organization could use these same tools to influence and control people in ways we’ve never imagined in the real world.
Read widely. Read wisely.
“The Dark Side of Virtual Reality: Virtual Reality Allows the Most Detailed, Intimate Digital Surveillance Yet” by Joshua Kopstein in The Intercept. (11-minute read)
Like this post? Give it a heart below!
For more good reads on culture and society, sign up for my weekly email.