Privacy in a Virtual World
An Interview with Crystal Nwaneri, explorer of the boundaries between law, intellectual property, and high-tech public policy.
“Where Is the Future?” is a series of interviews with industry leaders considering the potential and complexity of technology on the horizon.
Virtual reality offers great potential for entertainment and social media, but the trade-offs might include loss of privacy and the risk of physical harm. Crystal Nwaneri raised these concerns and more in her paper “Ready Lawyer One: Legal Issues in the Innovation of Virtual Reality,” authored when she was a student at Harvard Law School and published in the Harvard Journal of Law and Technology. Nwaneri became interested in VR while working at the Virtual Human Interaction Lab as an undergraduate at Stanford University. In our conversation, she offered examples of several issues affecting users that the VR industry has yet to fully address.
Should we be worried about privacy because major companies like Google and Facebook are involved in VR? Or is privacy an issue with the technology itself?
Crystal Nwaneri: I look at it as a general issue of the technology itself. VR companies collect data on our physical movement—how you move through space. A mobile VR device, like a Samsung Gear, is going to be a lot more limited because it can only track head position with the accelerometer in your phone. But more complex systems, like the Vive or Oculus Rift, not only track your position but also use a larger setup with cameras placed around the room. There are controllers now that can follow your hands. They’re collecting unique data that they weren’t able to collect before.
Data is gold. Information on how you move might reflect some type of health condition. A person who just wants to experience VR might not imagine insurance companies [accessing that data.] Is that to say that’s actually happening now? I highly doubt it. You can look up VR companies’ affiliates online, including those they expressly say they are sharing your data with. [But] we should think now about what data is being collected so we can be conscious further down the line. Especially if the industry grows as much as people say it will.
I hadn’t thought about how health problems could be detected through motion itself. Are there any countermeasures or ways to protect this data while enjoying VR?
I would assume not, because the way VR works is by tracking your position and having a visual display accommodate that in order to make you feel like you are in some other world. I don’t know if it’s even feasible to think of a time where a VR company would not want to collect any data on how their hardware is used in games or in environments.
Is this complementary to privacy issues with other technology, or is it unique to VR?
I think it’s more complementary. It’s about painting a picture of a person rather than what a single piece of data can do to anybody. People were really excited about wearables, [but when using these devices] we give away data about where we walk, our location, how often we’re walking, possibly entering more information into the app ourselves, like how much water we drank today and whether we exercised. All this information goes to Fitbit, and then possibly to whoever purchases that information. [This is an example to compare to] as people are trying to figure out ways to use VR in more occupational, medical, and similar types of spaces.
Another point you made was about the physical risks of VR. Could you talk about examples of people who might be harmed by using the technology?
The users are blinding themselves to the outside world. What type of chaos comes from that? Oculus specifically has [an illustration in its guide] showing it being used in an empty room. But who has random empty rooms to set this up in? They suggest what would be a safer environment, but also a fictional circumstance.
Another layer of physical harm is that users might freak out from what they’re seeing. Like horror [experienced in VR]: It seems very visceral, and people might act unexpectedly. They might run. We don’t know how being in these immersive environments will impact someone’s psyche. They’re having physical reactions. Their heart rate is increasing. They might panic.
Have you come across any VR companies actively addressing issues of privacy and the physical health of the users?
There has not been significant enough physical harm for companies to have done more than publish general safety guides. They all have some type of safety guide, whether it says to sit down in a chair, clear the space around you, or avoid entering an environment that will trigger any type of anxiety or panic.
Are there other future challenges related to VR?
VR and gender. I’ve done a lot of research on online harassment over the past year. If there is wide adoption and we get more applications that are like social networks, what does harassment look like at that point? Will it be worse because it’s more visual? Or does the increased presence in this 3D environment mean that people will actually treat other people more respectfully? Optimistically, I would like to think the latter would happen. But I don’t know if I really believe that. People’s avatars can look like anything. It’s bad enough that people are affected by things tweeted at them, let alone something that somebody might say or do in an immersive 3D environment.