The Accountability Dilemma in VR
A recent visit to the London Design Museum raised a number of questions for me about the societal implications of virtual reality (VR).
Specifically, it was an exhibit in the Design section of the Designer, Maker, User exhibition that got me thinking about the challenges that accompany the advancement of VR.
On display were two highly successful designs created during and just after the Second World War: an AK-47 assault rifle (a.k.a. Kalashnikov) and a leg splint. Each was extremely well designed in its own way — both performed their functions very effectively, were easy to use and applied available materials in an ingenious way. However, one was made for the purpose of healing while the other was designed to maim and kill.
The point of juxtaposing these items, one the cause of injury and the other its remedy, is to make us think about what ‘good’ design actually means and whether there is a moral or ethical component to design. In other words, can something be considered ‘good’ design even if it is created for the purpose of harming or even killing others? Or should design only be considered ‘good’ if it improves the world? We are then confronted with the moral question: should a designer be held accountable for the way in which their products are used, or is this ultimately the responsibility of the user?
As someone who is extremely passionate about new technologies and particularly excited about the potential of virtual reality, this really made me think. I believe we are not far off from a time when VR will become a critical part of our everyday lives, from helping us learn new things and train for jobs to discovering places, experiencing adventures, meeting new people, and undergoing medical therapies and treatments. This platform, a.k.a. the Final Platform, has so much potential to improve our lives. But the flip side is that if VR can be used for a positive or ‘good’ purpose, it can also be used for a negative or ‘bad’ one, even if unintentionally.
Let’s take the games industry as an example. This sector has always been an early adopter of new technologies as die-hard gamers are always looking for the most sophisticated and realistic gaming experience and are willing to pay for the privilege. We can already find numerous examples of successful VR games that might be considered somewhat violent, such as Pavlov VR and Serious Sam VR. This isn’t surprising — shooter games are extremely popular and giving players the opportunity to shoot in more real-life environments understandably adds to the thrill. And if these games are proving successful, it seems obvious that the VR games developers are on the right track and should continue down this path, making them more and more realistic.
The question is: Do VR developers have a responsibility for how the technology and content they are creating is being used?
Perhaps we should narrow this down further as there are two elements here: the technology and the content.
If we are referring to the actual building of the technology that allows us to experience a virtual world, then I would say that the developers of this technology do not have a moral responsibility for how it’s being used. In my view, that would be like saying that the person who created the motion picture camera is responsible for the advent of pornography.
If, however, we are referring to the actual content being created for use with VR headsets, then this is a different question altogether. So far, there is no proof that violent VR games create violent people — especially as, at this stage, no VR game is graphically powerful enough to make you believe you actually killed someone. However, as Dante Buckley, creator of the popular VR shooter game Onward, points out in an interview for UploadVR, “… when things do start to get more real for a game like Onward, or another first person shooter, there’s going to have to be a responsibility for people to consider.”
But should this responsibility only be considered for VR content that promotes violent or ‘negative’ behaviour?
A few months prior, I had attended a panel session on VR In Real Life at a conference. During the Q&A after the discussion, an audience member asked the panel how they thought violent and pornographic, or ‘bad’, VR content should be monitored and who should be held responsible for ensuring it doesn’t get into the hands of children. The panelists all seemed to agree that it’s really up to parents to monitor their children’s use of VR headsets, just as they monitor their children’s use of TV, computers, games consoles and mobiles. They also agreed that eventually there will be a need for some sort of regulation around VR content, as there is on other content platforms.
It then occurred to me that while there is an obvious concern about how violent VR shooter games can potentially have a negative impact on users (especially children), we will soon be confronting another challenge that is just as worrying, if not more so.
I quickly raised my hand: ‘What about “good” content? I mean, what about content that makes us feel better about ourselves… immersive content that gives physically disabled people the ability to climb mountains and cycle through the countryside, or virtual chat spaces that give socially inept people the chance to live vicariously through avatars who are popular and confident? What if these people prefer their virtual lives and don’t want to leave the virtual world because they no longer know how to — or want to — cope in the real one?’
One of the VC panelists shrugged his shoulders and answered ‘So what? Let them stay in their virtual worlds if they want to! That’s not our problem.’
That response really struck a chord.
While it’s true that it’s still too early to know what the consequences of long-term exposure to VR will be, it’s clear to me that as an industry, this very much is our problem. Of course we can’t be held accountable for all the ways in which people will use the technology, but we can ensure we create the best possible content by 1) working with psychologists, educators, neuroscientists and the medical industry as a whole to research the effects various immersive content will have on people before it goes to market, and 2) developing the expertise (i.e. specialist therapies) that will be needed to deal with the after-effects on people who are immersed in VR content over long periods of time.
Simply washing our hands of any accountability and continuing to develop ever more realistic immersive content for its own sake (i.e. because we can) is not an option; it can lead to harmful outcomes.
I strongly believe that as an industry, and as a society, it is our responsibility to ensure we are building technologies whose main purpose is to serve humanity and improve our lives overall.
I welcome your thoughts on this topic and look forward to reading your comments below.