Training Artificial Intelligence in VR
Virtual reality is fast becoming a tool of great interest for technology giants, who are using this advancing technology to train AI and robots for real-world applications.
OpenAI, for example, a company backed by Elon Musk, has been teaching robots to learn tasks carried out by a human guide within a VR setting. This lets the system rehearse the many possible moves or decisions it could make and learn which are most likely to lead to success. You can imagine this as an infinite number of alternate universes that form training scenarios for the AI to learn from.
The artificial intelligence is exposed to the task (in one example, stacking colored blocks) by a human counterpart who wears a VR headset and performs the task for the robot to watch and then imitate.
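OpenAI's actual system is far more sophisticated, but the core idea of learning from a human demonstration can be sketched with behavioral cloning: record (state, action) pairs while the human performs the task, then fit a policy that maps states to the demonstrator's actions. Everything below is illustrative, not OpenAI's or VU's implementation; the "expert" is a toy rule that steps a gripper halfway toward a block.

```python
import numpy as np

# Toy behavioral cloning: learn to imitate a demonstrator's actions.
# The "demonstration" is a set of (state, action) pairs recorded while
# a human performs the task (e.g. moving a gripper toward a block).
rng = np.random.default_rng(0)

# Hypothetical demonstrator: always step a fixed fraction toward the block.
states = rng.uniform(-1.0, 1.0, size=(500, 2))  # gripper (x, y) relative to block
demo_actions = -0.5 * states                    # expert moves halfway toward it

# Linear policy fitted by least squares: action ~ states @ W
W, *_ = np.linalg.lstsq(states, demo_actions, rcond=None)

# The learned policy now reproduces the expert's behavior on unseen states.
test_state = np.array([[0.8, -0.4]])
print(test_state @ W)  # close to [[-0.4, 0.2]]
```

A real system would replace the linear fit with a deep network and the toy rule with motion captured from the VR headset and controllers, but the training signal is the same: match the human's actions.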
Virtual reality has opened doors to this kind of advancement in deep learning, allowing researchers and developers to carry out experiments on a larger scale than was previously possible, without burning resources on physical tests, and in a fraction of the time. The VR simulation mimics the real world in terms of physics and gravity, so behaviors that prove functional and valid in VR can carry over to the real world. An AI trained entirely in VR can then be transplanted into a machine and put to use in a self-driving car or a manufacturing robot, for example.
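One widely used technique for making this simulation-to-reality transfer robust (the article does not say which method any particular team uses) is domain randomization: the simulator's physical parameters are jittered every training episode, so the policy cannot overfit to one exact simulation. A minimal sketch, with parameter names and ranges chosen purely for illustration:

```python
import random

def sample_physics_params(rng):
    """Draw a fresh set of simulator parameters for one training episode."""
    return {
        "gravity": rng.uniform(9.6, 10.0),     # m/s^2, jittered around 9.81
        "friction": rng.uniform(0.5, 1.2),     # surface friction coefficient
        "object_mass": rng.uniform(0.1, 0.5),  # kg, mass of the block
        "sensor_noise": rng.uniform(0.0, 0.02) # simulated camera/sensor noise
    }

rng = random.Random(42)
for episode in range(3):
    params = sample_physics_params(rng)
    # A real pipeline would reset the simulator with these parameters
    # and run one training episode before sampling again.
    print(episode, params)
```

A policy that succeeds across all these slightly different "universes" is more likely to also succeed in the one universe whose parameters we cannot choose: the real world.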
The intersection of VR and AI is already becoming quite rich in that regard, but it is worth noting that, while the method is fundamentally different, VU will also deploy AI to learn about people and their behaviors.
We are not placing artificial intelligence inside VR to participate in the same situations as a user or to play alongside them; rather, we focus on the player at the center of that effort and deploy AI to measure their bodily feedback. This lets us shape the world and the experience around their responsiveness, or lack thereof, genuinely personalizing any time spent within VU.
Through deploying this kind of ‘engagement monitoring’ AI and by employing the principles of affective computing, we could, for example, look at the postures of 100,000 people who experience a particular event and understand how they react in that situation. This type of data can measure how different contexts affect a user’s response to an event, such as experiencing it solo or with other people. VU will soon be interpreting this data to enhance aspects of the simulation based on unconscious user feedback.
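To make the idea concrete, here is a minimal sketch of how posture-derived engagement scores might be aggregated by social context. The record schema, field names, and numbers are assumptions for illustration, not VU's actual data model:

```python
from collections import defaultdict
from statistics import mean

# Each record: one user's posture-derived engagement score for an in-world
# event, tagged with the social context it was experienced in.
records = [
    {"user": 1, "context": "solo",  "engagement": 0.62},
    {"user": 2, "context": "solo",  "engagement": 0.58},
    {"user": 3, "context": "group", "engagement": 0.81},
    {"user": 4, "context": "group", "engagement": 0.77},
]

# Group the scores by context.
by_context = defaultdict(list)
for r in records:
    by_context[r["context"]].append(r["engagement"])

# Mean engagement per context shows how company affects the response.
summary = {ctx: round(mean(scores), 2) for ctx, scores in by_context.items()}
print(summary)  # {'solo': 0.6, 'group': 0.79}
```

At the scale of 100,000 users the same grouping logic would run over a stream or database rather than an in-memory list, but the analytical question is identical: does the context shift the distribution of responses?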
A useful byproduct of collecting this sort of human behavioral data is the ability to apply it in real-world scenarios, in ways we would never have anticipated. We may gain new insights into how humans learn new information, how interpersonal relationships form, and the roles that body language and speech play in these interactions.
This is especially exciting because we recognize we are on the precipice of a new age of growth and adoption for VR. If the VU project hits its registered-user targets by this time next year, and we can monitor the enjoyment and responses of a large number of users, we may well be able to use this experience-driven data to enhance real-world systems and to shape the ideal model for human interaction in other areas of virtual and augmented reality.
Ciaran Foley is CEO of Ukledo and Immersive Entertainment, Inc., a Southern California virtual reality software company developing a new virtual engagement platform called Virtual Universe (VU).
Learn more about Virtual Universe and the VU token by visiting our website and signing up for email updates, visiting our Github, following us on Twitter, Facebook, LinkedIn, and Instagram, or joining the discussion on Telegram and Discord.