TalTech’s Re:creation VR First Lab: Towards Intelligent Immersive Virtual Environments
Few would deny that virtual reality (VR) has come a long way as a technology: from complicated installations available to a select few research facilities and failed attempts to introduce the technology to the consumer market, to modern VR headsets and coherent real-life applications; from deeply skeptical views to overly generous market predictions. The history of VR has had its fair share of twists and turns.
Today, one might expect virtual reality to provide a seamless immersive experience, transporting the user from his or her physical environment into a virtual world within which anything is possible, similar in this respect to, say, a lucid dream. However, this scenario still belongs to the realm of science fiction. So what are the limitations? What does it take to overcome them? Shouldn't present-day technologies deliver on this front?
These were the questions that the author of the present article asked himself after having his first VR experience in 2014. The questions felt disruptive: it seemed that the time had come to act, to contribute towards an experience that would make it possible to learn, to design, and to create something extraordinary through the VR medium by making the virtual environment (VE) adapt and meaningfully respond to the user's actions.
This was at the end of 2014. TalTech's control systems research laboratory, which the author was a part of, had just acquired several second-generation Oculus Rift Development Kits to explore the possibilities of VR. Following a year of preparations and negotiations, the Re:creation Virtual and Augmented Reality laboratory was established at the beginning of 2016. Thanks to its favorable location in the Mektory international business and innovation centre, the lab was able to quickly join the VR First network and set its course towards excellence in VR research.
The name “Re:creation” derives from the following ideas:
- Our aim is to recreate real-life environments and objects, i.e., to create so-called digital twins, while simultaneously achieving persistent immersion.
- In e-mail exchanges, “Re:” stands for “Reply” or, equivalently, “Response”, i.e., the “answer to”. The very existence of the laboratory is, metaphorically, a response to the emergence of VR and AR technology, while “creation” obviously refers to the act of creation of artificial environments.
- Recreation usually refers to a relaxing and enjoyable time. And we really enjoy our work!
Today, the Re:creation lab is a unit of the Centre for Intelligent Systems, Department of Computer Systems, School of Information Technologies, Tallinn University of Technology.
From Architectural Walkthroughs to System Intelligence
When we started Re:creation, we wanted to probe for ideas to identify the most viable research direction. One thing was obvious — we had to leverage our previous experience and knowledge. We started distributing relevant thesis topics to students who happened to be very interested in this novel technology.
Since 3D modeling is an indispensable skill in developing coherent VR applications, and we lacked experience with it, our first forays into VR involved recreating architecture and natural environments.
Architectural walkthroughs are a popular type of VR application. One of the first applications developed in Re:creation was a walkthrough of the Mektory building, intended for conducting virtual tours.
While developing this project, it became apparent just how important interactions are in VR. After all, to deliver a truly immersive experience, it must be possible to manipulate the virtual environment. Opening doors, moving furniture and various assets, turning lights on and off — all these features were very attractive for the users walking inside Mektory's digital counterpart. This project also gave us the opportunity to observe how users behave in VR, especially in the case of large-scale architecture.
One of the following projects was commissioned from the lab by a company, Swedbank AS. This was a student project whose end result was a game with two stages: a tutorial stage and a game stage. The latter had the player collect Swedbank-related assets under an oak tree in a forest-like environment within a time limit. Apart from offering the VR user a challenge, the game served a very important purpose: it provided first-time users with a learning environment where they could acquire an understanding of the mechanics of VR, including navigation and in-game object interaction. The game was well received by the company, which called it “The Swedbank VR Experience”.
In 2017, the Re:creation lab team members traveled to Munich, Germany, to meet with colleagues from a research group working at the Leibniz Supercomputing Centre. There, we learned about some of the problems the visualization team was facing when working with large datasets in VR. Since the visualizations were based on data downloaded in real time from a database, it was essential to estimate which part of the data would be needed in the immediate future so that network bandwidth requirements could be met. Thus, it was also necessary to predict the VR user's behaviour and actions.
For a moment, we thought back to our initial observations of users in VR. Then came the realization: most of our team members have a machine learning background. So why not automate the task of discovering user behavioural patterns and apply the findings to action estimation in a large-scale visualization environment? A wealth of useful information can be derived this way, and a truly responsive VE can be built on top of that knowledge. Not only that, but advanced interaction mechanisms can also be developed, including adaptive collaborative environments.
This was the point when the main research and development direction of the laboratory was finally identified. It can be summarized as follows.
Re:creation laboratory strives to fully utilize computational intelligence methods to devise a new generation of cyber-physical systems that give rise to an unprecedented level of immersive experiences aimed at exploring complex data visualizations, human-machine, human-system and human-human interactions and collaboration in a highly detailed virtual or augmented artificial environment.
Modeling Human Behaviour: Towards Unprecedented Interactivity with the Virtual World
Modeling the VR user's behaviour, actions, and emotional states is where both the VR medium and machine learning methods come into play. The general idea of the research project can be stated as follows: to develop technological means for accurate mathematical modelling of human behaviour, based on motion and brainwave data collected in the VR environment using computational intelligence methods, and to apply them to the problem of visualizing large datasets in VR or AR environments as well as to the design of adaptive user interfaces. We now focus on the key issues investigated in the context of the project.
- Modeling human motion provides two important advantages: first, accurate real-time tracking of the user in a VR environment provides the user with a seamless immersive experience; second, prediction of the user's behaviour makes it possible to preload the necessary data from a large visualization dataset.
- Developing methods for the user to interact with data based on the behaviour model — novel user interfaces are designed at this stage.
- Creating a collaborative environment where several users can interact with the visualization and among themselves while working on a single problem.
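The prediction-and-preloading idea from the first bullet can be sketched in a few lines. This is a minimal illustration rather than the lab's actual method: it assumes a simple constant-velocity model of head motion and a visualization dataset partitioned into cubic tiles, and all function names and parameters here are hypothetical.

```python
import math

def predict_position(history, steps_ahead=10):
    """Constant-velocity extrapolation: average the per-frame displacement
    over the recorded (x, y, z) samples and project it forward."""
    n = len(history)
    vel = [(history[-1][k] - history[0][k]) / (n - 1) for k in range(3)]
    return [history[-1][k] + vel[k] * steps_ahead for k in range(3)]

def tiles_to_prefetch(predicted_pos, tile_size=5.0, radius=1):
    """Return the grid-tile indices around the predicted position, i.e.
    the chunks of the dataset worth requesting before they are needed."""
    cx, cy, cz = (math.floor(p / tile_size) for p in predicted_pos)
    offsets = range(-radius, radius + 1)
    return {(cx + dx, cy + dy, cz + dz)
            for dx in offsets for dy in offsets for dz in offsets}

# Example: a user walking along +x at 0.1 units per frame;
# the predicted position is roughly (4.9, 0, 0).
track = [(0.1 * i, 0.0, 0.0) for i in range(30)]
pos = predict_position(track, steps_ahead=20)
print(sorted(tiles_to_prefetch(pos)))
```

In a real deployment the linear model would be replaced by a learned behaviour model, but the overall pipeline (predict, then prefetch the surrounding data) stays the same.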
In the long term, this project aims to solve the problem posed at the beginning of this article: to provide the VR user with a truly immersive and interactive virtual world.
In 2019, the project received a grant from the School of Information Technologies of TalTech. The project is spearheaded by the head of the Re:creation VR&AR laboratory, Aleksei Tepljakov, and Ph.D. candidate Ahmet Köse. The project is ongoing, and the first results have already been documented in research papers submitted to relevant conferences and accepted for publication. It is envisioned that, starting in Fall 2019, the project team will be further expanded.
Digital Twins of Control Objects: Academic and Industrial Applications
Nowadays, new facilities for education and professional training are developed taking advantage of the user's ability to interact with virtual environments in a meaningful way. In this sense, introducing VR into university-level control systems courses is a very welcome step. Having visual feedback from the studied control systems, coupled with the ability to interact with them in a natural manner, brings about a huge improvement in the understanding of the underlying theory due to the hands-on nature of the experience.
Since the founding members of Re:creation have a control systems background and teach relevant courses at the university, the emergence of a project allowing more students to gain insight into system dynamics through hands-on experience with simulated control objects was inevitable.
In this project, a VR-based application providing an alternative to a physical control systems laboratory is proposed. We claim that the solution is innovative, low-cost, and efficient, and that it is able to alleviate inconvenient conditions in control systems laboratories by presenting software-based three-dimensional (3D) replicas of physical control objects that are mathematically modelled and can be interacted with by the user in a VR environment. Accurate mathematical models of existing control objects, or of control problems based on real-life objects located in the Alpha Control Systems Research laboratory at Tallinn University of Technology, are developed. The dynamic models are identified, validated, and deployed using the MATLAB/Simulink environment. Next, accurate replicas of the control objects are prepared as 3D models, then optimized and deployed in the immersive environment. The mathematical models are connected to the visualization in the VR environment via a User Datagram Protocol (UDP) socket, which allows efficient real-time communication and thereby also enables seamless user interactions. We refer to this process as creating digital twins of control objects.
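The UDP link between the simulation and the visualization can be illustrated with a short sketch. The packet layout below (four little-endian doubles: a timestamp plus three state variables) is an assumption chosen for illustration; the actual byte layout depends entirely on how the Simulink UDP blocks and the VR client are configured.

```python
import socket
import struct

# Hypothetical packet layout: four little-endian doubles, e.g. a
# timestamp plus three state variables of the simulated control object.
STATE_FORMAT = "<4d"

def pack_state(t, x1, x2, x3):
    """Serialize one simulation step into a 32-byte datagram."""
    return struct.pack(STATE_FORMAT, t, x1, x2, x3)

def unpack_state(datagram):
    """Decode a datagram back into (t, x1, x2, x3) on the VR side."""
    return struct.unpack(STATE_FORMAT, datagram)

if __name__ == "__main__":
    # Loopback demonstration: one socket plays the role of the Simulink
    # model, the other plays the VR visualization client.
    recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv_sock.bind(("127.0.0.1", 0))          # let the OS pick a free port
    addr = recv_sock.getsockname()

    send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_sock.sendto(pack_state(0.01, 1.0, -0.5, 0.25), addr)

    t, x1, x2, x3 = unpack_state(recv_sock.recv(64))
    print(t, x1, x2, x3)
```

UDP is a natural fit here: occasional packet loss only drops one frame of state, while the lack of connection overhead keeps latency low enough for real-time interaction.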
Inducing Synesthesia: A Study of Technologies Supporting Sensory Exchanges
With the following project, we step into a more extravagant research domain. Still, if successful, this project can yield very useful results.
Synesthesia, the experience of one sense modality as another, is an interesting phenomenon that provides many exciting opportunities when applied in a VR context. The ultimate goal of the project is to provide means for inducing voluntary synesthetic experiences through the VR environment. In our earlier work, we reported initial findings related to acoustic localization and sound processing based on prerecorded data. Later, we developed a prototype designed to deliver the synesthetic experience to the listener in real time.
The current achievement of turning 3D sound events into 3D visual signals is one step in a larger programme. The ultimate target is the ability to completely swap senses, i.e., to turn all audio signals audible to an individual human being, from arbitrarily many sources, into video signals and, vice versa, to turn everything that the person can see into a soundscape.
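The core of turning a sound event into a visual signal is a mapping from auditory features to visual ones. The sketch below is purely illustrative, not the lab's actual design: it assumes a sound source has already been localized, and maps direction to the cue's placement, pitch to hue, and loudness to size.

```python
import colorsys
import math

def sound_to_visual(azimuth_deg, elevation_deg, frequency_hz, loudness):
    """Map one localized sound event to a visual cue in the virtual
    environment. All mapping choices here are illustrative assumptions."""
    # Log-scale pitch over the nominal audible range (20 Hz - 20 kHz),
    # mapped onto part of the hue wheel from red (low) to violet (high).
    norm = (math.log10(frequency_hz) - math.log10(20.0)) / 3.0
    norm = max(0.0, min(1.0, norm))
    r, g, b = colorsys.hsv_to_rgb(0.8 * norm, 1.0, 1.0)
    return {
        "azimuth": azimuth_deg,      # where to render the cue
        "elevation": elevation_deg,
        "color": (round(r, 3), round(g, 3), round(b, 3)),
        "size": loudness,            # louder sounds -> larger cues
    }

# A low 20 Hz rumble to the user's right renders as a red cue.
print(sound_to_visual(30.0, 0.0, 20.0, 1.0))
```

A real prototype would of course run this per audio frame against a localization pipeline, and the mapping itself (which features drive which visual dimensions) is exactly the kind of design question the project investigates.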
We consider the ideal of sense swapping a research challenge that drives the development of increasingly capable tools for sense-crossing and sense-mapping. In particular, if integrated into augmented reality scenarios, real-life applications for such technologies can be found in numerous domains: from architecture and education, through supportive technology for the elderly and people with special needs, therapy, and rehabilitation, to rescue systems and all kinds of mission-control systems.
While the team of the Re:creation VR&AR lab is working on several projects, the ultimate point of convergence is the merging of advanced modelling and machine learning methods with the VR/AR medium. It is also expected that, apart from delivering on the research front, the activities of Re:creation will result in coherent industrial applications, including but not limited to:
- Intelligent immersive virtual environments including learning environments providing the user with advanced adaptive user interfaces for either accomplishing a certain task or for an improved learning experience;
- A framework for implementing digital twins of control objects in an industrial context and at scale, along with the means to interact with them;
- A framework for sense swapping in a VR or AR environment.
We are also looking forward to collaboration opportunities! If any of the projects described above are of interest to the reader, we are happy to receive comments and suggestions via email at firstname.lastname@example.org.