by University of South Australia, SA Power Networks, Nova Systems, PTC, CSIRO
Shared Sphere is a Mixed Reality (MR) collaboration system enabling immersive 360 panoramic communication between remote users, helping them solve technical problems that cannot be solved by voice or video communication alone. It combines an Augmented Reality (AR) experience for a local worker out in the field with a Virtual Reality (VR) experience for a remote expert. The two users are connected through a live 360 panorama video captured and shared by the local worker.
Using 360 panoramic video conferencing, Shared Sphere supports view independence, which allows the remote expert to look around the local worker's surroundings independently of the worker's orientation. Shared Sphere also incorporates situational awareness cues that let each user see which direction the other person is looking in. The technology also supports intuitive gesture sharing: the remote expert can see the local worker's hand gestures in the live video, while the remote expert's hand gestures are captured and shown as virtual hands to the local worker in AR. Using hand gestures, the remote expert can also point at objects of interest or even draw annotations on real-world objects. The technology can be used for remote assistance in various fields such as inspection, maintenance, training and monitoring.
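To make the view-independence and awareness-cue ideas concrete, here is a minimal sketch of the underlying geometry, assuming the shared video is an equirectangular 360 panorama (a common format for 360 cameras). The function names and conventions are illustrative assumptions, not Shared Sphere's actual implementation: the remote expert's view direction maps to a region of the panorama regardless of the worker's heading, and an awareness cue can be placed using the signed angle between the two users' view directions.

```python
def equirect_pixel(yaw_deg, pitch_deg, width, height):
    """Map a view direction (yaw, pitch in degrees) to pixel coordinates
    in an equirectangular 360 panorama frame.

    Convention (assumed): yaw 0 = panorama centre, increasing rightward,
    range [-180, 180); pitch 0 = horizon, +90 = straight up.
    Because the panorama covers the full sphere, the remote expert can
    sample any direction independently of the worker's orientation.
    """
    u = (yaw_deg + 180.0) / 360.0          # horizontal fraction [0, 1)
    v = (90.0 - pitch_deg) / 180.0         # vertical fraction, top = 0
    return int(u * width) % width, min(int(v * height), height - 1)

def gaze_cue_offset(local_yaw_deg, remote_yaw_deg):
    """Signed angular difference between the two users' view directions,
    wrapped to [-180, 180); a simple basis for an on-screen cue showing
    which way the other person is looking."""
    return (remote_yaw_deg - local_yaw_deg + 180.0) % 360.0 - 180.0
```

For example, with a 3840x1920 panorama, looking straight ahead (`yaw=0, pitch=0`) maps to the frame centre `(1920, 960)`, and `gaze_cue_offset(170, -170)` correctly wraps to `20` degrees rather than `-340`.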
The Empathic Computing Laboratory at the University of South Australia (UniSA) initiated the project in early 2017, developing the concept and technology. Once the initial prototype system was ready for demonstrations, UniSA (MOD. at UniSA) partnered with SA Power Networks, Nova Systems, and PTC to collect feedback from potential end users and investigate commercialisation opportunities. Based on the feedback, use cases and requirements gathered from the commercialisation partners, the team further developed the technology. Through this collaboration, the project became focused on real-world use cases and applications, providing concrete motivation for further development of the concept and technology.
Currently the project is exploring new features such as group collaboration support, handheld 360 camera-based interaction, adaptive avatar representation, and combining 360 video with 3D reconstruction of the real-world environment, partnering also with CSIRO. The project is scheduled to finish in 2019, yet further development is anticipated through commercialisation activities.
This project has been selected as one of the Auggie Breakthrough Awards finalists. You can take a look at the nomination page here.