Update: We have published our work at the Electronic Imaging conference, and it now appears in the Proceedings of IS&T The Engineering Reality of Virtual Reality 2020. You can read our paper at arXiv.org.
In the summer of 2017, I took part in a joint internship program with the Bodylogical project team. Our goal was to design an augmented reality (or mixed reality) data visualization tool for population health management. It is an early-stage open project, and we went through many iterations of prototyping, including prototypes aimed at professional data analysts and ones aimed at individual users who want to know whether the data reflect a health concern. Looking back at all the prototypes we tried, I think it would be interesting to present some of them in this blog.
The very first prototype we implemented renders a 3D bar chart in the augmented reality environment, with basic features including loading files, selecting data, and manipulating the chart.
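The core of that prototype is just a mapping from a data table to bar positions and heights. Our actual implementation was in Unity on HoloLens; the following is a minimal Python sketch under assumed field names (`area`, `year`, `value`) and illustrative scale constants:

```python
# Minimal sketch of mapping tabular data to 3D bar transforms.
# Field names and scale constants are illustrative assumptions;
# the real prototype was built in Unity on HoloLens.
import csv

def bars_from_csv(path, cell=0.15, height_scale=0.01):
    """Return one (x, z, height) transform per data row."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    areas = sorted({r["area"] for r in rows})
    years = sorted({r["year"] for r in rows})
    bars = []
    for r in rows:
        x = areas.index(r["area"]) * cell   # one column per area
        z = years.index(r["year"]) * cell   # one row per year
        h = float(r["value"]) * height_scale
        bars.append((x, z, h))
    return bars
```

Each returned tuple would drive the position and vertical scale of one bar mesh in the scene, which is what makes "quick reference by area and year" possible: the grid layout is the index.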
Clearly, this approach has some advantages: outstanding data groups can be identified by picking out the towering bars, and it gives a quick reference to the data for a given year in a given area. But it turns out this kind of visualization has many limitations, and it offers little beyond what a regular Excel sheet could already show.
Our second attempt strove to make the best use of the environment that mixed reality provides. Instead of plainly laying out all the data on a chart, we implemented "embedded visualization" based on individual health data drawn from the population. The result was an experimental tool targeted at individual users: using image and object recognition, it shows the data relevant to a recognized object in the context of the user's health.
We also attempted to use the environmental mapping feature provided by HoloLens to let users change their surroundings by adding augmented interior decorations to the room, so as to encourage daily activities that contribute to better health.
For example, if the data indicate that the user lacks exercise, the tool can generate a "gym room" environment by adding virtual gym equipment and gym-style posters to simulate the feel of a gym. And by analyzing the HoloLens's motion data, we can detect basic exercises such as squats and reflect the user's results in the visualization.
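The squat detection boils down to watching the headset's height over time for a down-then-up dip. This is only a sketch of that idea, not our production logic; the 0.25 m dip threshold and the height samples are illustrative assumptions:

```python
# Naive squat detection from head-height samples (meters), as one
# could derive them from HoloLens head-pose data. The dip threshold
# is an illustrative assumption, not a calibrated value.
def count_squats(heights, dip=0.25):
    """Count down-then-up dips below (standing height - dip)."""
    if not heights:
        return 0
    standing = max(heights)          # assume tallest sample = standing
    low = standing - dip
    squats, down = 0, False
    for h in heights:
        if not down and h < low:
            down = True              # user dropped into a squat
        elif down and h >= standing - dip / 2:
            down = False             # user came back up: one rep
            squats += 1
    return squats
```

A real detector would smooth the signal and calibrate the standing height per user, but the state machine above captures the basic rep-counting logic.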
One of the more complete prototypes we created is a group data visualization tool, more sophisticated than our 3D bar chart example, that allows data analysts to explore data like never before. The core component of this design is a multi-dimensional scatter plot that encodes six dimensions of data using the three spatial axes, color, a trace line, and sphere size. The following picture is a good example of this tool in actual use.
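The six-dimension encoding can be sketched as a pure mapping from a record to visual channels. The field names and value ranges below are hypothetical placeholders (our tool worked with Bodylogical population health data in Unity); what matters is the channel assignment: three normalized values to position, one to color, one to sphere size, and time connected as a trace line.

```python
# Sketch of encoding six data dimensions into visual channels:
# three spatial axes, color, sphere size, and a trace line over time.
# Field names and ranges are hypothetical illustrations.
def encode_point(record, bounds):
    """Map one record to (position, color, radius)."""
    def norm(key):
        lo, hi = bounds[key]
        return (record[key] - lo) / (hi - lo)

    position = (norm("bmi"), norm("blood_pressure"), norm("glucose"))
    risk = norm("risk_score")
    color = (risk, 1.0 - risk, 0.0)          # green (healthy) -> red (at risk)
    radius = 0.02 + 0.08 * norm("age")       # sphere size encodes age
    return position, color, radius

def trace_line(records, bounds):
    """Sixth channel: connect one individual's positions over time."""
    return [encode_point(r, bounds)[0] for r in records]
```

Keeping the encoding a pure function of the record makes it easy to re-render the whole plot when the analyst filters the data or swaps which field maps to which channel.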
The user can select data files in a supported format from the OneDrive cloud and load them into the scene, then explore the data, make changes, identify groups of interest, and export them back to the desktop for further analysis.
Another feature of this prototype is the shared experience: up to five users, each wearing a HoloLens, can analyze, discuss, and manipulate the visualization together in the same place at the same time.
From a first-person view, clicking on a data point lets the user inspect the detailed information of the selected individual. If a profile that needs further attention is identified, the user can save it to a list that is ready to export.
A fun fact: this application supports both English and Japanese, because we demoed it at one of our conferences in Tokyo.
We also integrated the Daydream controller with the HoloLens, so the user can seamlessly switch between hand-gesture control and the Daydream remote, which supports 3-DoF interaction. The user experience is not quite ideal, though, due to the mismatch between 6-DoF head movement and 3-DoF control.
Designing a sensible data visualization tool that makes the best use of the mixed reality environment is a real challenge, and HoloLens is by far the best platform we have for testing these ideas. We look forward to adapting some of them to new platforms as better technology becomes available.
Hope you enjoyed this article :)