First Steps articles are brief introductions to the PEACH projects.
After the usual Christmas slowdown, the PEACH teams are back in their labs, beavering away. In our inaugural “First Steps” story, the augmented reality team (PEACH Reality) give us a brief teaser of their project. Welcome to the hot seat Tim, Laura and Fraser. Tell us about what you have been up to.
Hi there. We are a group of second-year computer scientists from University College London (UCL) working on HoloLens in medicine. At the moment, virtual and augmented reality development is mainly driven by the personal entertainment industry, but people have begun to look at other applications in fields like architecture, robotics, and of course medicine.
In our case, we are very interested in the value that augmented reality can add to routine medical imaging, particularly when preparing for complicated surgery. Most medical scans are displayed as 2-dimensional images, which are not very good at revealing the complexity of 3-dimensional anatomy.
The challenge has been to come up with a seamless pipeline from the raw CT scan data to holographic patient cases, wrapped in an intuitive user interface. There are many challenges!
We are working with the Translational Imaging Group at UCL, which kindly provided us with a deep learning system for extracting 3D meshes of body structures from CT scans. Extraction times are steadily improving as the project develops. The PEACH Reality project wraps this system in a set of tools that make up a platform for studying and sharing holographic patient cases.
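To give a feel for what "extracting body structures" means at its very simplest, here is an illustrative sketch (not the team's deep learning system, whose details are not public): a plain intensity-threshold segmentation of a synthetic CT volume, which is the classical baseline that learned methods improve on. All names and values here are our own assumptions.

```python
import numpy as np

def segment_structure(volume, lower, upper):
    """Return a binary mask of voxels whose intensity lies in [lower, upper].

    CT intensities are measured in Hounsfield units (HU); a crude bone
    segmentation might keep everything above ~200 HU. Real pipelines
    (like PEACH Reality's) use learned models instead of thresholds.
    """
    return (volume >= lower) & (volume <= upper)

# Synthetic "scan": 8x8x8 voxels of air (-1000 HU) containing a
# 3x3x3 bone-like block at 300 HU.
scan = np.full((8, 8, 8), -1000.0)
scan[2:5, 2:5, 2:5] = 300.0

bone_mask = segment_structure(scan, lower=200, upper=3000)
print(int(bone_mask.sum()))  # 27 voxels in the 3x3x3 block
```

A binary mask like this is the usual input to a surface-extraction step (e.g. marching cubes) that produces the 3D mesh displayed on the HoloLens.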
One of our key aims is to make the user experience as straightforward and unambiguous as possible. To achieve this, we’re designing a web app, an API and a Microsoft HoloLens application, which provide the user with means of completing different steps in the process of creating a holographic patient case. We hope to cater for users of varying levels of computer literacy by designing user-friendly interfaces and automating most of the non-essential tasks.
The web app is the entry point to our system. It allows users to view and modify existing holographic patient cases, create new ones, and invite colleagues to collaborate on specific cases. It communicates with the API, which does all of the heavy lifting: the API handles security, processes uploaded files, extracts body structures into 3D models, and optimises those models to improve performance.
Finally, holographic patient cases can be opened using our HoloLens application. There, users can view and interact with models and raw CT images in a mixed reality setting, annotating parts of them by attaching voice recordings or text notes. The changes can later be uploaded back to the API and exported using the web app.
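The data flowing between the HoloLens app and the API might look something like the sketch below. This is purely our own illustration: the field names, the case/annotation structure, and the endpoint mentioned in the comment are assumptions, since the real PEACH Reality schema is not published.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Annotation:
    kind: str        # "text" or "voice" (hypothetical values)
    position: tuple  # (x, y, z) anchor point on the model surface
    content: str     # the note text, or a reference to a recording

@dataclass
class PatientCase:
    case_id: str
    models: list = field(default_factory=list)       # mesh file references
    annotations: list = field(default_factory=list)  # notes attached in-headset

# A surgeon reviews a case in the HoloLens and pins a text note to a model.
case = PatientCase(case_id="case-001", models=["liver.obj"])
case.annotations.append(
    Annotation(kind="text", position=(0.1, 0.2, 0.3), content="Check margin here")
)

# Serialised case, as might be uploaded back to a hypothetical API endpoint.
payload = json.dumps(asdict(case))
print(payload)
```

Keeping annotations as small structured records alongside the mesh references, rather than baked into the meshes themselves, is what lets the web app export or display them independently of the headset.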
Ultimately, we would like our system to be integrated directly into existing hospital infrastructure (i.e. the imaging data store, or PACS). But that is a long-term goal, building on user feedback and working to regulatory requirements.
In the meantime, we plan to produce sample tools and white paper specifications to help future developers understand the challenges we encountered and the solutions we came up with.
The current PEACH Reality team are:
Tim Kuzhagaliyev - https://github.com/TimboKZ
Laura Foody - https://twitter.com/lbf_l
Fraser Savage - https://savage.engineer
We hope to hear more from the PEACH Reality team and the other PEACH teams as the projects develop. Please let us know your thoughts in the comments below.
To be kept informed of our progress, please follow us on Medium.