Using Gesture to Explore Early Heart Formation

USC Bridge Art + Science Alliance releases a new video documenting how researchers from the Keck School of Medicine are using a gestural interface and Oblong’s g-speak Spatial Operating Environment to explore early heart formation.

There is a critical need for new tools and techniques to visualize, filter, interpret, and share high-dimensional image datasets. New sensors generate such massive amounts of data from events like cell migration during cardiogenesis that traditional visualization strategies struggle to reveal the hidden patterns and narratives. John Carpenter (Oblong and USC School of Cinematic Arts MA+P) and Dr. Rusty Lansford (USC Keck School of Medicine) demonstrate that immersive visualizations and spatial navigation can play an essential role in understanding complex multidimensional datasets.

A collaborative, immersive, gesture-based data visualization of early heart formation for a 320° immersive room using Oblong’s g-speak spatial operating environment (93,312,000 pixels).

Using g-speak, Rusty and John designed a 320° immersive workspace that permits gesture-based navigation of volumetric image sets from early heart formation. The application runs in real time on five computers driving 45 screens (93,312,000 pixels), and ultrasonic emitters and a wand provide precise spatial interaction with the system. The raw data consists of multispectral 4D (xyzt) confocal microscopy image sets (11,088 images: 126 images every 7.5 minutes for 11 hours) that underwent quantitative analysis to generate a multi-dimensional data set of 460.6K points. The goal of the work was to enable the visualization of individual cell movements en masse while maintaining the integrity of the data and providing an intuitive new way to navigate and explore early cardiogenesis.
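The figures above are internally consistent, which a short sketch can verify. The script below recomputes the image count from the stated acquisition schedule and the total pixel count from the screen count; the per-panel resolution of 1920×1080 is an assumption (not stated in the article), chosen because it is the only common panel size that matches the published total.

```python
# Sanity-check the dataset and display figures quoted in the article.

# Acquisition schedule (from the article): 126 images captured
# every 7.5 minutes over an 11-hour session.
HOURS = 11
INTERVAL_MIN = 7.5
IMAGES_PER_TIMEPOINT = 126

timepoints = int(HOURS * 60 / INTERVAL_MIN)          # 88 acquisitions
total_images = timepoints * IMAGES_PER_TIMEPOINT     # 11,088 images

# Display (from the article): 45 screens totaling 93,312,000 pixels.
# ASSUMPTION: each panel is 1920x1080; this is not stated in the source,
# but it is the resolution that reproduces the published pixel total.
SCREENS = 45
PANEL_W, PANEL_H = 1920, 1080
total_pixels = SCREENS * PANEL_W * PANEL_H           # 93,312,000 pixels

print(f"{timepoints} time points, {total_images} images, {total_pixels} pixels")
```

Running it confirms 88 time points, 11,088 images, and 93,312,000 pixels, matching the numbers quoted above.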

Gesture, which is by its nature spatial, is an ideal mode for interacting with multidimensional datasets because it provides a seamless, intuitive way to connect with and navigate the datascape. Placing the data in a room-sized (human-scaled) pixel space provides new opportunities for visualization and allows exploration of the data from multiple perspectives at different spatial and temporal resolutions. The social nature of the space and the ability to have multiple users driving interactions with the system create opportunities for new forms of scientific collaboration and the communication of research.

The same software that runs in the 320° immersion room can also run on our other systems (such as the 40' media wall), providing additional options for viewing and exploring the data.

John Carpenter and Dr. Rusty Lansford were recently awarded a grant from the USC Bridge Art + Science Alliance (BASA) to document their work. This video was directed by Lily Darragh Harty and produced by Kyle McClary in collaboration with Oblong, USC, and CHLA.


Written by John Carpenter, an artist and designer whose work explores the use of gesture with complex data and spaces. He works at Oblong and teaches at the USC School of Cinematic Arts.
