Immersive data visualization: A brief update on our Journalism 360 grant

Óscar Marín Miró · Journalism 360 · Jan 1, 2018 · 5 min read

In March 2017, Knight Foundation, along with partners Google News Lab and the Online News Association, launched the Journalism 360 Challenge to advance the use of virtual reality and other immersive storytelling approaches in the field of journalism.

Happily, our project Dataverses was one of the grant winners. The project’s aim is to make it easy to integrate data visualizations into immersive storytelling through a simple platform, along the lines of TimelineJS. I’d like to thank Journalism 360 for this big opportunity and congratulate all the other grantees! You can read about the Challenge and the winning projects here.

How does the tool work?

The tool reads a public Google spreadsheet with three tabs:

  • Scene definitions. Each entry describes one data visualization or media visualization (photo or video) and can carry scene-specific parameters (e.g., a link to the dataset used by the visualization, or the media URL).
  • Link definitions. You specify links for traveling from one scene to another and position them in 3D space. The viewer will move from scene to scene just by gazing at these hotspots.
  • Label definitions. You can add some spatial labeling, useful for drawing attention to a scene, and label some “zones” of a 360 photo or video.
An example of scene definitions
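To make the spreadsheet-driven setup concrete, here is a minimal sketch in Python of how a public Google sheet tab could be fetched as CSV and parsed into scene entries. The column names (`id`, `type`, `source`) are illustrative assumptions, not the tool's actual sheet layout; the `gviz` CSV-export endpoint is a documented way to read public Google sheets.

```python
import csv
import io

# CSV-export endpoint for public Google sheets (key and tab name filled in).
EXPORT_URL = ("https://docs.google.com/spreadsheets/d/{key}/gviz/tq"
              "?tqx=out:csv&sheet={tab}")

def scene_export_url(key, tab):
    """Build the CSV-export URL for one tab of a public Google sheet."""
    return EXPORT_URL.format(key=key, tab=tab)

def parse_scenes(csv_text):
    """Parse a scene-definitions tab into a list of scene dicts.

    Column names here (id, type, source) are hypothetical.
    """
    return [dict(row) for row in csv.DictReader(io.StringIO(csv_text))]

# Example with an inline CSV string standing in for a fetched tab:
sample = "id,type,source\nintro,photo360,https://example.com/pano.jpg\n"
scenes = parse_scenes(sample)
```

In practice the viewer would fetch each of the three tabs (scenes, links, labels) this way and hand the parsed rows to the renderer.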

The tool is implemented in WebVR, using the wonderful A-Frame library, thus making the experience available on almost every VR device (Cardboard, Vive, Oculus, Daydream, desktop browser). The code will be open sourced and is being written with a modular approach in mind, so the community can code and incorporate more data visualization types.

When building an experience, the journalist or storyteller just creates a Google spreadsheet with all the definitions and submits its URL to a web page, which generates a shareable permalink for the experience.
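The permalink step can be sketched as: extract the document key from the pasted spreadsheet URL and embed it in the viewer's URL as a query parameter. The viewer base URL and parameter name below are hypothetical; the real tool may encode this differently.

```python
import re
from urllib.parse import urlencode

def extract_sheet_key(sheet_url):
    """Pull the document key out of a Google Sheets URL."""
    m = re.search(r"/spreadsheets/d/([a-zA-Z0-9_-]+)", sheet_url)
    if not m:
        raise ValueError("not a Google Sheets URL")
    return m.group(1)

def permalink(viewer_base, sheet_url):
    """Build a shareable experience URL that embeds the sheet key.

    `viewer_base` is an assumed address for the viewer page.
    """
    return viewer_base + "?" + urlencode({"sheet": extract_sheet_key(sheet_url)})
```

Anyone opening the permalink gets the same experience, because the viewer re-reads the spreadsheet on load.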

The following features have already been implemented (although a lot of work is still needed to polish the interaction and user interface).

Network visualization

A network data visualization shows relationships among entities. You can import network data from a Gephi file. On room-scale devices, you can “walk” inside the network. A third dimension is added to the network layout, which the representation can take advantage of. You can also gaze at and interact with the nodes.
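One simple way to add that third dimension is to map a node attribute onto depth. The sketch below assumes nodes arrive with a 2D layout (e.g., from Gephi) plus a numeric attribute such as degree, and normalizes that attribute into a z range; the attribute name and depth range are assumptions for illustration.

```python
def add_depth(nodes, key="degree", depth=4.0):
    """Spread a 2D layout into 3D by mapping a node attribute to z.

    `nodes` is a list of dicts with 'x', 'y' and the chosen attribute;
    the attribute range is normalized to [-depth/2, depth/2].
    """
    values = [n[key] for n in nodes]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on flat data
    return [
        {**n, "z": depth * ((n[key] - lo) / span - 0.5)}
        for n in nodes
    ]
```

With the layout in 3D, a room-scale viewer can physically step between nodes of different importance.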

A network representation with an extra dimension

Geographic visualization

You can display GeoJSON files on an Earth sphere, and attach color and text information to each geographic feature. I’m also working on street-level maps for visualizations at city scale.
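Placing GeoJSON features on a sphere comes down to the standard latitude/longitude-to-Cartesian conversion. A minimal version, using A-Frame's y-up convention (the exact axis convention in the tool is an assumption):

```python
import math

def latlon_to_xyz(lat_deg, lon_deg, radius=1.0):
    """Project a geographic coordinate onto a sphere centered at the
    origin, with y pointing up (matching A-Frame's convention)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (radius * math.cos(lat) * math.sin(lon),
            radius * math.sin(lat),
            radius * math.cos(lat) * math.cos(lon))
```

Each GeoJSON feature's coordinates go through this conversion, and the attached color/text information is rendered at the resulting 3D point.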

Earth-wide geographic data visualization

Timeline visualization

By pointing to another spreadsheet with well-formatted timeline rows (dates, titles, media and text), you can build an interactive timeline displayed as a semicircle; events (photos, Vimeo videos or plain text) pop up when the viewer gazes at each timeline milestone.

A timeline visualization can also serve as a menu: it provides brief text about each section of the experience, and the viewer can navigate to any of them without leaving VR.
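The semicircular layout can be sketched as spacing the milestones evenly from 180° (the viewer's left) to 0° (the right), at eye height in front of the camera. The radius and height defaults below are assumptions, not the tool's actual values.

```python
import math

def semicircle_positions(n, radius=3.0, height=1.6):
    """Place n timeline milestones on a semicircle in front of the
    viewer (negative z), from left (180 deg) to right (0 deg)."""
    if n == 1:
        angles = [math.pi / 2]  # a lone milestone sits dead ahead
    else:
        angles = [math.pi * (1 - i / (n - 1)) for i in range(n)]
    return [(radius * math.cos(a), height, -radius * math.sin(a))
            for a in angles]
```

Gazing at any of these positions would then expand the corresponding event's media or text.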

A timeline visualization of the example provided in TimelineJS

Isotype visualization

Simple Isotype visualization showing the world population distribution

Based on the concept of Neurath’s Isotypes, I’m experimenting with different ways to port data models into VR. For each row of data displayed, you can specify a 3D model imported from Google Blocks or just point to glTF, COLLADA or OBJ models. If the format permits, you can also set the color of the models.
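The core of an Isotype chart is turning magnitudes into repeated icons: one model instance per fixed unit of value. A minimal sketch of that conversion (the unit size and rounding rule are assumptions):

```python
def isotype_counts(rows, unit):
    """Convert (label, value) rows into repeated-icon counts, one
    icon per `unit` of value, rounded to the nearest whole icon."""
    return [(label, round(value / unit)) for label, value in rows]
```

The renderer would then instantiate that many copies of the row's chosen 3D model, arranged in a grid or line per row.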

360 photos or videos

In the scene definitions sheet, you can point to media files representing 360 photos or videos. Currently, 360 photos and 3D/single-eye 180/360 videos are supported. In the data sheet, you must specify the format of the video (3D/single-eye, 180/360, horizontal/vertical split).
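Whatever string syntax the sheet uses for the video format, the viewer has to parse it into the three independent choices (mono/stereo eye layout, 180/360 field of view, split direction). The grammar below (`"stereo-180-vertical"`) is an invented example for illustration, not the tool's actual sheet syntax.

```python
def parse_video_format(spec):
    """Parse a hypothetical format string like 'mono-360' or
    'stereo-180-vertical' into its three components."""
    parts = spec.lower().split("-")
    fmt = {"stereo": parts[0] == "stereo",
           "fov": int(parts[1]),
           "split": parts[2] if len(parts) > 2 else None}
    if fmt["fov"] not in (180, 360):
        raise ValueError("field of view must be 180 or 360")
    return fmt
```

The parsed flags determine how the video texture is mapped onto the viewing sphere and which half goes to each eye.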

An example of a 360 photo added with just a row in the master scene sheet

“Joyplot” visualization

A “joyplot” visualization shows different distributions of data rows in 3D space. You can walk through the data, highlight a specific row or “claim” the raw data. The example below is based on “Sports vs time of day” by Henrik Lindberg.
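A joyplot in 3D amounts to laying each row's distribution out as a polyline at its own depth, with values mapped to height. A minimal sketch of that layout (spacing and height scale are assumed parameters):

```python
def joyplot_points(rows, spacing=0.5, height=1.0):
    """Lay out joyplot rows as 3D polylines: each row of values
    becomes a line at its own z depth, normalized to `height`."""
    peak = max(v for row in rows for v in row) or 1.0
    lines = []
    for i, row in enumerate(rows):
        z = -i * spacing  # each row recedes further from the viewer
        lines.append([(x, height * v / peak, z)
                      for x, v in enumerate(row)])
    return lines
```

Because each row sits at a distinct depth, a room-scale viewer can walk between the distributions rather than reading them stacked on a page.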

A “joyplot” visualization showing the distribution of frequencies at different times of day for different sports

Bottom menu / media player

The tool incorporates a bottom menu button — deployed when the viewer gazes at it — that offers “back” and “home” browser functionalities inside VR and, when applicable, a media player.

The bottom menu deployed, showing a media player. Image is © Hirikilabs/Tabakalera.

Labels

In the label definitions sheet, you can place labels for each scene in 3D space. In the example below, you can see a test label that, when gazed at, shows text explaining that specific aspect of the experience.

An expanded spatial label. Photo is © Hirikilabs/Tabakalera.

Links

In the link definitions sheet, you can place links between scenes in 3D space and specify a thumbnail image, text and 3D location. When the viewer gazes at the thumbnail, they travel to the destination scene without leaving VR.
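Gaze-activated travel is usually implemented as a dwell timer: the transition fires only after the gaze has rested on the hotspot for some time, so a passing glance doesn't teleport the viewer. A sketch of that state machine (the dwell duration is an assumed parameter, not a value from the tool):

```python
class GazeLink:
    """Trigger a scene change after the viewer's gaze dwells on a
    link hotspot for `dwell` seconds."""

    def __init__(self, target_scene, dwell=1.5):
        self.target_scene = target_scene
        self.dwell = dwell
        self._gazed_since = None

    def update(self, gazing, now):
        """Call once per frame with the current gaze state and time;
        returns the target scene name once the dwell completes."""
        if not gazing:
            self._gazed_since = None  # glance ended: reset the timer
            return None
        if self._gazed_since is None:
            self._gazed_since = now
        if now - self._gazed_since >= self.dwell:
            self._gazed_since = None
            return self.target_scene
        return None
```

The same dwell pattern applies to the timeline milestones and spatial labels described above.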

An example showing a link between scenes in a 360 photo. Photo is © Hirikilabs/Tabakalera.

Next steps

There’s still a lot of work remaining. Next developments include:

  • Refining the user interface—every data visualization, every interaction and the whole experience
  • Building city-level geographic visualizations
  • Enhancing the storytelling and information-display power of Neurath’s Isotypes
  • Creating nice transitions between scenes and clearly indicating when a scene is loading
  • Completing a couple of production examples to help users understand the tool
  • Building a website to showcase the tool, the examples and detailed help and troubleshooting

In 2018 we’re going to organize hackathons to demonstrate the tool and facilitate productions. If you’re interested in this (or anything else related to this tool), just contact me via Twitter.
