Introducing Embedding.js, a Library for Data-Driven Environments

Data and its visual presentation have become central to our understanding of the world, and yet so many visualizations prioritize bling over communication. The fear, and it is justified, is that VR will merely exacerbate the problem, unleashing new and nauseating ways to deliver empty visual calories rather than a meaningful increase in expressive power.

  • We are not merely seeing and exploring these environments, but reaching out and manipulating them; we are not ghosts in the data machine, but actors participating in our surroundings.
  • It is responsive, so that any embedding environment will Just Work without modification on the desktop, with touch, and in WebVR.
  • Its compatibility and reach will track that of WebVR as a whole, which means it will likely work on most major browsers by the end of 2017 (see here to track progress and get more info). WebVR (and therefore Embedding in VR mode) is already supported on Windows in Chrome and Firefox via special builds of those browsers.
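To make "Just Work" concrete: at the time, the standard way to check for VR capability was the WebVR 1.1 API's `navigator.getVRDisplays()`. The sketch below is not Embedding.js code, just a minimal illustration of how a library can feature-detect WebVR and fall back to desktop or touch rendering; the function name is my own.

```javascript
// Hypothetical sketch (not the Embedding.js API): detect WebVR 1.1 support,
// resolving to true only when at least one VR display is actually connected.
function detectWebVR() {
  // Outside a browser, or in a browser without WebVR, fall back gracefully
  // to the desktop/touch rendering path.
  if (typeof navigator === 'undefined' || !navigator.getVRDisplays) {
    return Promise.resolve(false);
  }
  return navigator.getVRDisplays().then(displays => displays.length > 0);
}

// Usage: pick a render path once, at startup.
detectWebVR().then(hasVR => {
  console.log(hasVR ? 'entering VR mode' : 'falling back to desktop/touch');
});
```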

Spatiotemporal Embeddings

The key concept and abstraction in the library is — wait for it — the embedding of a dataset in time and space. These embeddings can be static, or they can represent various temporal aspects of the data generation, gathering, and transformation processes. The most immediate example is a scatter plot embedded in three dimensions, in which each point corresponds to an observation in the dataset, and position and other properties are derived from attributes/columns/features of the data.
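The core of that mapping is ordinary scale math. The sketch below shows one way it could work, assuming a dataset as an array of row objects and a linear scale per axis; the function names and the default 10-unit scene extent are my inventions for illustration, not the library's API.

```javascript
// Hypothetical sketch: derive 3D positions for a scatter embedding by
// linearly scaling three data columns into scene coordinates.
function linearScale(dMin, dMax, rMin, rMax) {
  return v => rMin + ((v - dMin) / (dMax - dMin)) * (rMax - rMin);
}

// rows: array of observations; mapping: which column drives which axis.
function embedScatter(rows, { x, y, z }, extent = 10) {
  const scaleFor = key => {
    const vals = rows.map(r => r[key]);
    // Map the column's observed range onto a scene span centered at the origin.
    return linearScale(Math.min(...vals), Math.max(...vals), -extent / 2, extent / 2);
  };
  const [sx, sy, sz] = [x, y, z].map(scaleFor);
  return rows.map(r => ({ x: sx(r[x]), y: sy(r[y]), z: sz(r[z]) }));
}

// Usage: two observations mapped onto opposite corners of the scene.
const pts = embedScatter(
  [{ a: 0, b: 0, c: 0 }, { a: 1, b: 2, c: 4 }],
  { x: 'a', y: 'b', z: 'c' }
);
// pts[0] is {x: -5, y: -5, z: -5}; pts[1] is {x: 5, y: 5, z: 5}
```

Color, size, and other visual channels follow the same pattern: one scale per data attribute, applied per observation.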

Why Immersion Matters

What is different about immersive environments, and how can those differences be used to transcend the limits of traditional visualizations in understanding data?

Conceptual metaphor

We humans have great capacity for abstract thought, yet we do so with brains that are, in terms of their physiology and functional organization, very similar to other mammals who do not seem to possess these abilities. This is because the human brain leverages neural structures and circuitry that originally evolved to handle spatial and social cognition for other, more flexible aims. Conceptual metaphor is the body of theory that describes how we use isomorphic mappings from concrete to abstract domains to reason about everything from long-term relationships to data structures. These mechanisms are not exceptions or special cases, but are ubiquitous in our thinking. The question for immersive data visualization is therefore not whether to use conceptual metaphors, but which ones to use and how to execute them.

Multisensory cues

Humans are visual creatures, but much of the richness of our everyday experience stems from the fact that our eyes, ears, skin, and other sensory organs provide different kinds of information about the environment — and that, in normal situations, those signals are consonant with and reinforce one another. It is very likely that appropriate, spatialized sound design and haptic feedback can be leveraged to significantly increase the level of understanding provided by a data-driven environment, just as VR developers have discovered to be true in other application domains.
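A tiny example of what "spatialized" means in practice: a data point's horizontal position can drive its stereo pan. The mapping below is a sketch of my own, not anything in Embedding.js; in a browser the resulting value would feed the Web Audio API's `StereoPannerNode` (via `audioCtx.createStereoPanner()`), whose `pan` parameter expects a value in [-1, 1].

```javascript
// Hypothetical sketch: map a point's x position in the scene to a stereo pan
// value in [-1, 1], the range a Web Audio StereoPannerNode expects.
function panForPosition(xPos, sceneHalfWidth) {
  const pan = xPos / sceneHalfWidth;
  return Math.max(-1, Math.min(1, pan)); // clamp to the valid pan range
}

// Usage: a point halfway to the right edge sounds half-right;
// points outside the scene are clamped to the edges.
panForPosition(5, 10);  // 0.5
panForPosition(20, 10); // 1
```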

Motion parallax

Motion parallax: a little goes a long way.

Landmarks and navigational cues

I really feel like I’ve been to Castle Hohenrechberg — but it was just in VR.



Now and Next

Embedding.js is ready to play with, though much remains to be done. I’d like to invite the adventurous to jump in and create new environments right now. And I’d be positively thrilled if some of you were moved to help out and contribute — there’s a rough roadmap, but it’s just a sketch.

