Devices All The Way Down

A look back at my NSF CAREER project.

Photo by Hal Gatewood on Unsplash.

Good news sometimes strikes at the most unlikely moments. In November 2012, while my wife and I were out at a museum in Indianapolis, she had the misfortune of witnessing me strut out of a public restroom with my arms raised like a returning conqueror. I had just received news that my NSF CAREER proposal had been awarded.

So it was with a bittersweet feeling that, almost eight years later, I just submitted the final report for my NSF CAREER project: “Ubilytics: Harnessing Existing Device Ecosystems for Anywhere Sensemaking”. While it was not my first NSF grant, and while I had already spent nearly four years as a faculty member at Purdue when it was awarded, that project kickstarted my career in more ways than one.

Proposals are useful things. Sure, the most immediate and obvious benefit is the off chance that they yield you the money to actually carry out the work you are proposing to do. However, a less visible benefit — and one that does not depend on you actually landing the grant — is that they are mental devices that help you plan your future research.

I’ve never been very good at the whole world domination thing: sitting down to plot a grand strategy like Julius Caesar, Jeff Bezos, or Frank Underwood in House of Cards. Instead, I tend to take a relatively short-sighted, one-deadline-at-a-time view of my research. Having to write a grant proposal forces me to raise my gaze from what is right in front of me to what lies several years into the future. In the case of the NSF CAREER program, that means looking at least five years (in my case, seven) into the unknown. For this reason, my Ubilytics project was extremely formative to my thinking about my own research agenda.

When you submit a final report, the NSF also asks for a 200–800-word document describing the outcomes of the work in an approachable way. As I did for my recently completed C3DaR grant, I decided to publish this outcomes document here as well. That narrative follows below.

The Ubilytics project posed the fundamental question of how we can use the existing ecosystem of networked devices in our surroundings to understand and exploit massive, heterogeneous, and multi-scale data anywhere and at any time. Assembling these devices into unified sensemaking environments will enable deep analytical reasoning in the field, such as managing heterogeneous data in scientific lab notebooks, scaffolding undergraduate learning, and supporting investigative reporting by linking facts, findings, and evidence. On a higher level, this concept would stimulate our digital economy by supporting fields such as design and creativity, command and control, and scientific discovery.

However, despite this ready access to myriad devices both small — smartphones, tablets, and smartwatches — and large — desktop computers, multitouch displays, and mixed reality headsets — each individual device is currently designed to be the focus of attention, and cannot easily be combined with other devices to improve productivity. Another limiting factor is that interfaces and visual representations are typically designed for a particular form factor, and adapt poorly to new settings when migrating between devices. Finally, the computational and storage resources of most mobile devices are insufficient for the complex analyses necessary as well as for persistently storing session state across fluctuating ensembles of devices.

Ubiquitous analytics — or ubilytics — is a comprehensive new approach for harnessing these ever-present digital devices into unified environments for anywhere analysis and sensemaking of data. It draws on human-computer interaction, visual analytics, and ubiquitous computing as well as a synthesis of distributed, extended, and embodied cognition. These theories challenge traditional cognitive science by insisting that cognition is not limited to the brain, but also involves the body, the physical world, and the sociocultural context. Instead of studying cognitive aids in isolation, ubilytics therefore takes a system-level view of cognition that engages different representational media — such as humans, physical artifacts, mobile devices, and large displays — as well as the interactions used to bring these media into coordination with each other — such as verbal and gestural cues, touching and sketching, partitioning and arranging, and note-taking and annotation. Thus, ubilytic environments benefit sensemaking by distributing cognitive aids in space and time; by off-loading memory, deduction, and reasoning; by harnessing our innate perceptual, cognitive, motor, spatial, and social skills; and by multiplying interaction and display surfaces.

The research in the ubilytics project revolved around three themes: (1) universal interaction, (2) flexible visual structures, and (3) efficient distributed architectures. Below I give examples of ubilytics research projects that span all three themes.

Central to ubiquitous analytics is Vistrates, a component model and a literate computing platform for developing, assembling, and sharing visualization components. Built on top of the Webstrates and Codestrates open source projects, Vistrates features cross-cutting components for visual representation, interaction, collaboration, and device responsiveness maintained in a component repository. The environment is collaborative and allows novices and experts alike to compose component pipelines for specific analytical activities. This allows for easily creating cross-platform and cross-device visualization applications on any device capable of running modern web technology, as well as integrating existing web-based visualization resources such as D3, Vega-Lite, and Plot.ly into these applications.

  • Sriram Karthik Badam, Andreas Mathisen, Roman Rädle, Clemens Nylandsted Klokmose, Niklas Elmqvist. Vistrates: A Component Model for Ubiquitous Analytics. IEEE Transactions on Visualization & Computer Graphics, 25(1):586–596, 2019. [PDF]
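To make the pipeline idea concrete, here is a minimal, hypothetical sketch of how components might be composed — this is an illustration of the general concept, not the actual Vistrates API (which is web-based and collaborative):

```python
# Hypothetical sketch of a component pipeline (NOT the Vistrates API):
# each component wraps a function, and a pipeline chains them so that
# one component's output becomes the next component's input.

class Component:
    def __init__(self, name, fn):
        self.name = name   # human-readable label for the component
        self.fn = fn       # transformation applied to incoming data

    def run(self, data):
        return self.fn(data)

class Pipeline:
    """Chain components end to end, like a data source feeding a
    transform feeding a view."""
    def __init__(self, *components):
        self.components = components

    def run(self, data):
        for component in self.components:
            data = component.run(data)
        return data

# Example: data source -> filter transform -> summary "view"
source = Component("source", lambda _: [3, 1, 4, 1, 5, 9, 2, 6])
keep_large = Component("filter", lambda xs: [x for x in xs if x >= 3])
summarize = Component("view", lambda xs: {"count": len(xs), "max": max(xs)})

result = Pipeline(source, keep_large, summarize).run(None)
print(result)  # {'count': 5, 'max': 9}
```

In the real system, components of this kind would also carry their visual representation and synchronize state across collaborators and devices; the sketch only shows the composition principle.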

With Vistrates as a foundation, we looked at the practicalities of actually distributing different views of a visualization application across multiple devices in the Vistribute project. By characterizing each view based on its preferred size, position, and relation to other views, Vistribute automatically calculates a layout across the available display surfaces in a ubiquitous sensemaking environment. This layout can change as devices enter and leave the environment, such as when powering up a laptop at a meeting, or pocketing a smartphone that is no longer needed.

  • Tom Horak, Andreas Mathisen, Clemens Nylandsted Klokmose, Raimund Dachselt, Niklas Elmqvist. Vistribute: Distributing Interactive Visualizations in Dynamic Multi-Device Setups. In Proceedings of the ACM Conference on Human Factors in Computing Systems, paper no. 616 (13 pages), 2019. [PDF]
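The view-distribution problem can be illustrated with a toy greedy assignment — again a hypothetical sketch, not Vistribute's actual algorithm, which reasons about view properties and relationships in much more detail:

```python
# Toy illustration of distributing views across displays (NOT the
# actual Vistribute algorithm): greedily place the largest views first,
# each on the display with the most remaining free area. When a device
# enters or leaves the ensemble, the layout is simply recomputed.

def assign_views(views, displays):
    """views: {view name: preferred area};
    displays: {display name: available area}.
    Returns a {view: display} layout."""
    remaining = dict(displays)  # free area left on each display
    layout = {}
    for view, area in sorted(views.items(), key=lambda kv: -kv[1]):
        # pick the display with the most free space left
        best = max(remaining, key=remaining.get)
        layout[view] = best
        remaining[best] -= area
    return layout

views = {"map": 60, "timeline": 30, "details": 10}
displays = {"wall": 100, "tablet": 50}

print(assign_views(views, displays))
# {'map': 'wall', 'timeline': 'tablet', 'details': 'wall'}

# When the tablet is pocketed, everything migrates to the wall display:
print(assign_views(views, {"wall": 100}))
```

Recomputing the full layout on every device change is the simplest possible policy; a production system would also weigh view positions, inter-view relationships, and layout stability so that views do not jump around unnecessarily.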

Looking to the future, I have increasingly been investigating how to use ubiquitous analytics for situated sensemaking, such as when in the field and on the go. Several projects have contributed to this development. In There Is No Spoon, collaborators and I deployed the ImAxes virtual reality system for multidimensional analysis at a U.S. federal agency for an entire year and observed how economic analysts and data scientists interacted with it for their own datasets. In VisHive, we studied how to build opportunistic and ad-hoc computational clouds using only web-based technology and mobile devices. And in ongoing work we are making a foray into mixed and augmented reality to support situated visualization for ubiquitous analytics in the recently funded DataWorld NSF project.

  • Bruce Thomas, Yvonne Jansen, Aurelien Tabard, Pierre Dragicevic, Niklas Elmqvist, Pourang Irani, Dieter Schmalstieg, Gregory Welch. Situated Analytics. In Immersive Analytics, Lecture Notes in Computer Science, vol. 11190, Springer, 2018.
  • Andrea Batch, Andrew Cunningham, Maxime Cordeil, Niklas Elmqvist, Tim Dwyer, Bruce H. Thomas, Kim Marriott. There Is No Spoon: Evaluating Performance, Space Use, and Presence with Expert Domain Users in Immersive Analytics. IEEE Transactions on Visualization & Computer Graphics, 26(1):536–546, 2020. [PDF]
  • Zhe Cui, Shivalik Sen, Sriram Karthik Badam, Niklas Elmqvist. VisHive: Supporting Web-based Visualization through Ad-hoc Computational Clusters of Mobile Devices. Information Visualization, 18(2):195–210, 2019. [PDF]

Beyond the above sample projects, my Ubilytics CAREER project yielded a total of 14 publications in well-regarded journals and conferences such as IEEE InfoVis, ACM CHI, and IEEE VAST. At least four students were funded on the project over the years, and one — Karthik Badam, the architect behind Vistrates — completed his Ph.D. on this exact topic. While it is perhaps too early to assess the impact of the project on the scientific community as a whole, it doubtlessly had a transformative impact on at least one individual: me. I thank the NSF, the anonymous panelists, and — ultimately — the U.S. taxpayers for the opportunities this grant gave me.

--

Niklas Elmqvist
Sparks of Innovation: Stories from the HCIL

Professor in visualization and human-computer interaction at Aarhus University in Aarhus, Denmark.