Visualizing the News with Watson Discovery

Jenny Woo
IBM Design
Jun 10, 2019

This article is co-authored by John Carpenter, artist and spatial interaction designer at Oblong Industries.

Our ongoing exploration of the ways that IBM Watson® services can make sense of large datasets led us to build a demonstration that explores a live corpus of English-language news stories from around the world. On an average weekday, approximately 20,000 stories are fed into our system, the Discovery News application. Processing all of this unstructured information quickly requires the ability to understand language at scale.

The team’s goal for this project was to broaden people’s typical field of vision for news, to encourage curiosity, and to facilitate research within the ever-expanding landscape of news. We wanted to invigorate the familiar routine of browsing a daily news feed while demonstrating the power of a Watson service, Watson Discovery News. One of the strengths of this service is its ability to analyze unstructured text and extract metadata from content, made possible by embedded natural language processing capabilities. The service provides data such as concepts, entities, sentiments, semantic roles, authors, publication dates, and relevant keywords with associated confidence scores. Visualizing news stories using this information helps users find stories they might not have found in a typical day of browsing the news.
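For readers who want to poke at the same underlying data, here is a minimal sketch of querying Watson Discovery News with the Python SDK (ibm-watson, Discovery V1). The API key, service URL, and version date are placeholders, and the built-in English news collection is typically addressed with environment_id "system" and collection_id "news-en"; treat the details as assumptions rather than the application’s actual code.

```python
# Minimal sketch: querying Watson Discovery News for enriched story metadata.
# Assumes the ibm-watson Python SDK (Discovery V1) and the built-in news
# collection (environment_id="system", collection_id="news-en").
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_watson import DiscoveryV1

authenticator = IAMAuthenticator("YOUR_API_KEY")           # placeholder credential
discovery = DiscoveryV1(version="2019-04-30", authenticator=authenticator)
discovery.set_service_url("YOUR_SERVICE_URL")              # placeholder URL

response = discovery.query(
    environment_id="system",        # the pre-enriched Watson Discovery News corpus
    collection_id="news-en",        # English-language news collection
    natural_language_query="artificial intelligence",
    count=5,
).get_result()

for story in response["results"]:
    enriched = story.get("enriched_text", {})
    print(story.get("title"))
    print("  concepts :", [c["text"] for c in enriched.get("concepts", [])][:5])
    print("  entities :", [e["text"] for e in enriched.get("entities", [])][:5])
    print("  sentiment:", enriched.get("sentiment", {}).get("document", {}).get("label"))
```

Each result carries the same kinds of enrichments the visualization relies on: concepts, entities, and document-level sentiment.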

The Immersion Room

The application is designed for a large-scale 320° immersive room, and visitors are invited to experience the news inside it. Imagine controlling a UI that is an entire room, as opposed to a browser window. The software is built in Oblong’s g-speak spatial operating environment and runs in real time across 5 computers and 45 screens, which means we’re driving 93 million spatially and temporally synced pixels. Another advantage of the spatial operating environment is that it provides true spatial tracking for interactions, allowing input events (like 3D pointing or a 4D gesture) to pass seamlessly into the application’s environment. This is useful both for navigating large, complex spatial datasets and for interacting with the immersive, dynamic UI.

The Immersion Room is designed for collaborative group engagement. While one person drives the UI with a spatially tracked ultrasonic wand, additional inputs, such as mobile devices, are integrated into the system so that visitors can conduct searches within the application. The wraparound space, dynamic interface, and ease of discovering new content have been found to facilitate discussion and generate insight into the day’s events among our visitors.

Data Visualization and UI

Given this large-scale form factor and the very real possibility that visitors will not know where to focus upon entering the space, we decided to anchor the stories to their positions on the globe using locations extracted from the articles. By plotting points geographically, we were able to see concentrated areas of news activity and identify locations of high activity over time.
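Under the hood, that anchoring amounts to turning each story’s extracted latitude and longitude into a point on a sphere. The sketch below shows the standard conversion; the field names and globe radius are illustrative, since the production rendering lives inside g-speak rather than Python.

```python
# Sketch: anchoring a story to a point on a 3D globe from its extracted
# latitude/longitude. Field names (lat, lon) and the globe radius are illustrative.
import math

def latlon_to_xyz(lat_deg: float, lon_deg: float, radius: float = 1.0):
    """Convert latitude/longitude in degrees to Cartesian coordinates on a sphere."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.sin(lat)                  # y points toward the north pole
    z = -radius * math.cos(lat) * math.sin(lon)
    return (x, y, z)

# A hypothetical story with a location enrichment attached:
story = {"title": "Example headline", "lat": 40.71, "lon": -74.01}  # New York City
print(story["title"], "->", latlon_to_xyz(story["lat"], story["lon"]))
```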

Next, we extracted concepts, entities, semantic roles, and relations, along with sentiment, from the daily feed of news stories. These relationships are mapped and visualized to reveal the rate of co-occurrence between topics as well as the average sentiment. The more frequently a concept or entity appears in the news, the more prominent it becomes in the UI. Very simply, the frequency of those topics and the positive or negative feelings associated with them give us a quick read on what a given day’s news is about.
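As a rough illustration of the bookkeeping involved, the sketch below tallies concept frequency, pairwise co-occurrence, and average sentiment from a batch of enriched stories. The field names mirror the Discovery enrichments shown earlier, but the aggregation itself is a simplified stand-in, not the application’s production code.

```python
# Sketch: tallying concept frequency, concept co-occurrence, and average
# sentiment from a batch of enriched stories. Illustrative only.
from collections import Counter, defaultdict
from itertools import combinations

def summarize(stories):
    freq = Counter()                       # how often each concept appears
    cooccur = Counter()                    # how often two concepts appear together
    sentiment_sum = defaultdict(float)     # running sentiment score per concept
    for story in stories:
        enriched = story.get("enriched_text", {})
        concepts = sorted({c["text"] for c in enriched.get("concepts", [])})
        score = enriched.get("sentiment", {}).get("document", {}).get("score", 0.0)
        freq.update(concepts)
        cooccur.update(combinations(concepts, 2))
        for concept in concepts:
            sentiment_sum[concept] += score
    avg_sentiment = {c: sentiment_sum[c] / freq[c] for c in freq}
    return freq, cooccur, avg_sentiment

# Frequency drives visual prominence, co-occurrence drives the relationship map,
# and average sentiment drives the positive/negative coloring.
```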

A simple control panel allows users to easily explore this rich live dataset. The flexible UI accommodates a variety of views as users filter on concepts and sentiment, see spikes of high activity, explore the global reach of a concept, and drill down to a single news story. Users are encouraged to fully explore and immerse themselves in the room-sized visualizations. When designing and building this application, the team opted to give users free rein to go inside of, and even pass through, the globe, progressively revealing new concepts and metadata as they move closer.

UI elements unfold in the space to facilitate exploration. When a concept is selected, the control panel opens up to a related-concepts diagram, which opens up to a scrollable grid of filtered news stories, which in turn launches a real-time web view of the current story. With this system, users can quickly navigate from 20,000 stories down to just a few stories of interest with a couple of gestures and clicks.
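That drill-down maps naturally onto progressively narrower Discovery queries. The sketch below filters the news collection by an enriched entity and by document sentiment; the filter syntax follows the Discovery V1 query language, but the specific fields, values, and sort key are assumptions for illustration.

```python
# Sketch: narrowing from the full corpus to a handful of stories by filtering
# on an enriched entity and on document sentiment. Illustrative only.
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_watson import DiscoveryV1

discovery = DiscoveryV1(version="2019-04-30",
                        authenticator=IAMAuthenticator("YOUR_API_KEY"))
discovery.set_service_url("YOUR_SERVICE_URL")   # placeholder URL

filtered = discovery.query(
    environment_id="system",
    collection_id="news-en",
    filter='enriched_text.entities.text:"IBM",enriched_text.sentiment.document.label:positive',
    count=10,
    sort="-crawl_date",                         # newest stories first
).get_result()

for story in filtered["results"]:
    print(story.get("title"), "-", story.get("url"))
```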

The highly dynamic UI of the Discovery News application provides users with myriad ways to explore different facets of an enormous body of aggregated news stories, inspiring a vision of how quickly teams can use this Watson service to find patterns and trends on topics they care about.

This project was a collaboration with the IBM Immersive Experiences team and Oblong Industries. It can be viewed in a Watson Experience Center.

Jenny Woo is a Design Lead of Watson Immersive Experiences at IBM, based in New York City. The above article is personal and does not necessarily represent IBM’s positions, strategies or opinions.
