Since I began working with Watson, I’ve always been interested in how it intersects with creative fields: fine art, design, film, and music.
I had the opportunity to do some initial multimedia work on the TED & Watson partnership. It got me thinking: what experience would I want, and what media property do I love that spans written, visual, and film media?
Teaching Watson to read comics
Tracing comics and reading superheroes were my first art school, so it only makes sense. I knew I needed a comic that was dense, had a large cast and rich historical background, and had been adapted to film or TV.
I decided to use one of my favorite comics, The New Frontier, written and drawn by Darwyn Cooke and colored by Dave Stewart. The comic has a rich narrative spanning decades, is beautifully illustrated and meticulous in its execution, and is a rare instance of a comic with a real-world setting and true historical context. We also pull in story background, insights, conceptual notes, and motivations from the creator’s own words and interviews, something particularly interesting where art is concerned: it enables Watson to guide the user with the artist’s own perspective.
The focus was to create an experience that allows a user to see something they love in a new way. As they read the comic, they can explore the tones, themes, and concepts of each panel and page on a second screen. Watson can pull in contextual data from disparate sources and let them get a better understanding of the complicated history and tones within the story.
See the work in a new light
They can explore The New Frontier by focusing on characters, seeing the motivations, personality, and interconnections of each. They can also explore by emotion and sentiment, as well as by visual trends: colors, composition, and more.
The idea is to allow the user to rediscover a favorite work. Maybe there are trends in a creator’s visual storytelling that are hard to see in a page-to-page reading; that is what interested me in exploring this idea. Watson can introduce a level of abstraction not observed in a linear reading of the story.
Tapping into these storytelling trends, the reader can now see the tone, emotions, concepts, and all associated content for every panel. Watson helps them see how the story evolves from page to page and panel to panel, surfacing ideas related to whatever they are viewing, whether a full page or a single panel.
By combining off-the-shelf Watson APIs like Natural Language Understanding, Tone Analyzer, Personality Insights, and Visual Recognition, Watson can access the unstructured data within the comic panels themselves. We begin to build a knowledge graph based on the entire corpus of the comic, mapping the characters, art, and concepts within the story. Using Watson Knowledge Studio and the Discovery service, the user can access concepts and themes within the story and begin finding associated ones from other sources, like wikis and external content.
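As a rough sketch of how that knowledge graph might take shape: the service names above are real Watson APIs, but the per-panel results below are a simplified, illustrative stand-in for what Natural Language Understanding would actually return (a real call through the ibm-watson SDK yields much richer metadata). The sample characters and concepts are hypothetical placeholders, not output from the project.

```python
from collections import defaultdict

# Illustrative stand-in for per-panel analysis results (entities and
# concepts) such as Watson Natural Language Understanding might return.
panel_analyses = [
    {"page": 1, "entities": ["Hal Jordan", "Korea"],
     "concepts": ["Cold War"]},
    {"page": 1, "entities": ["Hal Jordan", "Carol Ferris"],
     "concepts": ["aviation"]},
    {"page": 2, "entities": ["Martian Manhunter"],
     "concepts": ["Cold War", "paranoia"]},
]

def build_knowledge_graph(analyses):
    """Build a minimal co-occurrence graph: each entity or concept
    maps to the set of terms it shares a panel with."""
    graph = defaultdict(set)
    for panel in analyses:
        terms = panel["entities"] + panel["concepts"]
        for term in terms:
            graph[term].update(t for t in terms if t != term)
    return graph

graph = build_knowledge_graph(panel_analyses)
# "Hal Jordan" now links to Korea, the Cold War, Carol Ferris, and aviation.
```

A production version would replace the mocked results with live NLU responses and layer Discovery queries on top of the graph, but the basic shape — panels in, linked concepts out — stays the same.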
Watson analyzes the characters throughout the entire book and lets the user explore every one of them — they can look at the characters on their current page or study the whole book by character. They can click through to read more and get a brief overview of each character’s personality and motivations.
Throughout the experience, Watson has connected the user to additional content. The user can view a video of the artist and creator discussing the particular page they are viewing or can browse the concepts and learn more about each one. The user can get a better understanding of the backstory of specific concepts or better familiarity with the world in context.
The power of Watson is particularly compelling when discovering insights about obscure or unfamiliar topics and characters. The reader gains better context as to why a character behaves a certain way or how a historical moment is central to the story.
I branded this a cognitive exhibit in hopes that a simple demonstration will inspire people to look at less obvious integrations of Watson — building up core abilities in this space can create a softer, more approachable public face. By pulling together proprietary data sources and media, Watson can act as a digital curator for companies, letting their fans get closer to the work and making that work more accessible.
I am very interested in how AI systems can enrich our non-technical lives. I wanted to explore how we can enable Watson to understand something as complex and nuanced as visual stories, but also to challenge a non-technical person (*me*) to develop and validate a real-data prototype with Watson. The experimental POC demonstrated an entirely new way Watson can add value and clarity to another form of unstructured data, and defined how such an experience could be structured for numerous multimedia companies and institutions.
Rob Harrigan is a Watson Prototypes Design Lead at IBM based in Astor Place. The above article is personal and does not necessarily represent IBM’s positions, strategies or opinions. The POC above represents self-initiated, non-commercial work.
The New Frontier, characters, and content © 2017 DC Entertainment.