Published in PC Magazine

Inside the Cutting-Edge Data Technology Behind the US Open

IBM took us inside the cloud services and predictive analytics behind the tournament—and showed off the Watson power behind IBM SlamTracker and new Cognitive Highlights.

By Rob Marvin

Data is a huge part of what makes sports so interactive. It has evolved from baseball card stats and marking scorecards to SportsCenter and fantasy sports, and now mobile apps and real-time stat tracking with predictive analytics. Sporting events such as the US Open Tennis Championship have had to evolve along with these trends.

IBM has been the US Open’s technology partner for more than a quarter century, but this is the first time Watson was invited to the party. IBM gave PCMag a behind-the-scenes tour of its data command center at the USTA Billie Jean King National Tennis Center in Flushing Meadows, New York, to see how it’s equipping everyone at the stadium with real-time player and match data, from broadcasters and line judges to statisticians and fans — all while using Watson cognitive services to improve different aspects of the fan experience.

The US Open website and mobile apps for Android and iOS are already running on IBM’s cloud infrastructure, and there are a handful of other Watson application programming interfaces (APIs) from the IBM Bluemix hybrid cloud platform integrated in different ways.

At the 2017 US Open, the company announced a new solution called IBM Watson Media. Featuring a new AI tool called Cognitive Highlights, it analyzes match videos with machine learning and cuts the best clips into highlight videos. IBM also rolled out a revamped mobile app featuring significant natural language and conversational improvements to its Cognitive Concierge chatbot, which I tried out on iOS.

Then there’s the Speech-to-Text API that automatically generates subtitles for all the video clips and interviews from the tournament. And the Visual Recognition API runs facial-recognition analysis on every photo taken by USTA photographers, and tags players and celebrities.

IBM packed more Big Data and predictive analytics into the tournament app as well with IBM SlamTracker analytics, which pulls in both real-time and historical player, match, and tournament data to generate responsive data visualizations predicting the outcome of sets, break points, and other pressure situations based on data patterns. This year, SlamTracker is even analyzing player and ball position data.

Real-time data, cloud-computing power, and Watson’s cognitive services are pulsing under the surface throughout the US Open, but here are the coolest examples we found while wandering around behind the scenes.

The newly announced IBM Watson Media solution comes with all sorts of bells and whistles. It performs metadata content searches and makes AI-informed recommendations on video content, along with speech-to-text analysis for closed captioning and a “spotlighting” feature to identify violent or adult content that might require further screening.

At the Open, Cognitive Highlights identifies each match’s most important moments by analyzing historical data, crowd sounds and reactions, and players’ facial expressions. Watson then ranks and auto-curates the highlights for the video production team and cuts them into highlight packages.
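IBM hasn’t published how Cognitive Highlights actually scores a moment, but conceptually the ranking step works like a weighted blend of signals. Here’s a minimal sketch; the signal names, weights, and clip data are illustrative assumptions, not IBM’s model:

```python
# Hypothetical sketch of how a system like Cognitive Highlights might rank
# match moments. Signal names and weights are invented for illustration.

def excitement_score(moment, weights=(0.5, 0.3, 0.2)):
    """Combine crowd-noise, player-reaction, and match-context signals
    (each normalized to 0..1) into a single excitement score."""
    w_crowd, w_face, w_context = weights
    return (w_crowd * moment["crowd_noise"]
            + w_face * moment["player_reaction"]
            + w_context * moment["match_context"])

def top_highlights(moments, n=3):
    """Return the n highest-scoring moments for the production team."""
    return sorted(moments, key=excitement_score, reverse=True)[:n]

moments = [
    {"clip": "break_point_save", "crowd_noise": 0.9, "player_reaction": 0.8, "match_context": 0.95},
    {"clip": "routine_hold",     "crowd_noise": 0.3, "player_reaction": 0.2, "match_context": 0.40},
    {"clip": "match_point",      "crowd_noise": 1.0, "player_reaction": 0.9, "match_context": 1.00},
]
ranked = top_highlights(moments, n=2)  # match point first, then the break-point save
```

The real system presumably learns these weights from historical data rather than hand-tuning them, but the shape of the pipeline is the same: score every candidate moment, sort, and hand the top clips to the editors.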

IBM piloted the tech at this year’s Masters golf and Wimbledon tennis tournaments, but the US Open marks a public partnership with the US Tennis Association (USTA) and IBM Watson Media’s coming-out party as a business product. Each day during the tournament, the USTA will post Watson’s Highlight of the Day on its Facebook page, and fans who mark favorite players on the app will get push notifications with player highlights cut together by Watson Media.

The player and match stats and graphics on these screens are fed in real time to the broadcasters and commentators announcing each match in the broadcast booth. If John McEnroe throws out an interesting stat about Andy Murray’s US Open performances over the past 10 years, it’s because the command center sent him the data.

There are plenty of monitors in the command center, but no racks of servers. Everything runs through IBM Cloud, which has a multi-active architecture of seven public and private data center locations to scale up or down based on tournament activity. This also supports Watson for Cyber Security, which helps the security analysts monitoring US Open digital platforms for hacks and breaches by mining unstructured data and cybersecurity research for proactive threat detection and endpoint protection.

Workers throughout the stadium, from line judges and stadium personnel to statisticians and media correspondents, can access real-time scores, serve speed radar, in/out ball locations, and a host of other metrics throughout the game using these IBM-powered USTA tablet apps, which aggregate sensor data from around the court.

Watson’s Visual Recognition API processes every official photo taken and uploaded to the USTA’s publishing tool, and scans it for facial recognition to quickly identify the players on the court and scan for any celebrities in the stands before the photo is officially published.
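The tag-and-filter step downstream of the face recognition can be pictured as a simple confidence gate. This is a toy sketch, not the actual USTA publishing pipeline; the player roster, the names, and the threshold are all invented for the example:

```python
# Illustrative sketch of the photo-tagging step: given face matches returned
# by a recognition service, keep only confident matches and turn them into
# publishing tags. Names, roster, and threshold are hypothetical.

KNOWN_PLAYERS = {"Rafael Nadal", "Venus Williams"}

def tags_for_photo(face_matches, threshold=0.8):
    """face_matches: list of (name, confidence) pairs from face recognition."""
    tags = []
    for name, confidence in face_matches:
        if confidence < threshold:
            continue  # too uncertain to auto-tag before publication
        role = "player" if name in KNOWN_PLAYERS else "celebrity"
        tags.append({"name": name, "role": role})
    return tags

matches = [("Rafael Nadal", 0.97), ("Unknown Fan", 0.42), ("Oprah Winfrey", 0.91)]
tags = tags_for_photo(matches)  # the low-confidence match is dropped
```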

USOpen.org and the mobile apps both offer on-demand videos of match clips, recaps, and pre- and post-match interviews. The Watson Speech-to-Text API eliminates the need for a transcriber by automatically generating captions as soon as a video is uploaded to cut down on the delay before publishing it in the video section. The USTA still needs to edit the transcripts, but Watson is learning more player names and tennis terms as the tournament progresses.
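Speech-to-text services like Watson’s typically return word-level timestamps along with the transcript, and the captioning step then groups those words into timed cues. The sketch below shows that grouping with SRT-style timestamps; the input shape and grouping rule are my assumptions, not IBM’s code:

```python
# Sketch of the captioning step downstream of speech-to-text: group word
# timestamps into short caption cues. The (word, start, end) input shape
# mirrors what speech-to-text services commonly return.

def to_srt_time(seconds):
    """Format seconds as an SRT timestamp, e.g. 3.5 -> '00:00:03,500'."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def words_to_captions(words, max_words=5):
    """Group (word, start, end) tuples into caption cues of a few words each."""
    cues = []
    for i in range(0, len(words), max_words):
        chunk = words[i:i + max_words]
        text = " ".join(w for w, _, _ in chunk)
        cues.append((to_srt_time(chunk[0][1]), to_srt_time(chunk[-1][2]), text))
    return cues

words = [("what", 0.0, 0.2), ("a", 0.2, 0.3), ("rally", 0.3, 0.8),
         ("from", 0.9, 1.1), ("Nadal", 1.1, 1.6), ("tonight", 1.6, 2.1)]
cues = words_to_captions(words, max_words=5)
```

The human editing pass the USTA still performs would amount to correcting the transcript text inside these cues, which is why misheard player names are the main thing Watson has to learn.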

SlamTracker’s predictive analytics is all about situational analysis. The Watson Machine Learning technology generates three “Keys to the Match” for each player based on eight years of Grand Slam tennis data, combined with real-time data on stats like aces and first-serve win percentage to predict probabilities in a situation like a fifth-set tiebreaker. It even factors in player style models.

This year, the player and ball position data feeds into pace-of-play analysis and proximity measures showing how close each shot lands to the baseline.

Watson’s Natural Language API makes for a good, quick concierge on the ground in Flushing Meadows. The chatbot has nixed the beta tag it carried last year, and now processes thousands of messages and responses over the course of the tournament. You can use it for everything from finding food and pulling up quick match data to getting directions to the nearest stadium bathroom.
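At its simplest, the concierge’s job is intent routing: map a visitor’s free-text question onto one of a fixed set of things it knows how to answer. Watson’s natural language models are far more sophisticated than this, but a keyword-overlap toy shows the idea; the intents and keywords here are invented:

```python
# Toy intent router in the spirit of the Cognitive Concierge: map a visitor's
# question to a canned intent by keyword overlap. Intents and keywords are
# hypothetical; Watson's actual NLU is much more capable.

INTENTS = {
    "find_food":     {"food", "eat", "restaurant", "hungry"},
    "match_info":    {"score", "match", "playing", "serve"},
    "find_bathroom": {"bathroom", "restroom", "toilet"},
}

def route_intent(message):
    """Pick the intent whose keyword set overlaps the message most."""
    tokens = set(message.lower().split())
    best, best_overlap = None, 0
    for intent, keywords in INTENTS.items():
        overlap = len(tokens & keywords)
        if overlap > best_overlap:
            best, best_overlap = intent, overlap
    return best  # None if nothing matched

intent = route_intent("where is the nearest bathroom")  # -> "find_bathroom"
```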

This interactive demo area outside Arthur Ashe Stadium is more or less a test bed for all the ways IBM is experimenting with Watson, machine learning, the Internet of Things, and Big Data in real-world applications. If you’re spending a day at the Open, the demos are a fun way to kill some time between sessions.

Using IBM Bluemix, the Watson IoT platform, and Node-RED, an open-source tool for connecting hardware devices to online software and services, you can use brainwaves to move a connected BB-8 toy. Put on a headset and summon your mental powers to roll the little guy across the miniature court.

Using an adjusting stationary bike and an Oculus Rift headset, IBM simulates how Watson gave ultracyclist David Haase real-time race insight during a biking marathon. The technology uses IBM Watson Analytics, IBM Predictive Analytics, and Watson IoT services to monitor environmental factors like weather and incline. I gave it a try…the simulation didn’t go easy on the resistance.

Watson tries its hand at composing, creating a rhythm from US Open sounds entered on a drum pad — bouncing balls, crowd noise, and squeaking sneakers. Then Watson layers some music on top to create an original track. This was the most gimmicky demo of the bunch. My song sounded pretty terrible, but most of that was probably due to my terrible drumming skills.

This system built on Bluemix uses Watson to pull in player info, team stats, and league-wide analytics to draft a team. The interactive series of displays the demo showed off was full of fake team names, but real sports organizations like the Toronto Raptors are actually using the solution.

In a tool called Maker’s Mosaic, Watson analyzed my Twitter profile to identify personality traits like agreeableness, introversion, and openness, and correlate them with a professional tennis player. The personality section was pretty dead-on, but after tabbing over to the values and needs sections and seeing just how deeply Watson had probed my tweets to develop a complex and eerily accurate personality profile, I’ll admit I was a little creeped out.

Pepper is a physical robot used to demo Watson’s natural language processing, conversational abilities, computer vision analytics, and a host of other cognitive services as IBM trains Watson using deep learning techniques. The goal, the company said, is to build a whole middleware layer around embodied cognition. For now, Pepper can hold a pretty decent conversation, use body language to show emotions like surprise and celebration, and give out high-fives.

Read more: “IBM Brings Watson to Mobile Device Management”

Originally published at www.pcmag.com.


