
Watson tracked every serve, set, and save to show you the best of the U.S. Open

Last Thursday, 19-time Grand Slam champion Roger Federer was taken to the limit in a five-set thriller against Mikhail Youzhny at the 2017 U.S. Open.

As the action played out at the USTA Billie Jean King National Tennis Center, Digital Trends was invited into the heart of the Arthur Ashe Stadium to see how IBM used its cutting-edge technology to track every serve, set, and rally.

IBM has been a fixture at the U.S. Open since 1990, and the company’s ability to give fans better access to the biggest matches has evolved with every passing year. This year, artificial intelligence, computer vision, and a host of other technologies came together to curate the best play from each day of the tournament.

Brad Jones/Digital Trends

Yet the company’s ambitions don’t end at the U.S. Open. It intends to bring the lessons Watson learned while tracking every move of a tennis match to purposes both fun and practical.

Watson is Watching

The U.S. Open is a massive tournament. Between men’s and women’s singles and doubles competition, as well as mixed doubles, an incredible amount of tennis is played over its two-week duration.

Watson analyzes numerous data points from each match, selects the most engaging moments, and makes them available online

If you’re in attendance at the National Tennis Center, it can be difficult to determine which court is going to offer up the most exciting matches. Even if you’re at home, you might find yourself jumping between streams. There’s simply too much happening for any one person to take in.

IBM is using Watson to make sure tennis fans don’t miss a thing. Its Cognitive Highlights capability analyzes numerous data points from video of each match to select the most engaging moments, and make them available online and via the U.S. Open app shortly after they take place.
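The core idea behind this kind of curation can be illustrated with a minimal sketch. This is purely hypothetical code — the signal names, weights, and scoring formula below are illustrative assumptions, not IBM's actual model — but it shows how several per-moment data points might be combined into a single ranking used to pick highlights:

```python
# Hypothetical highlight-ranking sketch. The signals (crowd_noise,
# serve_speed, rally_length) and their weights are invented for
# illustration; a real system would use many more inputs.
def excitement_score(moment):
    """Combine a few normalized per-moment signals into one score."""
    weights = {"crowd_noise": 0.5, "serve_speed": 0.3, "rally_length": 0.2}
    return sum(weights[k] * moment.get(k, 0.0) for k in weights)

def top_highlights(moments, n=3):
    """Return the n highest-scoring moments, ready to publish."""
    return sorted(moments, key=excitement_score, reverse=True)[:n]
```

In this toy version, each candidate moment is just a dictionary of signal values between 0 and 1, and the most "exciting" ones bubble to the top of the sort.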

“Think of seven courts worth of video, from matches going on at the same time,” said John Kent, IBM’s program manager for global sponsorship marketing. “To try and create highlights for each of those matches, and push them out on mobile, would require an army of people.”

IBM does have a significant presence at the U.S. Open, but it’s not exactly an army — perhaps fifteen staff members are in the bunker at any given time. Watson does the heavy lifting.

The Bunker

When I arrived at the National Tennis Center last Thursday, I was whisked past various security checkpoints into the underbelly of the Arthur Ashe Stadium, dodging pro players as they moved between locker rooms and matches. It made sense for IBM’s base of operations to sit in the heart of the venue: it needed to connect to an incredible array of hardware scattered across the complex while causing minimal disruption. Those sensors kept track of weather conditions, watched player movements, and even gauged the reaction of the crowd.

The bunker was packed with hardware, from the displays that lined each wall with stats from the day’s competition, to stacks of server blades supplying raw processing power, to individual workstations where engineers made tweaks and ran diagnostics. IBM’s hardware setup brought all the different feeds of information together, and allowed Watson to dive into the data.

Brad Jones/Digital Trends

This kind of analysis is where Watson comes into its own. If an on-court sensor records a blistering serve, Watson can pinpoint that part of the broadcast footage. But its curation becomes smarter still when computer vision comes into play, letting it pore over the video feed itself to find the highlights.

IBM taught Watson to recognize players making a fist-pump motion, which tennis players do a lot, as they typically have a racquet in their other hand. Yet there was a problem — fist pumps celebrating a well-earned point were getting confused with clenched fists of frustration. To tell the two apart, Watson was instructed to cross-reference the sight of a fist with the level of crowd noise, confirming it marked a great play rather than a bungled serve.
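That cross-referencing step is a simple example of combining two signals. The sketch below is a hypothetical illustration — the function name, thresholds, and units are my assumptions, not IBM's implementation — of how a vision classifier's confidence might be gated on crowd reaction:

```python
# Hypothetical sketch: a fist detection only counts as a celebration
# if the crowd reacts loudly at the same moment. Thresholds are
# illustrative placeholders, not real calibrated values.
def is_celebration(fist_confidence, crowd_noise_db,
                   vision_threshold=0.8, noise_threshold=85.0):
    """Fuse a vision score (0–1) with a crowd-noise level (dB)."""
    return fist_confidence >= vision_threshold and crowd_noise_db >= noise_threshold
```

A clenched fist of frustration after a bungled serve would score high on the vision side but fail the crowd-noise check, and so would be filtered out.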

All Access

Cognitive Highlights is great for fans who want to soak up the best action over the course of the tournament, whether they’re fortunate enough to be there in person, or watching remotely from anywhere in the world. However, IBM’s efforts to process footage from the tournament also benefit pro players.

The same tech could index a podcast, so that listeners can decide which topics are worthy of their time.

Whenever players do battle on one of the National Tennis Center’s seven show courts — where televised matches take place — they’re given access to footage straight after the final set. They can log in to a special web portal, or request a USB key loaded with the content.

“There are specific players and agents that come looking for that after their match,” said Kent. “We presume if they’re coming for it, they’re going to use it.”

This isn’t just a recording of the match as it was broadcast. Numerous events and occurrences are indexed, making it easy for players and coaching staff to pinpoint exactly what they’re looking for without trawling through hours of video.

“They’ll be able to look at their unforced errors, look at their winners, look at their aces, look at their double faults,” explained Kent. “So whichever aspect of the game they’re looking to analyze, it’s a tool that helps them to more rapidly go through their match footage.”
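The indexing Kent describes can be pictured as a simple grouping of timestamped events by type. This is a hypothetical sketch — the event names and data shapes are assumptions for illustration, not IBM's actual format — of how a coach could jump straight to every ace or double fault in a match:

```python
# Hypothetical sketch of a match-footage index: group timestamped
# events by type so footage can be scrubbed to each one directly.
from collections import defaultdict

def index_events(events):
    """Map each event type to the list of timestamps where it occurred."""
    index = defaultdict(list)
    for event in events:
        index[event["type"]].append(event["timestamp"])
    return index
```

With an index like this, "show me my double faults" becomes a single lookup rather than a trawl through hours of video.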

Dark Data

Whether you’re a fan, a player, or a video editor working on behalf of the USTA, IBM is looking to make it easier to sift through content. The company has plenty of ideas for making continued improvements.

Kent told me about Watson’s capacity to shine a light on dark data, the useful information that’s locked away in video, audio, and other content, which computers can’t parse easily. He offered up a scenario that isn’t currently part of IBM’s tennis coverage, but is technically possible.

A player interview is recorded on camera, then subtitled using a speech-to-text program, which produces a transcript that’s exported as a text file. This is valuable in and of itself, as it makes that content more accessible to people with disabilities. However, Watson can use these materials to dig deeper into the content for indexing purposes.

“We can analyze that transcript and find out more about the context of what that player was talking about, and add more specific tags about that video,” said Kent. A video clip wouldn’t just be labeled as a Federer interview, it would be tagged with information about what the player was discussing. That way, if you were looking for competitors’ opinions on playing with the roof of the Arthur Ashe Stadium closed, for example, you could quickly find those exact portions of different interviews.
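At its simplest, this kind of transcript tagging can be sketched as keyword matching against a topic list. The code below is a hypothetical illustration — the topic names and keyword lists are invented, and a real system would use far richer language analysis than substring matching:

```python
# Hypothetical transcript-tagging sketch. Topics and keywords are
# invented for illustration; real content analysis goes well beyond
# simple keyword matching.
TOPIC_KEYWORDS = {
    "roof": ["roof", "indoors"],
    "injury": ["injury", "physio"],
}

def tag_transcript(transcript):
    """Return sorted topic tags whose keywords appear in the transcript."""
    text = transcript.lower()
    return sorted(
        topic for topic, keywords in TOPIC_KEYWORDS.items()
        if any(word in text for word in keywords)
    )
```

Applied to every interview transcript, even this crude version would let a search for "roof" surface the exact clips where a player discussed playing under the closed roof.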

IBM isn’t just exploring how Watson can be used to sift through content for the sake of curiosity. The endgame is Watson Media, a suite of tools and services that’s set to be offered on the IBM Marketplace later this year.

What’s On, Watson?

IBM works closely with the USTA to implement Watson at the U.S. Open, but Watson Media will give news organizations, movie studios — and whoever else wants it — access to IBM’s content analysis tech. It’s turnkey access to incredibly potent technology, ranging from the ability to parse through text transcripts, to the use of computer vision to analyze video footage.

“Stitching those things together, that’s something that developers can do on their own, using those tools and services that are enabled on the platform,” said Stephen Hammer, chief technical officer for IBM Sports.

Brad Jones/Digital Trends

This technology could change how people organize content. Hammer gave the example of a movie studio giving subscribers better access to individual portions of different films. Working your way through a break-up? You could ask to see only the most emotional scenes from all the romantic comedies and dramas on hand. Need to psyche yourself up ahead of the big game? You could search for ‘inspirational sports speeches’ and enjoy a playlist of the most powerful addresses from coaches, mentors, and former champs ever committed to celluloid.

He also put forward a possible application for reporters and journalists, who might have an enormous back catalogue of images and video to supplement their work. The tools that comprise Watson Media could underpin a system that gives reporters access to relevant content. Just ask for a picture of a boat, and Watson could offer every boat on file. Need a cargo boat, or a racing boat? Watson can work with that, too.

Such capabilities are not entirely unheard of, but now IBM is packaging them up and selling them, which means more people can use them. Those looking to use cutting-edge AI won’t need to develop the tools from the ground up; they’ll have access to tried-and-tested services, tools that have already been put through their paces in high-pressure scenarios — like broadcast coverage of the U.S. Open.

Watson Media will give news organizations, movie studios — and whoever else wants it — access to IBM’s content analysis tech.

“More data is being created daily,” observed Kent. “There’s structured data — we know how to deal with that, we’ve dealt with it for a long time — but then there’s unstructured data, all those videos, all those texts, all those blogs.” Unless it’s processed, searching through this kind of content can be like trying to find a needle in a haystack. Watson can digest it, organize it, and make it easy to find exactly what you’re looking for.

Think back to Watson’s ability to use computer vision to recognize a player’s fist pump, checking that its assumption is correct by listening to crowd noise. This is a system that can look and listen — and crucially, it can do so at scale.

The same tech could run an automated search for common lines of dialogue that were shared between movies released in the 1990s, or analyze which advertising banners saw the most screen time over a season of the Barclays Premier League, or index a podcast, so that listeners can decide which topics are worthy of their time.

Watson Media will help us organize, categorize, and analyze huge quantities of audio, video, and text with greater efficiency than ever before. If developers can wield this power properly, it’s sure to have a big impact wherever it’s implemented.

Brad Jones
Brad is an English-born writer currently splitting his time between Edinburgh and Pennsylvania. You can find him on Twitter…