IBM will track every shot taken by every player at the PGA Tour’s Masters Tournament and use its artificial intelligence platform Watson to produce three-minute highlights of every round.
While IBM launched automated highlights for a select few players last year, the tech company is expanding those capabilities (via an expanded partnership with the Masters' broadcast partner, CBS) to include all of the roughly 90 players at the 2019 Masters, which tees off on Thursday.
IBM analyzed roughly 4,000 shots at the 2018 Masters. It estimates that its system will end up tracking some 5,000 holes and around 20,000 shots this year.
The AI will help power the Masters’ “Round in Three Minutes” feature, which is a highlights package that serves fans across the tournament’s digital platforms a bite-sized summary of a single player’s round. Previously, the Masters’ editorial team produced these highlights manually, scouring and classifying video, sorting and packaging the best golf shots. Now, IBM Watson will scan those videos for them and serve up the most memorable moments minutes after each round has been completed. At the request of the Masters, Watson will provide extra footage leading up to each highlighted moment as well, so the team can better weave together stories for fans.
IBM Watson will analyze, rate, and curate the three-minute highlight reels by measuring the excitement level of each moment through facial expressions, athlete gestures, and the roar of the crowd. It will additionally consider factors such as the hole on which each stroke was played.
“We know the player, the shot, the hole they’re on, and we can automatically curate a highlight reel,” said John Kent, the program manager for IBM Sports and Entertainment Partnerships. “Think of it like a playlist of shots to play in sequence. We want to show the shots leading up to it to tell the story.”
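IBM has not published the details of its scoring model, but the approach Kent describes can be illustrated with a simple sketch: blend the excitement signals into one score per shot, keep the top-rated shots, and then replay them in hole order so the reel unfolds like a round. The weights, field names, and three-shot cutoff below are illustrative assumptions, not IBM's actual parameters.

```python
from dataclasses import dataclass

@dataclass
class Shot:
    player: str
    hole: int
    crowd_roar: float     # 0-1, audio energy around the shot
    gesture_score: float  # 0-1, from athlete-gesture analysis
    face_score: float     # 0-1, from facial-expression analysis

def excitement(shot: Shot, weights=(0.5, 0.3, 0.2)) -> float:
    """Weighted blend of the three excitement signals (weights are illustrative)."""
    w_roar, w_gesture, w_face = weights
    return (w_roar * shot.crowd_roar
            + w_gesture * shot.gesture_score
            + w_face * shot.face_score)

def curate_reel(shots, top_n=3):
    """Keep the top-N most exciting shots, then sort them back into hole
    order so the reel plays as a story of the round, as Kent describes."""
    ranked = sorted(shots, key=excitement, reverse=True)[:top_n]
    return sorted(ranked, key=lambda s: s.hole)
```

In a real pipeline each selected shot would also pull in the lead-up footage the Masters' editorial team requested; here the curation step alone is shown.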
IBM will also deploy a new capability within its artificial intelligence, called OpenScale, to remove possible bias while scanning these videos. That means it will be better able to distinguish between a mediocre shot from a star player like Tiger Woods and an impressive shot from a lower-tier player. Woods's galleries might simply register a higher decibel level because his celebrity draws a larger crowd.
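IBM has not disclosed how OpenScale corrects for this, but one simple way to sketch the idea is to express each roar relative to the typical noise that player's gallery produces, so a star's large crowd does not automatically outrank a quieter player's great shot. The function and baseline values below are hypothetical.

```python
def adjusted_roar(roar: float, player_baseline: float) -> float:
    """Scale a shot's crowd roar against that player's typical gallery
    loudness, so crowd size alone doesn't drive the excitement score."""
    if player_baseline <= 0:
        return roar  # no baseline data: fall back to the raw signal
    return roar / player_baseline
```

Under this scheme a roar of 0.6 from a player whose gallery averages 0.3 scores higher than a roar of 0.9 from a star whose gallery averages 0.8, matching the bias correction the article describes.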
IBM will additionally power a new multichannel livestream across the Masters’ digital properties that will enable fans watching at home on their desktops to view up to four official Masters channels simultaneously. Fans tuning in via mobile devices will be able to watch more than one channel as well.
“The whole reason we can do this is because the bandwidth is there: 5G allows you to consume multiple screens at once,” Kent said. “In golf, there’s a fair amount of downtime as players walk from one hole to the next. Having multiple screens is useful because some holes may have action and others not.”
SportTechie Takeaway
IBM's work at the Masters is helping to further train its already complex Watson artificial intelligence platform, and it is creating opportunities for Big Blue to repackage and sell those capabilities to clients in industries outside of sports.
“Everyone’s jobs are going to change because of artificial intelligence,” Kent said. “The Masters’ editorial team, their job has changed. They’re still curating content, but they’re getting assistance from an AI to help them produce at scale.”
Using the same visual recognition and sound scanning technologies that are helping Watson analyze golf highlights, IBM is now working with clients in industrial markets to analyze defects or potential system errors. One such client, KONE, which builds escalators, elevators, and automated entryways, has trained Watson AI on scraping and scratching noises that might indicate a structure in need of servicing. An assembly line company also uses Watson’s recognition technology to spot defects on the factory floor.