
Illustration by Israel G. Vargas
For most of The Weather Channel’s (TWC) 36-year existence, technology limited weather presentation to narration, archival footage, and basic graphics. “We needed great tools to convey great stories on air—that wasn’t just pointing at maps,” explains Michael Potts, TWC’s vice president of design. “How do we start doing richer, more scientifically accurate, and engaging concepts beyond just weather forecasting?”
The Weather Channel landed on augmented reality (AR), the seamless blending of computer-generated imagery (CGI) with live video, as a way to better connect with its viewers. “We kept our ear to the ground, and augmented reality was on our radar around 2013–2014,” Potts says. “Traditional CGI work for broadcast is done in postproduction,” he continues. “But with AR technology, we can do it live, in real time, and have our talent react to it.”
The Weather Channel debuted this new tech in June, when meteorologist Jim Cantore “rode out” a digital tornado. During the broadcast, he “dodged” falling debris, including a downed power line and even a mangled sedan. When the camera moved, computers interpreted the change and rendered the digital elements from the new angle instantaneously. In July, meteorologist Mike Bettes used the tech to explain lightning safety: as a digital strike exploded a telephone pole near him, the studio lights flashed in sync with the effect.
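The core of that trick is simple in principle: a tracking system reports where the studio camera is on every frame, and the renderer re-projects the virtual objects from that new viewpoint before compositing them over the live video, so the CGI stays anchored to the same spot on the set. The snippet below is a minimal, hypothetical sketch of that re-projection step; the function names, numbers, and basic pinhole-camera model are illustrative assumptions, not The Weather Channel’s actual pipeline.

```python
# Hypothetical sketch: keep a virtual object "anchored" in the studio by
# re-projecting it each frame with the camera pose reported by tracking.
import numpy as np

def project_points(points_world, camera_pose, focal_px, image_center):
    """Project 3-D points (world frame) to pixel coordinates, given a
    4x4 world-to-camera transform and a simple pinhole camera model."""
    # Homogeneous coordinates, then move the points into the camera frame.
    homo = np.hstack([points_world, np.ones((len(points_world), 1))])
    cam = (camera_pose @ homo.T).T[:, :3]
    # Perspective divide and scale by focal length (in pixels).
    px = focal_px * cam[:, 0] / cam[:, 2] + image_center[0]
    py = focal_px * cam[:, 1] / cam[:, 2] + image_center[1]
    return np.stack([px, py], axis=1)

# Corners of a virtual "debris" square anchored 3 m in front of the set.
debris = np.array([[x, y, 3.0] for x in (-0.5, 0.5) for y in (-0.5, 0.5)])

# Each broadcast frame, the tracker reports a fresh camera pose; re-projecting
# with the new pose is what makes the object appear fixed to the studio floor.
pose_frame_1 = np.eye(4)              # camera at its starting position
pose_frame_2 = np.eye(4)
pose_frame_2[0, 3] = -0.2             # camera dollied ~20 cm to the right

for pose in (pose_frame_1, pose_frame_2):
    pixels = project_points(debris, pose, focal_px=1000.0, image_center=(960, 540))
    print(np.round(pixels, 1))        # overlay at these pixels on the live feed
```

In a real broadcast system this projection runs for every vertex of every virtual object, dozens of times per second, which is why the debris appears to shift naturally as the camera pans around the talent.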
Public reaction has been overwhelmingly positive. “We’re transforming spaces and building new studios,” says Potts. “We broadcast a lot of live TV. I could see a day where we’re doing all of our coverage using immersive AR environments.”
This article originally appeared in our November 2018 issue.