The jaw-dropping visual effects in “The Matrix” transformed the quest to prove what is possible on screen. The franchise returns this week to find out if there is anything more that can be done.

For the 1999 original, filmmakers invented a way to make Keanu Reeves’s hero, Neo, defy physics while dodging bullets on screen. The effect blew enough minds to earn a nickname, “bullet time.” It changed the look of action movies and influenced mediums from animation to videogames.

For the new sequel, “The Matrix Resurrections,” filmmakers deployed much-higher-caliber technologies, including three-dimensional imagery made using artificial intelligence. But after 22 years of digital evolution, high-end movie effects are approaching a plateau near perfection.

“We went from pulling off what seemed to be impossible, to a sort of inability to create surprise” in the movie industry, says John Gaeta, who helped craft the bullet-time effect. He was a visual-effects designer on the first three “Matrix” films; now he is making things for the metaverse.

A ring of cameras helped create the ‘bullet time’ effect that let Keanu Reeves and Hugo Weaving defy physics in ‘The Matrix.’ (Photo: Warner Bros/Everett Collection)

This year the movies presented us with a car slingshotting from cliff to cliff (“F9”); Ryan Reynolds running amok inside a videogame (“Free Guy”); and giant monsters crushing the Hong Kong skyline (“Godzilla vs. Kong”). Any viewers who paused to ask themselves—“How did they do that?”—likely came up with the same answer: “Computers.”

Human characters that are totally computer-generated and believable are still on the frontier, “but I’m not sure if there is anything else that can’t be done given enough money or time,” says Ian Failes, editor of befores & afters, a magazine covering visual-effects artistry.

Despite any numbness among viewers to digital spectacles, Hollywood’s demand for them has only increased. Visual-effects houses have raced to compete in a global production boom and fuel the streaming wars with flashy content.

Some directors are reacting to the VFX arms race by practicing more restraint. Denis Villeneuve’s “Dune” depicts settings such as the desert planet Arrakis with a naturalistic look. Instead of zooming viewers into a fleet of attacking space ships, the director presented the nighttime ambush in silhouette at a distance, conveying a somber sense of scale.

“He was just showing the reality of the world,” says Namit Malhotra, chief executive of DNEG, a visual-effects company that worked on “Dune” and “The Matrix Resurrections.” He adds: “When you’re spending that kind of money, it’s hard for filmmakers to control the desire for more, a little more oomph.”

In the new “Matrix” release, director and co-writer Lana Wachowski plays with expectations that the sequel must level up. Spoiler alert: In the movie, Mr. Reeves’s character is reintroduced as a videogame designer whose big hit was called, yes, “The Matrix.” The events in the film franchise supposedly happened within the world of his videogame—including that signature action sequence in which Neo bends time and space. As a group of videogame developers brainstorm ideas for a sequel to “The Matrix,” one declares, “We need a new bullet time!”

The original bullet time was “a borderline hack,” as Mr. Gaeta recalls it, that started with 120 still cameras firing off film photographs of Mr. Reeves dangling on wires. Those images were stitched together with software to simulate a swooping camera move in slow motion.
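
Conceptually, the trick amounts to ordering a ring of simultaneous stills by camera position and interpolating between neighboring views so the “camera” appears to glide around a frozen subject. The sketch below is a minimal illustration of that idea, not the production pipeline; the camera count, frame sizes and simple cross-fade blending are assumptions made for demonstration, standing in for the more sophisticated view interpolation used on the film.

```python
# Minimal sketch of the idea behind "bullet time": a ring of still cameras all
# fire at (nearly) the same instant, and software orders those stills by camera
# position and interpolates between neighbors to fake a smooth, slow-motion
# camera sweep. The cross-fade here is only illustrative; the real effect used
# far more sophisticated view interpolation.
import numpy as np

def bullet_time_sweep(frames: list[np.ndarray], steps_between: int = 10) -> list[np.ndarray]:
    """Turn one still per camera into a slow 'swooping' sequence.

    frames: images from the camera ring, ordered by camera position.
    steps_between: interpolated frames inserted between each camera pair.
    """
    sweep = []
    for a, b in zip(frames, frames[1:]):
        for t in np.linspace(0.0, 1.0, steps_between, endpoint=False):
            # Simple blend stands in for optical-flow view interpolation.
            sweep.append(((1.0 - t) * a + t * b).astype(a.dtype))
    sweep.append(frames[-1])
    return sweep

# Example: 120 cameras, tiny grayscale frames, roughly 10x more output frames.
ring = [np.random.rand(90, 160) for _ in range(120)]
slow_motion = bullet_time_sweep(ring, steps_between=10)
print(len(slow_motion))  # 1191 frames from 120 stills
```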

The successor to that technique is known as volumetric capture. A camera array captures people or spaces from every angle, and then A.I. meshes this video into 3-D footage that can be viewed and manipulated from any perspective.
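
At a high level, that means fusing each synchronized set of camera frames into a time-stamped 3-D asset (a point cloud or textured mesh) that a virtual camera can later orbit from any angle. The toy pipeline below only illustrates that data flow; the class names and the placeholder reconstruction step are invented for this sketch, while production systems rely on calibrated rigs and learned reconstruction.

```python
# Toy sketch of a volumetric-capture pipeline: synchronized video from a camera
# array is fused, per time step, into a 3-D asset that a virtual camera can
# view from any angle afterward. The reconstruction step is a placeholder; all
# names below are illustrative, not a real API.
from dataclasses import dataclass

@dataclass
class CameraFrame:
    camera_id: int
    pose: tuple[float, float, float]   # camera position in the capture volume
    pixels: bytes                      # encoded image data (stand-in)

@dataclass
class VolumetricFrame:
    time: float
    points: list[tuple[float, float, float]]  # fused 3-D point cloud

def reconstruct(frames: list[CameraFrame]) -> list[tuple[float, float, float]]:
    # Placeholder: pretend each camera contributes one triangulated point.
    return [f.pose for f in frames]

def fuse_capture(frames_by_time: dict[float, list[CameraFrame]]) -> list[VolumetricFrame]:
    """Fuse each synchronized set of camera frames into one 3-D frame."""
    return [VolumetricFrame(t, reconstruct(fs)) for t, fs in sorted(frames_by_time.items())]

# Example: two time steps captured by a three-camera array.
capture = {
    0.00: [CameraFrame(i, (float(i), 0.0, 0.0), b"") for i in range(3)],
    0.04: [CameraFrame(i, (float(i), 0.0, 0.1), b"") for i in range(3)],
}
volume = fuse_capture(capture)
print(len(volume), len(volume[0].points))  # 2 volumetric frames, 3 points each
```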

Cinematographer Daniele Massaccesi on the set of ‘The Matrix Resurrections’ with director Lana Wachowski, who aimed for a product that felt ‘more real.’ (Photo: Warner Bros. Pictures)

The “Matrix Resurrections” team, mindful of the movie’s bullet-time joke, incorporated volumetric capture sparingly, says visual-effects supervisor Dan Glass. “It’s really cutting-edge technology, but we deliberately didn’t want to use it in a way that was as blatant as the original bullet time.”

At the same time, Ms. Wachowski also relied on real-life filming locations in San Francisco, and physical effects, such as a wired Mr. Reeves leaping off a 43-story building with co-star Carrie-Anne Moss, to help bring her story’s meta premise to life.

“The idea is that this is a newer upgrade to the Matrix simulation, so Lana wanted it to feel more real as we filmed it,” says Mr. Glass, who has worked with the director since the first “Matrix” sequels.

Yahya Abdul-Mateen II as Morpheus in ‘The Matrix Resurrections,’ the latest installment in ‘The Matrix’ franchise. (Photo: Warner Bros. Pictures)

In 1999, “The Matrix” reverberated in a culture increasingly fixated on computers, the internet and millennial paranoia. Made on the cusp between analog and digital technologies, the movie’s visual effects won an Oscar, upstaging the computer-generated creatures of the “Star Wars” prequel “The Phantom Menace.”

Today, alumni from the original “The Matrix” effects team are pushing for breakthroughs in adjacent mediums. Kim Libreri, who worked on the original trilogy’s visual effects, is now chief technology officer of “Fortnite” maker Epic Games Inc. He recently re-teamed with Ms. Wachowski on a “Matrix”-themed project demonstrating Epic’s Unreal Engine, which creators use to fabricate 3-D worlds.

Filmmakers are adopting these tools to construct environments in which they can move virtual cameras around in real time, as if in a videogame. In the new “Matrix” movie, Unreal Engine generated settings for a sparring match between Neo and Morpheus (Yahya Abdul-Mateen II) in a martial-arts dojo, an update to a scene from the original film.
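
“Real time” here means the environment is re-rendered every frame as the virtual camera moves, so a shot can be reframed on the fly rather than waiting hours for a render. The loop below is a generic, engine-agnostic sketch of that idea; it is not Unreal Engine’s actual API, and the scene name and render stub are placeholders.

```python
# Engine-agnostic sketch of moving a virtual camera in real time: each frame,
# the camera's pose is updated and the 3-D scene is re-rendered immediately.
# This is not Unreal Engine's API; the render call is a stub for illustration.
import math

def orbit_camera(frame: int, fps: int = 24, radius: float = 5.0) -> dict:
    """Return a camera pose orbiting the scene, one step per rendered frame."""
    angle = 2.0 * math.pi * frame / (fps * 10)   # full orbit every 10 seconds
    return {
        "position": (radius * math.cos(angle), 1.7, radius * math.sin(angle)),
        "look_at": (0.0, 1.0, 0.0),              # keep the actors framed
    }

def render(scene: str, camera: dict) -> None:
    # Stand-in for the engine's renderer; a real engine draws the frame here.
    print(f"rendered {scene} from {camera['position']}")

# A ten-frame "take": the operator (or a script) drives the camera live.
for frame in range(10):
    render("dojo_set", orbit_camera(frame))
```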

“Spontaneity has been sucked out of the industry because of all the complexity of setting up visual effects,” Mr. Libreri says. “What we’re hoping to do with the real-time technology is bring happy accidents back into the mix.”

He makes a cameo in “The Matrix Resurrections” with Mr. Gaeta, who has also been working with Ms. Wachowski to develop potential offshoots for the “Matrix” franchise. For the next big thing, the former designer of movie effects looks to the 3-D worlds of the virtual ecosystem known as the metaverse.

It is a shift foreshadowed in some ways by a movie about a reality constructed from computer code, Mr. Gaeta says: “In 20 years, we’ve gone from concepts and illusions to borderline actualization of these things.”

Write to John Jurgensen at [email protected]

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.

