Avatar: The Way of Water is not the typical stereo 3D film of yesteryear. Director James Cameron and his tech wizards at Lightstorm Entertainment, along with the industry’s top VFX studio, Weta FX, raised the bar for stereo and visual effects in this Oscar-winning film. It has been 13 years since the original record-setting Avatar, and during that time, technology has continued to evolve. The filmmakers took advantage of those developments, and created new ones of their own, to take audiences back to the unique world of Pandora, while adding wondrous new characters and environments to that familiar universe. The film is a celebration of moviemaking and deserves to be honored as we celebrate National 3-D Day.
Despite some critics proclaiming the slow demise of stereo 3D, the genre is, in fact, very much alive and well. If you have any doubts, consider Avatar: The Way of Water. In the 13 weeks since its release on December 16, 2022, this highly anticipated film has pulled in $2.295 billion at the box office (and counting). Currently, it sits at No. 3 on the all-time box-office list, chasing 2019’s Avengers: Endgame and the king of the list, 2009’s original Avatar. All shown in stereo 3D.
Today, March 21, 2023, is National 3-D Day, in case you didn’t know. Yes, it’s a real thing, founded by 3-D Space in 2020 to recognize the history and resurgence of 3D technology. The registrar at National Day Calendar has established the third day of the third full week of the third month, naturally, as National 3-D Day. Here at Jon Peddie Research, we constantly discuss the cutting-edge announcements and accomplishments achieved using 3D technologies. Last year, we celebrated 3-D Day with a story that took our audience through the history of stereo 3D, from its earliest days, to its use in games, to stereo displays and 3D glasses. This year, what better way to mark the occasion than by featuring an application that takes stereo 3D beyond anything we have seen before, with a look at the Oscar-winning Avatar: The Way of Water.
Here, Sam Cole, associate visual effects supervisor at Weta FX, discusses stereo 3D and takes us behind the scenes of how it was used to immerse viewers into the unique world of Avatar: The Way of Water.
From the outset, Weta FX partnered with Lightstorm Entertainment on the film, collaborating on new technology and techniques to fully realize director James Cameron’s amazing vision. Weta is no stranger to raising the bar in visual effects, with work on such films as The Lord of the Rings trilogy, King Kong, The Jungle Book, and Avatar, work that has earned the studio several Academy Awards and BAFTAs.
In addition to creating the flora and fauna of Pandora, Weta forged new developments for water in the sequel—no easy feat by any means. Avatar: The Way of Water picks up a decade after the first film, as Jake Sully is living peacefully with his new family on Pandora. That is, until humans return in a second attempt to colonize it and, in the process, destroy it. Jake and Neytiri relocate to live among the Metkayina, who live along the Pandoran shores, which teem with aquatic life. But they cannot escape the fight on their doorstep.
The sequel builds on the forest and jungle environments from the original Avatar, as Weta increased the complexity, detail, and realism while keeping it recognizable as Pandora. New are the water environment and the Metkayina village, including a 192 sq. km reef. In addition, there are new water-based creatures, alongside the ikran (mountain banshees) from the original Avatar.
The final film contains 3,289 shots—3,240 of which are VFX shots by Weta—making it the largest visual effects film Weta has ever worked on. Of those, 2,225 were water shots.
Avatar: The Way of Water is a sci-fi film with a message of environmental conservation, and, in that regard, features emotionally engaging digital characters who interact with real-world counterparts mostly within CG environments. (In all, 30 principal speaking CG characters were created, with over 3,000 facial performances that were tracked, solved, and animated.) As a result of all the CG and live-action interaction, Weta had to expand on the virtual production workflows used in the original Avatar to more cohesively integrate performance capture, on-set lighting, and broader 3D decision-making.
The evolution of stereo 3D
Avatar: The Way of Water once again drew people back to the theater, providing them with a sophisticated stereo 3D experience. Stereo 3D has evolved greatly over the past decades, especially since we last visited Pandora, when a new pinnacle for stereo 3D was established. Since then, it is apparent that Cameron, Lightstorm, and Weta have not rested on their laurels. Presenting something innovative and new was obviously top of mind.
As Cole points out, the stereo in the Avatar sequel is more immersive than anything previous, and that’s because filmmakers including Weta have much more experience with the genre and are able to make more enlightened and informed choices that draw the audience in, provide a better sense of depth, “and essentially make the cinema disappear and place people in that world.” There are literally hundreds of filmmaking tricks to achieving that, he says—not necessarily VFX tricks, but rather those based on human perception. The rules that are built around stereo 3D are not hard and fast, and over time, people have built up a good eye for a polished aesthetic, since a lot has to do with personal taste, he explains.
Also, brighter laser projection, along with the increased quality of 3D glasses, has improved the stereo picture to the point where every seat in a theater provides good stereo viewing. In fact, increasing the brightness level has a tremendous impact on the viewing experience, since stereo glasses essentially cut the brightness in half (one eye gets one picture, and the other eye gets a different picture). “Films mastered for high dynamic range and projected using the latest technologies look amazing in stereo 3D—bright, clear, and with amazing color.” Avatar: The Way of Water is a perfect example.
In terms of the visual effects, Cole notes that the software, pipelines, and processes are more advanced, more mature, and faster today than they were even just a few years ago. This is important, explains Cole, because the more decisions an artist can make per minute, the better the image becomes, generally. “A lot of those processes have come down to real time or very close to real time in addressing stereo notes or trying stereo options. It’s no longer, put an image on the render wall and wait to see it and hope it works. We’re now able to work at the speed of decision when prototyping stereo,” says Cole.
While working on a stereo film, Weta uses on-site stereo projection rooms, and some locales even have full-size theaters where dailies are viewed. Cole and others also have desktop stereo displays from various manufacturers, ranging from medical-grade displays to traditional stereo monitors. So, artists can see their work projected large as well as small. Cole recommends doing both, and as often as possible, because seeing something at 60 feet is much different than seeing it at a few inches. Artists at Weta and elsewhere sometimes use another debugging method, which involves “flipping” between the left and right images quickly, mitigating any accommodation the human visual system is employing.
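As a rough illustration of that flip test, consider the sketch below (Python/NumPy, purely hypothetical and not Weta's actual tooling). It builds an alternating left/right frame sequence; played back quickly on a single display, eye-to-eye misalignment and geometry errors tend to pop out visually. A per-pixel difference map is included as a second crude check.

```python
import numpy as np

def flip_test_sequence(left, right, cycles=10):
    """Alternate the left- and right-eye images frame by frame.

    Rapidly swapping the two eyes on a single display defeats the
    visual system's accommodation, so vertical misalignment and
    geometry errors between the eyes become easier to spot.
    """
    return [left if i % 2 == 0 else right for i in range(2 * cycles)]

def eye_difference(left, right):
    """Per-pixel absolute difference between the eyes: a crude map of
    where the two images disagree (disparity plus any errors)."""
    return np.abs(left.astype(float) - right.astype(float))
```

In practice the flip sequence would be handed to a playback tool; the difference image is only a quick sanity check, since legitimate horizontal disparity also shows up in it.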
The Way of Water, in stereo 3D
There are two types of stereo 3D films: those in which the stereo is produced natively, as the film is created, and those that add the stereo afterward through conversion by an outside company. With the latter method, the bulk of the material leaves a VFX facility as a series of flat 2D layers.
The Avatar sequel used native stereo, so the stereo workflow and processes were meticulously planned from day one, and the shots were designed to work in stereo. Therefore, there were very few curveballs, says Cole.
In fact, a live feed of stereo from the set was constantly monitored by Geoff Burdick, senior VP of production services and technology at Lightstorm, who maintained steady contact with Cameron and Maria Battle-Campbell, first assistant director, who were on set watching a separate feed at projection size. They were seeing it directly and making stereo adjustments, calling the shots over the radio as the film was being shot, so there was no guesswork involved. “It makes a big, big difference,” says Cole. He adds that using finely calibrated cameras and robust rigs is also important to getting a good result.
With stereo, things that artists could get away with in mono are no longer acceptable; cheats typically done in traditional flat compositing or flat VFX will not work. Tolerances for all the camera tracking and matchmoving had to be at sub-pixel level. Eyelines had to be exact. A new performance-driven eyeline system allowed live-action actors to naturally interact with 9-foot-tall CG Na’vi characters on stage using a computer-controlled cable system with a screen and speakers. The performance of the CG characters was displayed at the appropriate location and height on set, depending on the action in the shot, allowing live-action performers to make spontaneous and accurate acting choices in the moment.
With native stereo, a consistent volume must be maintained between the photography and the CG in order for the stereo world to work correctly. This means there must be enough depth for a character to walk between two objects, for instance. To accomplish this, Weta developed a real-time depth compositing system that provides a pixel-perfect, real-time composite in-camera, without a greenscreen, accurately positioning the live-action and CG elements together and producing a very close approximation of the final shot while still on set. For this, tiny computer vision cameras were mounted on the picture camera, and the footage from them was fed into a very small but very fast neural network that derived the depth of everything in the scene in real time. That was then fed back into the stage software, where the shot was composited in real time.
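A toy version of that per-pixel depth compositing can be sketched in a few lines of NumPy (an illustrative simplification, not Lightstorm's actual stage software): given the plate and the CG element each with a depth map, every output pixel takes whichever element is nearer to the camera.

```python
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Per-pixel depth composite: for each pixel, keep whichever
    element (live-action plate or CG) has the smaller depth value,
    i.e., is nearer to the camera."""
    # Boolean mask: True where the CG element is in front of the plate.
    cg_in_front = cg_depth < live_depth
    # Broadcast the mask across the RGB channels.
    return np.where(cg_in_front[..., None], cg_rgb, live_rgb)

# Toy 2x2 example: the CG layer occludes the plate only at one pixel.
live_rgb = np.full((2, 2, 3), 0.5)               # grey plate
cg_rgb = np.zeros((2, 2, 3)); cg_rgb[..., 0] = 1.0  # red CG layer
live_depth = np.full((2, 2), 5.0)
cg_depth = np.array([[2.0, 9.0], [9.0, 9.0]])
out = depth_composite(live_rgb, live_depth, cg_rgb, cg_depth)
```

The real system, as described, derives the live-action depth in real time from computer vision cameras and a small neural network; here the depth maps are simply given.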
Complicating matters, the water generated a lot of spray, mist, and so forth atop the surface, which also had to be treated as a volume. “It’s a very tricky shot integration to take a plate of mist and water being thrown in the air, and then insert a CG character right into the middle of it,” says Cole.
Additionally, a first-of-its-kind underwater performance system was developed at Lightstorm that used discrete volumes to capture the precise movements of actors underwater and when they transitioned above the surface, while having a greater capability to capture body and facial motion from up to 23 actors at a time.
Of course, CG water is never easy under any condition, and with stereo, it is doubly difficult. With water being so important in this film, Weta rebuilt its entire simulation approach to enable a new level of realism and interaction above and below the water’s surface.
To help provide a sense of scale as the characters are underwater, artists generated marine snow as a very subtle barrier between the viewer and the characters. Also, the debris had to naturally part in the water as the camera housing pushed through the volume. “That interplay between the plankton and the simulation really gives the impression of being in a big volume of water,” Cole notes.
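The parting effect Cole describes can be caricatured as a simple radial push of particles away from the camera housing. The sketch below (hypothetical Python/NumPy, not Weta's simulation code) moves each "marine snow" particle outward with a falloff that reaches zero beyond the housing's influence radius.

```python
import numpy as np

def part_marine_snow(positions, camera_pos, radius=1.0, push=0.5):
    """Push 'marine snow' particles radially away from the camera
    housing so debris appears to part as the camera moves through
    the water volume.

    positions: (N, 3) particle positions; camera_pos: (3,) housing center.
    """
    offsets = positions - camera_pos
    dist = np.linalg.norm(offsets, axis=1, keepdims=True)
    # Push strength falls off linearly to zero at `radius`.
    strength = np.clip(1.0 - dist / radius, 0.0, 1.0) * push
    # Unit direction away from the camera (guard against zero distance).
    dirs = offsets / np.maximum(dist, 1e-6)
    return positions + strength * dirs
```

A production simulation would of course couple this to the fluid solver and the housing's actual geometry; the point here is only the distance-based falloff that keeps distant plankton undisturbed.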
Furthermore, the water surface is partially reflective, and in stereo, each eye can get a slightly different image, and that can reduce immersion. To rectify this, the Weta team wrote tools that would enable them to reduce or tune the stereo specifically in just the brightest parts of the reflection, making it more immersive.
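One way to imagine such a tool (a hedged sketch, not Weta's actual implementation): blend each eye toward the two-eye average only where the image is very bright, so specular highlights lose their eye-to-eye mismatch while the rest of the frame keeps full stereo. The `threshold` parameter here is an invented knob for illustration.

```python
import numpy as np

def soften_stereo_highlights(left, right, threshold=0.8):
    """Reduce eye rivalry in bright specular reflections by blending
    each eye toward the mono (two-eye average) image, weighted by how
    far the pixel's luminance exceeds `threshold`."""
    mono = 0.5 * (left + right)
    # Per-pixel luminance of the mono image (Rec. 709 weights).
    luma = mono @ np.array([0.2126, 0.7152, 0.0722])
    # Blend weight: 0 below the threshold, ramping to 1 at full white.
    w = np.clip((luma - threshold) / (1.0 - threshold), 0.0, 1.0)
    w = w[..., None]
    return (1 - w) * left + w * mono, (1 - w) * right + w * mono
```

Dark and midtone regions pass through untouched; only where a highlight differs between the eyes does the stereo gently collapse toward mono.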
And that was not the only specialized tool required for the VFX work on this stereo film. There were many. At the start of the project, Weta conducted an R&D effort to define everything the artists would need. Still, the digital water required constant small changes throughout. “We couldn’t imagine on day one all the tools we would be using at the end,” Cole adds.
Why stereo 3D?
A film like Avatar: The Way of Water requires a tremendous amount of technical innovation—in terms of the overall VFX and to accommodate the cutting-edge stereo 3D. Some of the work has been described already, but there is so much more. The Weta team came up with a new facial system called APFS that utilizes a neural network to help establish a muscle-based activation model, measuring the expansion and contraction of the performer’s underlying facial muscle system.
According to Weta, a total of 18.5PB of data was stored for this film—18.5 times the amount used on the original film. In terms of the work required, this film had 10 VFX supervisors and nine animation supervisors; typically, a Weta project would have one or two of each.
So, with stereo 3D requiring so much additional work and development, why bother? “Stereo is the closest way that you have to taking somebody’s eyes and putting them on Pandora. It’s the most immersive and the truest to life—it’s how we see,” says Cole. “If you want to immerse somebody in something, you wouldn’t leave off half of one of the senses. Our stereo vision is so ingrained in how we see things, you really want the walls of the theater to disappear and to be drawn into the world. That’s what films do, try to make you believe that you are there, in the middle of it all. Incorporating your stereo sense just furthers your immersion into the film.”
Why, then, aren’t there more stereo films being made? Cole offers a reasonable explanation: the process is simply difficult. Stereo films are a lot more time-consuming to make, and they require extra equipment and extra teams to manage that equipment. Yet, Cole believes the appetite for stereo exists—as long as the content is compelling. “You got to make mind-blowing content,” he says, adding that the stereo has to be sympathetic to the story that it’s serving.
And that is certainly the case with Avatar: The Way of Water.
Learn more about the trends shaping and driving the digital content creation market with Jon Peddie Research’s comprehensive 2023 DCC report.