Jake Sully (Sam Worthington) in 20th Century Studios' AVATAR: THE WAY OF WATER. Photo courtesy of 20th Century Studios. © 2022 20th Century Studios. All Rights Reserved.

Avatar: The Way of Water Production Tools

In this story, Virtual Producer looks at how the production and post-production of Avatar: The Way of Water were accomplished by leveraging a variety of Blackmagic equipment.

Image courtesy of 20th Century Studios.

As the newest and much-anticipated sequel to Lightstorm Entertainment’s Avatar series began pre-production, the team understood that while the story and visuals would need to be taken to the next level, so would the technology that supported it. Avatar: The Way of Water would push the skills and capabilities of the most advanced production pipeline, so Geoff Burdick, Lightstorm’s senior vice president of production services and technology, began looking for ways to handle the new demands.

Set more than a decade after the events of the first film, Avatar: The Way of Water tells the story of the Sully family (Jake, Neytiri, and their kids), the trouble that follows them, the lengths they go to keep each other safe, the battles they fight to stay alive, and the tragedies they endure. Produced by Lightstorm Entertainment’s James Cameron and Jon Landau, the film was directed by Cameron and distributed by 20th Century Studios.

Avatar: The Way of Water Official Trailer

Production Playback and Monitoring

Managing a massive pipeline on an Avatar production involves more than just data processing; it also means providing the tools to evaluate content as it’s being shot. “We evaluate live camera feeds in a manner as close to the theatrical experience as possible, so we can make real-time decisions on set,” says Burdick. “This saves time during shooting, benefits Weta Digital, our visual effects vendor, and helps streamline our post-production and mastering process.”

Production intended to shoot 4K HDR at a 47.952 fps frame rate, which would support the stereoscopic process, but feeding that amount of data on set was a complex ask at the time. “We needed to enable that spec through our entire production pipeline, involving real-time feeds to our DCI-compliant ‘projection pod,’ which we used to view live camera feeds in 3D 48fps in both 2K and 4K, 3D 24fps in 2K and 4K, and 3D 24fps in HD,” says Burdick. “There wasn’t a lot of existing hardware available to support that.”

Burdick and his team contacted Blackmagic Design early on, explaining their goals. “There were no instant answers, but they understood the vision and had ideas for the best pathways to make it happen,” adds Burdick.

Image courtesy of 20th Century Studios.

Working closely with the production’s 3D Systems Engineer, Robin Charters, Burdick and his team began to drill down on every aspect of functionality. They chose to incorporate the Teranex AV standards converter, Smart Videohub 12G 40×40 router, DeckLink 8K Pro capture and playback card, UltraStudio 4K Extreme 3 capture and playback device, and ATEM 4 M/E Broadcast Studio 4K live production switcher as the management hardware for the various feeds.

“During live-action photography in 2019 and 2020, the Blackmagic team was in constant contact, ensuring that every piece of their hardware performed perfectly,” notes Burdick.

The pipeline, with real-time conversions handled by the Teranex AV and fed through the Smart Videohub 12G 40×40 and the ATEM 4 M/E Broadcast Studio 4K to provide playback and review throughout the set, worked seamlessly. Beyond enabling immediate review of footage, the multi-resolution playback system also doubled as a necessary quality control solution.

“This is very important as we move into shooting higher resolutions, frame rates, and dynamic ranges, with exhibition technologies capable of displaying all this and more,” says Burdick. “As critical as the cutting-edge technology is, it’s all in service of the story. The goal is for people not to notice the tech. We know we’ve succeeded when the audience loses themselves in the movie.”

Image courtesy of 20th Century Studios.

Grading The Way of Water

Colorist Tashi Trieu has worked with James Cameron’s Lightstorm Entertainment for several years as a DI editor, on projects including the remaster of Terminator 2 and Alita: Battle Angel. For Avatar: The Way of Water, Trieu moved up to colorist, working closely with Director Cameron.

I assume you worked closely with Director James Cameron, developing looks before production. Can you talk about that process?

I was loosely involved in pre-production after we finished Alita: Battle Angel in early 2019. I looked at early stereo tests with Director of Photography Russell Carpenter. I was blown away by the precision and specificity of those tests. Polarized reflections are a real challenge in stereo as they result in different brightness levels and textures between the eyes that can degrade the stereo effect. I remember them testing multiple swatches of black paint to find the one that retained the least amount of polarization. I had never been a part of such detailed camera tests before.

Image courtesy of 20th Century Studios.

The look development was largely done at WetaFX. Jim has a close relationship with them. As the principal visual effects vendor on the project, their artistry is thoroughly ingrained in everything from live-action capture to fully CGI shots. Their approach left a lot of creative latitude for us in the DI, and our show LUT is an elegantly simple S curve with a straightforward gamut mapping from SGamut3.Cine to P3D65. This left plenty of flexibility to push moments of the film more pastel or into a photorealistic rendition.
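As a rough sketch of the two-stage shape Trieu describes — a 3×3 gamut-mapping matrix followed by a gentle S curve — the following numpy example is illustrative only; the matrix coefficients are placeholders, not the actual SGamut3.Cine-to-P3D65 transform used on the film:

```python
import numpy as np

# Placeholder 3x3 gamut-mapping matrix (hypothetical values, NOT the
# real SGamut3.Cine -> P3D65 coefficients). Rows sum to 1.0 so neutral
# grays pass through unchanged, a common property of such matrices.
GAMUT_MATRIX = np.array([
    [ 1.10, -0.07, -0.03],
    [-0.02,  1.05, -0.03],
    [-0.01, -0.10,  1.11],
])

def s_curve(x):
    """Smoothstep S curve: soft toe in the shadows, soft shoulder in
    the highlights, near-linear through the mids."""
    x = np.clip(x, 0.0, 1.0)
    return x * x * (3.0 - 2.0 * x)

def show_lut(rgb):
    """Gamut map first, then tone map -- the overall LUT shape."""
    mapped = np.asarray(rgb) @ GAMUT_MATRIX.T
    return s_curve(mapped)
```

The appeal of keeping the LUT this simple, as the interview notes, is that it constrains nothing: creative pushes toward pastel or photoreal happen in the grade, not baked into the transform.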

Naturally, a lot of this movie takes place underwater. One of our priorities was maintaining photorealism through huge volumes of water. That means grading volume density to convey a sense of scale. Closeups can be clear, contrasty, and vividly saturated, but as you increase the distance from a subject, the spectrum fades away to blue, even in the clearest water. This was something we could dial in quickly and interactively in the DI. Anytime we needed to convey depth, we’d add more blue and subtract red and green.
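The depth cue Trieu describes — clear and saturated up close, fading to blue with distance — can be modeled with simple per-channel Beer-Lambert attenuation, since water absorbs red fastest and blue slowest. A minimal numpy sketch with hypothetical ballpark coefficients (not production values):

```python
import numpy as np

# Hypothetical per-channel attenuation coefficients (1/meter) for
# clear water: red is absorbed fastest, blue slowest.
ATTENUATION = np.array([0.40, 0.07, 0.02])  # R, G, B

# Hypothetical ambient water color the image fades toward.
WATER_BLUE = np.array([0.05, 0.20, 0.35])

def underwater(rgb, distance_m):
    """Beer-Lambert falloff: blend toward the water color as
    per-channel transmittance drops with distance."""
    t = np.exp(-ATTENUATION * distance_m)  # per-channel transmittance
    return np.asarray(rgb) * t + WATER_BLUE * (1.0 - t)
```

At zero distance the subject is untouched; push it tens of meters back and red collapses first, leaving exactly the blue-dominant falloff described above.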

What was it like working with Cameron during the mastering process?

I’ve never worked with a director who can so quickly and precisely communicate their creative intention. I was blown away by his attention to detail and ability to instinctually make thoughtful and creative decisions – he would often voice his rationale for even simple grading and framing decisions. As groundbreaking as the film is, his priorities never stray from the characters, the story, and enhancing the audience’s connection with them.

You grade with DaVinci Resolve Studio. Did your work remain confined to the Color page, or did you use other pages such as Fusion or Edit?

I have a background as a DI editor, so I’m very hands-on in the conform and editorial process. I spent almost as much time on the Edit page as I did in Color. I didn’t go into Fusion on this job, but that’s mostly due to the improvements in the ResolveFX toolset. Almost everything I needed to do beyond the grade could be done with those tools on the Color page. This was advantageous because those grades could be easily ColorTraced and propagated across multiple simultaneous grades for different aspect ratios and light levels.

Can you share with us a workflow you use and how it is affected by the stereoscopic work?

I’m a big fan of keeping things simple and automating what I can. I made heavy use of the Resolve Python API on this project. I wrote a system for indexing VFX deliveries once they arrived at the DI so that my DI Editor, Tim Willis from Park Road Post, and I could very quickly load up the latest versions of shots. I could take an EDL of what I currently had in the cut and in seconds, have an update layer of all the latest shots so we could make our final stereo reviews in scene context.
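Lightstorm’s actual tooling was built on Resolve’s Python API and EDLs; as a self-contained sketch of just the indexing step — picking the latest delivered version of each shot — something like this would do, with an entirely invented naming convention:

```python
import re

# Hypothetical delivery naming convention: <shot>_v<version>
VERSION_RE = re.compile(r"^(?P<shot>.+)_v(?P<version>\d+)$")

def latest_versions(delivery_names):
    """Return the highest-versioned delivery name per shot."""
    latest = {}
    for name in delivery_names:
        m = VERSION_RE.match(name)
        if not m:
            continue  # skip files that don't follow the convention
        shot, version = m.group("shot"), int(m.group("version"))
        if shot not in latest or version > latest[shot][0]:
            latest[shot] = (version, name)
    return {shot: name for shot, (_, name) in latest.items()}
```

The real system would then map each shot in the current EDL to its latest entry and stack those as an update layer on the timeline — the "in seconds" step Trieu credits to automation.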

This film was doubly challenging, not only because of the stereo but also because we were working at a high frame rate (48fps). Real-time performance is difficult to guarantee even on a state-of-the-art workstation with four Nvidia A6000 GPUs. It’s a delicate balance between what’s sustainable over the SAN’s bandwidth and what’s gentle enough for the system to decode quickly. Every shot was delivered as OpenEXR frames with as many as five or six layers of mattes for me to use in the grade. Ian Bidgood at Park Road had a clever idea to have WetaFX write the RGB layer as uncompressed data, but ZIP compress the mattes within the same file. This meant we had rock solid playback performance, really fast rendering for deliverables, and the file sizes were barely more than if they didn’t contain the mattes.
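The reason that split works so well: mattes are dominated by flat regions, which lossless ZIP-style compression collapses dramatically, while noisy photographic RGB barely compresses at all and would only cost decode time. A quick numpy/zlib illustration on synthetic data (not actual show frames):

```python
import zlib
import numpy as np

rng = np.random.default_rng(7)

# Noisy image data (stand-in for photographic RGB): barely compressible,
# so storing it uncompressed trades little size for faster decode.
rgb_bytes = rng.random((256, 256, 3), dtype=np.float32).tobytes()

# A matte: large flat regions with a simple shape -- highly compressible.
matte = np.zeros((256, 256), dtype=np.float32)
matte[64:192, 64:192] = 1.0
matte_bytes = matte.tobytes()

rgb_ratio = len(zlib.compress(rgb_bytes)) / len(rgb_bytes)
matte_ratio = len(zlib.compress(matte_bytes)) / len(matte_bytes)
# The flat matte shrinks to a tiny fraction; the noisy data hardly at all.
```

That asymmetry is why a file carrying five or six ZIP-compressed matte layers ends up barely larger than the RGB alone.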

Image courtesy of 20th Century Studios.

Were there challenges between the SDR and HDR grades?

We had the unique luxury of working in Dolby Vision 3D from day one. Our hero grade was the Dolby 3D version at 14fL in extended dynamic range. This is a great way to work because you can see everything well. This is critical in stereo reviews, where you need to see if the compositing is working correctly or if tweaks need to be made.

Once you grade for Dolby Vision, standard digital cinema 2D at 14fL is a relatively simple transformation with some custom trims. You lose your deep blacks with a traditional DLP projector, but it’s just as bright as Dolby 3D. One of the biggest challenges was creating the 3.5fL grade that, unfortunately, is the standard for most commercial 3D digital cinemas out there. It’s an exacting process to create the illusion of contrast and saturation with so little light. We must make certain decisions and allow background highlights to roll off early to preserve contrast in high dynamic range scenes, like day exteriors. Night scenes are much more forgiving.

Image courtesy of 20th Century Studios.

What’s your go-to tool in DaVinci Resolve Studio?

ColorTrace was critical for me on this film. Each reel of the film lived in its own Resolve project, ultimately containing eleven unique timelines, one for each of our various theatrical picture deliverables. Tim Willis, my DI editor, kept those in editorial parity across cut changes and VFX updates. When we’d lock a grade in one format, I’d ColorTrace those grades to the other formats and trim them further. If we made changes to framing and composition in one, I could easily ripple those changes back through the other formats without overwriting the grading decisions. It’s simple, but the amount of time saved and the elegance of that sort of workflow kept us from working too many late nights.

Was there a favorite scene from the movie that you loved grading or presented a unique challenge?

A town hall scene between the Sully family and the Metkayina clan occurs during a rainstorm. It’s a gorgeous scene that evokes Rembrandt. The cold, overcast skies wrap around the characters, and a subtle warm accent light gives the scene a nice dynamic. It’s insane how absolutely real everyone looks. You have to remind yourself that everything in this shot is artist-generated. Nothing beyond the actors’ performances is real. It’s truly a generational leap in visual effects artistry and technology.

Avatar: The Way of Water Official BTS Featurette.

To learn more about Avatar: The Way of Water, visit: https://www.avatar.com/movies/avatar-the-way-of-water, and to learn more about Blackmagic Design, visit: https://www.blackmagicdesign.com/
