Virtual Production Accelerates in TV

Virtual production has existed in different forms in TV production for well over a decade. Examples include ABC’s Once Upon A Time and Revenge, each debuting in 2011. Both series used in-camera simulcam and tracking to previsualize virtual environments on their green-screen sets, with final composites completed traditionally in post-production.

The recent expansion of virtual production into in-camera visual effects with real-time game engines like Unreal Engine and LED volumes has impacted TV and streaming production. Disney Plus’ Star Wars streaming series, The Mandalorian, shot by Greig Fraser, ASC, led the charge upon its release in November 2019. Following its successful example, several series are now in production and leveraging similar techniques.

LED in-camera virtual production is well suited to TV in general, and particularly so amid the challenges posed by the COVID-19 pandemic and its attendant safety protocols. As chronicled in previous AC coverage, virtual production, with its reduced crew and travel requirements and remote-capable equipment and workflows, offers critical advantages to shows working under safety constraints. Because so much is possible in decentralized and remote environments, virtual production evolved far faster in 2020 and 2021 than it might have under normal circumstances.

Capturing background plates for Our Flag Means Death in Puerto Rico.

When the health crisis finally abates, many of the benefits of virtual production and LED volumes are likely to continue favoring TV and streaming projects. Capturing visual effects in-camera allows greater flexibility in shot design and coverage and shortens post-production compared with the per-shot, per-frame costs of traditional visual effects. It also enables far more controlled shooting conditions than location work, such as holding on a perfect sunset or an ideal cloud formation indefinitely.

While some series such as Disney Plus’ Star Wars streaming shows were conceived from their inception for XR (mixed reality) stages, others pivoted in 2020 to LED walls. ABC’s Station 19 action-rescue show traded on-location emergency vehicle driving sequences for LED walls and rear-projected driving plates. Daryn Okada, ASC, who acts as both a producer and director on the series, helped oversee the new approach starting in May 2020.

“Our first challenge was figuring out how to work within the safety protocols without a result that looked affected by those restrictions,” remembers Okada. “Virtual production was always in the back of my mind, but it came down to space. We couldn’t find a stage for a large volume for our standing sets because everything was booked or on hold. But we had several plates we’d shot on location in Seattle we could repurpose for driving sequences on a more modest LED wall.”

Okada and crew collaborated closely with Sam Nicholson, ASC, and his Stargate Studios visual effects company to create a safe stage environment in Los Angeles. Nicholson also provided similar services for HBO’s Run series in 2020. “Our main content screen was approximately 12 feet wide, and modular so we could use it easily with any vehicle from any angle,” says Okada. “We’d also roll in three large LED monitors to wherever we wanted to see reflections in the rear and side mirrors or over the hood. We’d have four plates running together in sync with timecode, and the crew at Stargate kept track of exactly where we were in each take. So, if we wanted to pick up a new shot from within a scene, we could jump right to that piece of the background footage, and our coverage would match for continuity. That efficiency made a huge difference.”
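The continuity workflow Okada describes rests on a simple idea: if all four plates run in sync against a shared timecode, picking up a new shot mid-scene just means cueing every plate to the same offset. A minimal sketch of that arithmetic, with all names and the 24-fps rate as illustrative assumptions rather than Stargate's actual tooling:

```python
# Timecode-indexed plate cueing, sketched from the workflow described
# above. The 24 fps rate and function names are assumptions.

FPS = 24  # assumed playback frame rate

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def cue_offset(scene_start_tc: str, pickup_tc: str, fps: int = FPS) -> int:
    """Frames into the plate footage where a pickup shot should resume."""
    return tc_to_frames(pickup_tc, fps) - tc_to_frames(scene_start_tc, fps)

# A pickup 10 seconds into a scene cues every synced plate to frame 240.
print(cue_offset("01:00:00:00", "01:00:10:00"))  # → 240
```

Because every screen shares the same cue point, the background in the mirrors and over the hood stays consistent from take to take.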


Paramount Plus also transitioned from a traditional post-composited workflow to an in-camera visual effects workflow on the upcoming fourth season of Star Trek: Discovery and season one of its spin-off, Star Trek: Strange New Worlds. Both shows share a 70’-wide by 30’-tall, 270-degree horseshoe-shaped LED volume constructed by Pixomondo in Toronto, Canada, and fed real-time animation from Unreal Engine.

“In the COVID era, being able to shoot large scope locations without having to leave the stage is a huge benefit,” says Jason Zimmerman, visual effects supervisor for CBS’ Star Trek series as well as Clarice and The Man Who Fell to Earth. “The wall is fantastic for environments, and on Star Trek traveling to different worlds is obviously something we’re very interested in.”

“We’re using ROE’s Black Pearl BP2 v2 2.8mm LED panels for the wall and the Carbon series CB5 5.77mm panels for the ceiling,” explains Mahmoud Rahnama, Pixomondo Toronto’s head of studio. “The ceiling is fully customizable, so we can either take panels out and hang practical lights over the volume or just use the ceiling’s LEDs for lighting. We have more than 60 OptiTrack motion-capture cameras, with the ability to track two cameras simultaneously, whether on Technocranes, Steadicams, dollies, etc.”
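Pixel pitch (the spacing between LEDs) determines how many pixels a wall of a given size can display. A back-of-envelope check using the 2.8mm pitch quoted above, treating the volume's stated 70-foot width and 30-foot height as flat spans purely for illustration:

```python
# Rough arithmetic relating LED pixel pitch to wall resolution.
# The 2.8 mm pitch comes from the article; treating the curved wall's
# quoted dimensions as flat spans is a simplifying assumption.
FT_TO_MM = 304.8
PITCH_MM = 2.8

def pixels(length_ft: float, pitch_mm: float = PITCH_MM) -> int:
    """Approximate pixel count along a span at a given LED pitch."""
    return int(length_ft * FT_TO_MM / pitch_mm)

print(pixels(70))  # pixels across a 70' span  → 7620
print(pixels(30))  # pixels over a 30' height → 3265
```

At that density, the wall can resolve well beyond 4K across its width, which is why in-camera capture at this pitch can hold up on close framing.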

“A lot of shows around Hollywood are looking at this as an opportunity to advance filmmaking,” adds Zimmerman. “Getting something in camera on the day is just so much better than green screen in many ways. One major difference is that the production design and art departments are a lot more involved much earlier in the process, getting assets to a place where they can be photographed instead of waiting until after the shoot.”

While making the transition from a traditional visual effects pipeline to LED volumes is proving to be a positive change for some series, other shows are planned from the start to leverage in-camera visual effects. Disney released the second season of The Mandalorian in December 2020. The third season is now in production and is joined by other shows to be shot using StageCraft, ILM’s real-time animation pipeline for LED volumes.

These new shows include The Book of Boba Fett, set for release later in 2021, and Obi-Wan Kenobi, intended for release in 2022. Both shows will use the same volume initially built for The Mandalorian in Manhattan Beach, California. Still more Star Wars shows like Andor are in production at Pinewood Studios outside London in the UK, where ILM built another large-scale StageCraft volume.

Netflix, led by its director of virtual production, Girish Balakrishnan, is taking a holistic approach with plenty of research and development into best practices. The goal is a standardized methodology that producers can consistently replicate across the many regions where the streamer creates original content. 1899, from creators Jantje Friese and Baran bo Odar, is one of the first major XR-stage series out of the gate for Netflix and recently began production at Studio Babelsberg near Berlin, Germany.

1899 takes place at the turn of the century aboard a migrant ship sailing to the US from Europe. The series features an international and multilingual cast who must unravel a disturbing mystery. Pitched initially to be shot on location throughout Europe, the producers reconfigured their entire project for in-camera LED capture instead. Studio Babelsberg’s volume, the largest of its kind to date in Europe, is similar in size to The Mandalorian’s at approximately 75’ wide and 23’ tall.

Yet another major series in the works and leveraging LED wall/in-camera visual effects is HBO Max’s Our Flag Means Death. Taika Waititi directs the half-hour pirate-themed comedy and also stars as the infamous Blackbeard. Waititi’s experience starring in and directing episodes of The Mandalorian will surely come in handy as the game plan calls for surrounding a full-sized pirate ship with a massive LED volume on stage to simulate the open sea in-camera.

Stargate Studios CEO Sam Nicholson, ASC, again brings his considerable expertise in virtual production and in-camera VFX to Our Flag as visual effects supervisor. Nicholson supervised a plate-capture shoot off the coast of Puerto Rico for the project using Blackmagic URSA Mini Pro 12K and Pocket Cinema Camera 6K cameras and Voigtländer Color-Skopar 20mm aspherical lenses. “We built a rig with eight Pocket 6Ks in a circular array, each shooting onto a 4TB SanDisk SSD,” reveals Nicholson. “It becomes a real data management challenge as we’re getting between 25K and 50K resolution for our final stitched images.”
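The 25K-to-50K range Nicholson cites falls out of simple arithmetic: eight frames stitched side by side, with the final width depending on how much adjacent cameras overlap. A sketch assuming the Pocket 6K's published 6,144-pixel frame width; the overlap fractions are illustrative, not Stargate's actual values:

```python
# Stitched-panorama width for a circular camera array. The 6144-pixel
# frame width is the Pocket 6K's horizontal resolution; overlap values
# below are illustrative assumptions.
FRAME_W = 6144  # horizontal pixels per Pocket 6K frame
CAMERAS = 8

def stitched_width(overlap: float) -> int:
    """Panorama width in pixels for a given fractional overlap
    between adjacent cameras (0.0 = frame edges just touch)."""
    return int(FRAME_W * CAMERAS * (1 - overlap))

print(stitched_width(0.0))  # no overlap  → 49152 (~50K)
print(stitched_width(0.5))  # 50% overlap → 24576 (~25K)
```

More overlap makes stitching easier and more robust, at the cost of unique resolution in the final plate.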

A still frame from Station 19 showing in-camera VFX.

For the preliminary shoot, Nicholson’s plate team captured more than 200 TB of footage. “Each take is 5 minutes times eight cameras, which gives you 40 minutes of footage per take,” notes Nicholson. “We also want to show dailies, so we capture everything into a Blackmagic ATEM Extreme ISO switcher. That gives us a 1080p, 8-way split we can share from the location each day and discuss what we’re shooting as we go. Everything ultimately has to be stitched, stabilized, de-grained, and prepared to be split back into one of fourteen 4K quadrants we’ll map onto the LED volume from DeckLink 8K playout cards.”
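Two bits of arithmetic sit behind those numbers: the camera-minutes accumulated per take, and how fourteen 4K tiles span a stitched plate. A quick check, assuming "4K" means UHD-width (3,840-pixel) tiles laid side by side, which is an interpretation rather than a stated spec:

```python
# Sanity checks on the figures quoted above. The 3840-pixel tile width
# (UHD) is an assumed interpretation of "4K quadrants".
TAKE_MIN, CAMERAS = 5, 8
print(TAKE_MIN * CAMERAS)  # camera-minutes of footage per take → 40

TILE_W, TILES = 3840, 14
print(TILE_W * TILES)      # combined tile width in pixels → 53760
```

Fourteen UHD-wide tiles total 53,760 pixels across, comfortably enough to carry a roughly 50K-wide stitched plate onto the volume.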

Nicholson sought ways to make the plate capture both high quality and cost-effective. “Everyone kind of forgets that you’re only as good as the color depth of your wall, and the LED volume is only 10-bit color,” he observes. “I love the Supreme and Master Primes, but if you need to make a rig using them, that’s a $35,000 lens times eight cameras. Instead of using dedicated onboard camera memory that could quickly run $20,000 per body, we’re using off-the-shelf SanDisk SSDs at 1/10th the cost. The cameras and data are evolving so fast that today’s solution may be completely different six months from now.”

TV production, like other forms of filmmaking, is constantly in a state of technological flux. Its evolutionary milestones include the moves from analog to digital, from standard definition to high definition, from green screen to in-camera visual effects, and more. Virtual production’s trajectory is exceptionally steep, driven in part by the safety demands of the pandemic.

What was cutting-edge just a few years ago with The Mandalorian is quickly becoming a standard practice. Though some series in production are leveraging LED volumes to circumvent travel and distancing restrictions, the utility of the process will likely persist far beyond the crisis and be another key milestone for the TV and streaming world. The genie is not going back into its bottle.

An 8-camera array of Blackmagic Pocket cameras capturing footage in Puerto Rico.

This story originally appeared in the May 2021 issue of American Cinematographer magazine.
