Writer/director Emma Needell’s short film Life Rendered premiered in June 2022 at the Tribeca Film Festival. Needell leaned heavily into hybrid virtual production and motion capture techniques, leveraging Unreal Engine to depict her sci-fi story.
Originally from Elbert, Colorado, Needell stepped into the spotlight when her script The Water Man was made into a feature film by actor/director David Oyelowo. The Oprah Winfrey-produced project debuted at the 2020 Toronto International Film Festival and was distributed on Netflix.
The Water Man helped Needell gather support for her directing ambitions. A longtime fan of video games such as The Sims, The Last of Us, and Far Cry, Needell recognized that game engine technology was progressing to a quality level sufficient for filmmaking. “The cutscenes were so evocative and emotional, they made me cry,” she recalls. “I’d also seen The Mandalorian and heard of Unreal Engine. I learned about Epic’s MegaGrant and Fellowship programs, which sounded great for someone like me who is very computer literate but still in over my head.”
Getting the MegaGrant
With the support of the Fellowship and a MegaGrant, Needell conceived the story of Life Rendered with the idea of mixing live-action with Unreal Engine animation driven by performance capture. “Our story is about a gay man who lives in rural Colorado and is a caretaker for his disabled cowboy father in a near future where VR is ubiquitous,” Needell explains. “I didn’t want to redo Ready Player One; I wanted something grounded and human with the visual style of director Terrence Malick.”
The overall concept called for shooting some scenes on location in Colorado and others in a virtual world via motion capture. Needell collaborated with Eric Day, an executive producer at Los Angeles motion capture studio Ryot. “Emma’s Fellowship training helped because she could essentially previs her entire short with Unreal Engine,” Day says. “Maybe it’s not always the final camera placement, but it was enough to get into budgeting and scheduling well before getting into the heavier and more expensive work.”
CounterPunch Studios in Los Angeles created custom avatars and specialized animation for the project. The virtual scenes were captured in two phases: the first was a motion capture pass for the actors’ bodies and facial performances in Ryot’s Vicon motion capture volume. The crew recorded the facial performances with Cubic Motion’s facial capture technology combined with iPhone-based, head-mounted camera rigs provided by Standard Deviation. Once the performances were captured, the second phase followed: a pass for virtual camera operation.
Live-Action Aesthetics
To carry the aesthetics of the live-action part of the shoot over to the virtual world, Needell insisted on having her cinematographer Anton Fresco operate a tracked virtual camera in Ryot’s motion capture volume. “All the handheld movement you see in the virtual scenes has a real human feel,” says Needell. “It’s a massive part of the tone and theme for the entire project.
“I also wanted to recreate the look of the anamorphic lenses we used in the live-action shoot,” Needell continues. “Miles Perkins and Karen Dufilho at Epic Games connected us with Jason Chen at BRON Digital, who digitally mapped Panavision anamorphic lenses for use in Unreal. So, we could recreate a similar look between the physical and virtual cinematography.”
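BRON Digital’s lens mapping was bespoke work, but the basic idea of dialing an anamorphic-style filmback into Unreal’s Cine Camera can be sketched with the engine’s editor Python API. The sketch below is only a rough illustration: the sensor dimensions, focal length, aperture, and camera placement are assumed values for demonstration, not the production’s actual Panavision lens data.

```python
# Minimal sketch (not BRON Digital's actual lens mapping): approximating an
# anamorphic-style filmback on a CineCameraActor via Unreal's editor Python API.
# All numeric values below are illustrative assumptions.
import unreal

# Spawn a Cine Camera in the current level.
cam_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, 0.0, 150.0))

cine = cam_actor.get_cine_camera_component()

# Approximate a de-squeezed ~2.39:1 anamorphic image area.
# (A true lens map also models squeeze, distortion, and breathing,
# which is the custom work referenced above.)
filmback = unreal.CameraFilmbackSettings()
filmback.sensor_width = 43.90   # mm, illustrative
filmback.sensor_height = 18.37  # mm, yields roughly 2.39:1
cine.set_editor_property("filmback", filmback)

# Anamorphic-flavored exposure settings, also illustrative.
cine.set_editor_property("current_focal_length", 50.0)
cine.set_editor_property("current_aperture", 2.8)
```

Setting the filmback to the delivery aspect only mimics the framing; matching the distortion character of a specific physical lens is the deeper mapping work described above.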
For the virtual camera capture sessions, the Ryot team was able to use the recently released MetaHuman Creator to create virtual stand-ins. “It was cool because our final avatar rigs weren’t ready on the virtual camera capture day,” reveals Day. “So, we spent an extra thirty minutes before shooting, making MetaHumans that looked similar enough to the original actors. Having that tool integrated into the Unreal ecosystem was a total lifesaver.”
With the production phase completed, Needell oversaw post-production virtually via many Zoom review calls. “CounterPunch Studios did entire scenes within Unreal,” she explains. “What we got out was the whole scene with all the cuts, so it wasn’t shot-by-shot editing in traditional editing software. That approach makes it more challenging to tweak individual shots, but if you’re going to have a lot of continuity or visual effects like wind or snow, it’s a lot easier just to load the whole scene and play it all the way through.”
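That whole-scene workflow maps naturally onto Unreal’s Sequencer and Movie Render Queue: an entire scene, cuts included, lives in a single level sequence and renders as one queued job. Below is a minimal, hypothetical sketch using the editor Python API; the asset paths, resolution, and output settings are illustrative assumptions, not CounterPunch Studios’ actual pipeline.

```python
# Hypothetical sketch of rendering an entire scene (one level sequence with
# all its cuts) through Movie Render Queue, rather than shot-by-shot.
# Asset paths, resolution, and output directory are illustrative assumptions.
import unreal

subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.job_name = "Scene01_full"
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/Scene01")  # hypothetical path
job.map = unreal.SoftObjectPath("/Game/Maps/Ranch")               # hypothetical path

config = job.get_configuration()
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
out_dir = unreal.DirectoryPath()
out_dir.path = "C:/Renders/Scene01"
output.output_directory = out_dir
output.output_resolution = unreal.IntPoint(3840, 1608)  # ~2.39:1, illustrative

# Render in-editor; the whole scene plays through in order, so continuity
# effects like wind or snow evolve consistently across cuts.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```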
Premiere Thoughts
As Life Rendered premiered at Tribeca, Needell took stock of her experiences with virtual production. “Filmmaking is always creative, but that creativity often comes out of production being very unpredictable,” she observes. “That said, what I love about working with Unreal Engine is that you can come up with whatever scene you want without worrying about any logistical nightmare or whether something is dangerous—because no one’s safety is ever worth risking over a movie.”
Beyond the convenience and freedom of virtual production, Needell also appreciates its sustainability. “Filmmaking at a professional level requires a lot of people and equipment, and it can be very ecologically damaging to do on location,” she says. “We’re in the middle of a climate emergency, and the kind of filmmaking I want to do in remote environments could be harmful.”
Needell hopes to deliver additional projects using virtual production in more elaborate ways. “Tools like Unreal Engine and the Quixel library enable you to get amazing visuals quickly and ethically while factoring in a more sustainable ecological footprint,” adds Needell. “I was also very humbled by all the help I received from Epic Games as a first-time director. Hollywood is a world of barriers and hurdles, and everyone I worked with at Epic has been highly supportive, which is unique and special.”
For more information about Emma Needell and Life Rendered, please visit https://www.life-rendered-the-film.com/
This story originally appeared in the Winter 2022 issue of the Epic Games L.A. Lab Magazine. Check out the full issue here: https://www.unrealengine.com/en-US/virtual-production