GDC 2023 and the State of Unreal

Introduction 

The Game Developers Conference (GDC) returned to San Francisco in March 2023. So much has changed in the worlds of gaming and filmmaking over the last few years, and Unreal Engine has quickly become a leading digital content creation tool for virtual production and more traditional visual effects for media. 

I was invited to view the State of Unreal keynote presentation, which kicked off GDC at the Yerba Buena Center for the Arts Theater, next door to the Moscone Center in downtown San Francisco where the rest of the conference took place. Although GDC focuses primarily on video games, Epic Games also touted several new features in the latest Unreal Engine version of interest to virtual production folks like myself.

State of Unreal Presentation at GDC 2023

What’s New in 5.2

Announcements with promise for virtual production include Unreal Engine 5.2's procedural generation tools, MetaHuman Animator, and the Fab marketplace. With procedural generation, highly detailed LED volume and visualization environments can be created with far less effort: instead of laying out every last tree and rock by hand, the tools generate massive amounts of detail from just a few rules. Layouts can be quickly modified and used alongside hand-placed custom assets. 
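As a rough illustration of the idea only (this is not Unreal's actual procedural generation API, and the rule names and values are made up), here is a minimal Python sketch of rule-driven scattering: a handful of rules generate thousands of placements, keep-out zones protect hand-placed assets, and changing a single rule regenerates the whole layout.

```python
import random

# Hypothetical rules standing in for a procedural generation setup:
# a density value, a size range, and keep-out zones around hand-placed assets.
RULES = {
    "density_per_km2": 2000,          # how many trees/rocks to scatter per square km
    "scale_range": (0.8, 1.6),        # random size variation per instance
    "keep_out": [((500, 500), 50)],   # (center, radius in meters) around custom assets
}

def scatter(area_km2=1.0, seed=42, rules=RULES):
    """Generate placements from a few rules instead of placing each asset by hand."""
    rng = random.Random(seed)          # fixed seed -> the same layout every run
    side = (area_km2 ** 0.5) * 1000    # edge length of a square region, in meters
    placements = []
    for _ in range(int(rules["density_per_km2"] * area_km2)):
        x, y = rng.uniform(0, side), rng.uniform(0, side)
        # Respect hand-placed custom assets by skipping their keep-out zones.
        if any((x - cx) ** 2 + (y - cy) ** 2 < r ** 2
               for (cx, cy), r in rules["keep_out"]):
            continue
        placements.append((x, y, rng.uniform(*rules["scale_range"])))
    return placements

# Tweaking one rule (say, density) regenerates the entire layout in seconds.
print(len(scatter()), "instances generated")
```

The point of the sketch is the workflow, not the math: artists adjust a few rules and regenerate, rather than dressing the environment asset by asset.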

Virtual production already leverages MetaHumans as realistic extras and highly complex central characters. With the newly announced MetaHuman Animator, facial animation takes a giant leap forward. The presentation included a demo with well-known mocap actress Melina Juergens, who stars in the upcoming game Senua's Saga: Hellblade II.

MetaHuman facial capture demo at State of Unreal/GDC 2023.

Juergens recorded a facial performance using an iPhone on a tripod in this quick demo. Just moments later, another artist processed and applied the capture to a MetaHuman character in Unreal Engine with expressive and realistic results. Facial animation at this level of quality typically requires specialized capture equipment and detailed post-processing. 

Unreal Editor for Fortnite (UEFN) allows users to design, develop, and publish content directly to Fortnite. That opens up a whole new category of virtual production creation and experience tools for filmmakers. Matt Workman is already porting his cinematography creation app to UEFN.

Matt Workman tries Unreal Editor for Fortnite.

Finally, Epic unveiled the Fab universal marketplace, combining the Unreal Engine Marketplace, Quixel Bridge, Quixel Megascans, ArtStation, and Sketchfab into a unified, open marketplace. Fab promises to be a one-stop shop for realistic assets for LED volume environments and other visual effects. The team also revealed plans to make camera-ready assets easier to discover through the marketplace's search engine. 

Hitting the Show Floor

Hitting the show floor at the Moscone Center.

After the State of Unreal presentation, I hit the GDC convention floor, which was packed with developers, production companies, publishers, and gaming enthusiasts and professionals. I was also invited to sit down with Kim Libreri, Epic Games’ Chief Technology Officer, to chat about the present and future of Unreal Engine for virtual production. Libreri is a veteran visual effects professional with credits ranging from The Matrix to Star Wars: The Force Awakens.

“One super relevant thing for filmmakers this year in Unreal Engine is the Rivian truck in the forest demo,” notes Libreri. “It showcases the engine’s scalability and Nanite and Lumen’s ability to deal with light transmission through foliage. Unreal can do hard surfaces like rocks and mountains and organic forms like jungles well.”

Asked if he sees Unreal Engine taking an increasingly prominent role in traditional visual effects and animation, Libreri is highly optimistic. “You’re going to see a lot more realistic animation, for example, in manga styles,” he says. “If you’re doing something like The Last of Us with creatures, you can also do that work in Unreal Engine. It’s a game-changer for artists and visual effects houses.” 

Unreal Engine has made massive strides in virtual production, and according to Libreri, that innovation continues with each new version. “We have an entire team dedicated to virtual production, and it’s making good, steady progress,” Libreri observes. “The key to success with LED volumes is working with experienced engine developers and focusing on complete asset building well before production, just like you would with physical sets.”

To that point, the new Fab unified marketplace promises more efficient ways for filmmakers to discover camera-ready assets for virtual production. “The Quixel assets are already built to very high standards, but with Fab, we want to tag photorealistic, cinematic-quality assets to make it even easier to find them,” Libreri says. “Because it’s built around Sketchfab’s capabilities, you can spin and zoom 3D assets around as you browse.”

Another area Epic is looking to improve is the efficiency of rendering across multiple nodes for larger LED volumes. The process is currently handled using nDisplay to synchronize rendering machines. “We’re making it more efficient to move the data from the different machines onto the wall and to the frustum,” says Libreri. “With support for SMPTE-2110, latency will get lower, which means anyone shooting in a volume with dynamic cameras will achieve better results.”
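As a rough mental model only (this is not nDisplay's real implementation, and the node counts and timings are invented), the Python sketch below shows why multi-node synchronization sets the latency floor: every render node must finish its slice of the wall before the frame can be presented, so the slowest node plus the transport hop determines how quickly pixels reach the LED volume and the camera frustum.

```python
import threading
import time
import random

NUM_NODES = 4                             # hypothetical render nodes driving one LED wall
barrier = threading.Barrier(NUM_NODES)    # nothing is presented until every node is ready

def render_node(node_id, frame):
    # Each node renders its own slice of the wall (simulated here by a short sleep).
    time.sleep(random.uniform(0.005, 0.012))
    barrier.wait()                        # sync point: wait for all nodes to finish
    # After the barrier, all slices are presented together as one coherent frame.

for frame in range(3):
    start = time.perf_counter()
    threads = [threading.Thread(target=render_node, args=(n, frame))
               for n in range(NUM_NODES)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Frame time is gated by the slowest node -- the latency Libreri refers to.
    print(f"frame {frame}: {(time.perf_counter() - start) * 1000:.1f} ms")
```

In a real volume the "present" step is hardware-synchronized and the transport is the network fabric between render nodes and the wall, which is where lower-latency video-over-IP formats such as SMPTE ST 2110 come in.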

With all of the current developments in generative AI art, Epic is watching the space while taking a measured approach to potentially incorporating it into the engine. “If you’re going to add AI into the workflow, it should benefit the creative community,” Libreri says. “It has to be something everyone that contributed knows and wants to contribute and can benefit from it. Fundamentally, making tools that eliminate drudgery and low-level technical details can be very impactful.”

Writer Noah Kadner with Epic Games CTO Kim Libreri at GDC 2023.

Finally, I asked Libreri how the film and gaming industries compared when using the same creative tools like Unreal Engine. “Movies tend to have a hierarchy where the person at the top may be very far removed from where the creative magic happens,” Libreri observes. “Games tend to be a super collaborative medium where the smartest idea needs to win to make a successful game. It’s less a race to the bottom than a sense of joy and collaboration across everybody. There’s a beautiful balance between the engineers and the artists making the games.”

Until Next Time

I hadn’t had the opportunity to visit GDC before 2023, but it’s now on my list for next year. I couldn’t help but notice how enthusiastic attendees and exhibitors were on the convention show floor. I’m also looking forward to playing with the new virtual production tools in Unreal Engine 5.2 and seeing what comes next in this rapidly evolving ecosystem. 

Watch the complete State of Unreal presentation from GDC 2023.
