See how 'The Mandalorian' used Unreal Engine for its real-time digital sets
Actors interact with the projected backgrounds rather than green screens.
It's not surprising that a VFX-heavy show like The Mandalorian uses digital sets, even though it also relies heavily on practical, in-camera effects. What's more unexpected, however, is that the actors were able to see and perform within those sets rather than against a sterile green screen. To pull that off, Jon Favreau and his team worked with Epic Games to develop an innovative technique that projects digital sets onto LED displays in real time, powered by the Fortnite creator's Unreal Engine.
In a new VFX sizzle reel (below), ILM, Favreau and other members of the production team explain how the cutting-edge technique works. They built a 20-foot-tall, 270-degree semicircular LED video wall enclosing a 75-foot-diameter circular stage. ILM's digital 3D sets, built ahead of production (rather than later in post-production, as is the norm on VFX-heavy shows), were projected interactively onto the wall. They could serve as standalone backgrounds or as extensions of practical set pieces placed on the stage.
The digital sets weren't merely pre-rendered imagery, but game-style 3D environments rendered on the fly by powerful NVIDIA GPUs. They were lit and rendered from the physical camera's point of view, so the backgrounds showed correct perspective and parallax rather than looking like the old-school rear-screen projections often used for traveling vehicle shots. At the same time, the actors were lit with practical LED stage lights positioned to match the lighting and the sun in the digital sets.
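To get a sense of how that camera-aware rendering works, here's a minimal sketch (in Python with NumPy, not ILM's or Epic's actual code) of the standard off-axis projection used to render a scene onto a flat display from a tracked viewpoint. The panel corners, eye position and function names are illustrative assumptions.

```python
# Illustrative sketch: compute the asymmetric "off-axis" projection needed to
# render a virtual scene from the position of a tracked physical camera onto a
# flat LED panel, so the background shows correct parallax instead of behaving
# like a fixed rear-projection plate. Based on the generalized perspective
# projection formulation (Kooima); all names here are hypothetical.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def led_panel_projection(pa, pb, pc, eye, near, far):
    """Off-axis view-projection matrix for one flat LED panel.

    pa, pb, pc -- panel's lower-left, lower-right and upper-left corners
                  in stage (world) coordinates
    eye        -- tracked position of the physical camera's lens
    near, far  -- clip planes for the virtual render
    """
    # Orthonormal basis of the panel: right, up and normal vectors.
    vr = normalize(pb - pa)
    vu = normalize(pc - pa)
    vn = normalize(np.cross(vr, vu))

    # Vectors from the eye to the panel corners.
    va, vb, vc = pa - eye, pb - eye, pc - eye

    # Distance from the eye to the panel plane.
    d = -np.dot(va, vn)

    # Frustum extents on the near plane (asymmetric, because the camera is
    # rarely centered in front of the panel).
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard perspective frustum (OpenGL-style clip space).
    P = np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

    # Rotate the world so the panel is axis-aligned, then move the eye to
    # the origin.
    M = np.array([
        [vr[0], vr[1], vr[2], 0],
        [vu[0], vu[1], vu[2], 0],
        [vn[0], vn[1], vn[2], 0],
        [0, 0, 0, 1],
    ])
    T = np.eye(4)
    T[:3, 3] = -eye

    return P @ M @ T
```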
The perspective trick is on display at around the 3:40 mark of the video, where you can clearly see the background shifting to match the camera movement. Since the camera is being moved by a dolly grip (rather than a computerized motion control rig), it appears the digital set is driven by a motion tracker mounted on the camera.
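A hypothetical per-frame loop shows how such a tracker could drive the wall imagery, reusing the led_panel_projection sketch above; read_tracker_pose, render_wall and the panel coordinates are made-up stand-ins, not a real StageCraft or Unreal Engine API.

```python
# Hypothetical update loop: poll the camera tracker, rebuild the off-axis
# projection for the tracked eye point, and redraw the wall every frame so the
# background moves in step with the dolly.
import numpy as np

PANEL_LL = np.array([-5.0, 0.0, -3.0])   # lower-left corner (meters, stage space)
PANEL_LR = np.array([ 5.0, 0.0, -3.0])   # lower-right corner
PANEL_UL = np.array([-5.0, 6.0, -3.0])   # upper-left corner

def read_tracker_pose():
    """Placeholder for the optical/inertial tracker mounted on the camera."""
    return np.array([0.0, 1.7, 2.0])     # camera position in stage space

def render_wall(view_projection):
    """Placeholder for the real-time engine drawing the 3D set to the LEDs."""
    pass

for frame in range(3):                    # once per displayed frame in practice
    eye = read_tracker_pose()             # where the physical lens is right now
    vp = led_panel_projection(PANEL_LL, PANEL_LR, PANEL_UL, eye,
                              near=0.1, far=1000.0)
    render_wall(vp)                       # background shifts with the camera move
```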
There are huge advantages to this technique, the team said. For one thing, sets can be changed on the fly (within an hour) to better match the director's and cinematographer's vision. It also makes performing easier, as actors can see their environment rather than having to pretend it's there, as with green screens. Plus, it no doubt saves on post-production costs -- according to ILM, the technique was used in fully 50 percent of The Mandalorian's shots.
Best of all, the technique was seamless and invisible in the final show. ILM has built a whole new platform around it, called StageCraft, that takes advantage of Unreal Engine's real-time interactivity, and it will make the platform available to other filmmakers and showrunners. "We have been able to see through a few technical innovations and a few firsts that I think are going to have a lot of impact on the way that television and movies are made going forward," said Favreau.