As we await the arrival in the UK and Europe of season one of The Mandalorian and hang on every snippet of news regarding season two, the reveal of ILM’s StageCraft technology has fascinated followers of visual effects, be they experts or casual observers. The realisation that there was no location shooting on the series, and that all environments were created using LED technology, is frankly stunning, and undoubtedly a major game-changer for the industry. FX Guide takes a closer look at the genius behind StageCraft in the first of a two-part examination of the series.

The new virtual production stage and workflow allow filmmakers to capture a significant amount of complex visual effects shots in-camera, using real-time game-engine technology and surrounding LED screens. This approach lets dynamic, photo-real digital landscapes and sets play live while filming, which dramatically reduces the need for greenscreen and produces the closest thing we have seen to a working ‘Holodeck’. Because the camera films in mono, the system can dynamically update the background to match the perspective and parallax a camera would record in real life. To do this, the LED stage works in combination with a motion-capture volume that tracks where the camera is and how it is moving. While this technology is producing stunning visuals for Disney+’s The Mandalorian, ILM is making its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies, and showrunners worldwide.
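
The perspective-correct update described above is, at its core, an off-axis projection re-solved every frame from the tracked camera position. The sketch below is a minimal illustration of that idea using Kooima’s well-known generalized perspective projection for a single flat panel; it is not ILM’s code, and the panel corners, camera position, and clip distances are invented values.

```python
# Illustrative sketch only, not ILM's StageCraft code. Given a tracked camera
# position, solve an off-axis ("generalized") perspective projection for a
# flat LED panel so the imagery on it shows correct parallax. Panel corners,
# camera position, and clip distances are hypothetical values.
import numpy as np

def offaxis_projection(pa, pb, pc, eye, near, far):
    """Kooima-style generalized perspective projection.

    pa, pb, pc -- lower-left, lower-right, upper-left corners of the panel
    eye        -- tracked camera position from the motion-capture volume
    """
    # Orthonormal basis of the screen plane
    vr = pb - pa; vr /= np.linalg.norm(vr)           # screen right
    vu = pc - pa; vu /= np.linalg.norm(vu)           # screen up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # screen normal

    # Vectors from the eye to the panel corners
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                # distance from the eye to the panel

    # Frustum extents projected onto the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard off-axis frustum matrix (OpenGL-style clip space)
    P = np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),            0],
        [0,            2*near/(t-b), (t+b)/(t-b),            0],
        [0,            0,            -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,            -1,                     0],
    ])
    # Rotate the world into screen space, then move the eye to the origin
    M = np.eye(4)
    M[:3, 0], M[:3, 1], M[:3, 2] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ M.T @ T

# Re-solved every frame as the tracked camera moves, updating the parallax.
# Dimensions loosely echo the stage: ~11.4 m radius (75'), ~6.1 m tall (20').
pa, pb, pc = (np.array([-11.4, 0.0, -11.4]), np.array([11.4, 0.0, -11.4]),
              np.array([-11.4, 6.1, -11.4]))
print(offaxis_projection(pa, pb, pc, eye=np.array([0.0, 1.8, 0.0]),
                         near=0.1, far=1000.0))
```

The real wall is curved and built from many flat panels, so a projection like this would be solved per panel, per frame, from the motion-capture data.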

Over 50 percent of The Mandalorian season one was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, the actors performed inside an immersive, massive LED volume: a 20-foot-high, 270-degree semicircular video wall and ceiling enclosing a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens. Digital 3D environments created by ILM played back interactively on the LED walls and could be edited in real time during the shoot, allowing pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution on systems powered by NVIDIA GPUs. The environments were lit and rendered from the perspective of the camera to provide parallax in real time, with accurate interactive light from the LED screens falling on the actors and practical sets inside the stage.
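
In workflows like this, only the region of the wall the camera can actually see needs the expensive perspective-correct render; the rest of the screens chiefly provide the interactive lighting mentioned above. Below is a hypothetical sketch of the underlying visibility test, a simple clip-space bounds check; the camera matrix and test points are invented for illustration, not production values.

```python
# Hypothetical sketch, not production code: a wall point appears in the
# captured frame only if it lands inside the camera's clip volume.
import numpy as np

def in_camera_frustum(point_world, view_proj):
    """True if a world-space point on the LED wall lands inside the
    camera's clip volume, i.e. it would appear in the captured frame."""
    p = view_proj @ np.append(point_world, 1.0)  # world -> clip space
    if p[3] <= 0.0:                              # behind the camera
        return False
    ndc = p[:3] / p[3]                           # perspective divide
    return bool(np.all(np.abs(ndc) <= 1.0))

# Toy camera at the origin looking down -Z with a 90-degree symmetric
# frustum (near-plane half-width equals the near distance).
near, far = 0.1, 100.0
proj = np.array([
    [1.0, 0.0,  0.0,                    0.0],
    [0.0, 1.0,  0.0,                    0.0],
    [0.0, 0.0, -(far+near)/(far-near), -2*far*near/(far-near)],
    [0.0, 0.0, -1.0,                    0.0],
])
print(in_camera_frustum(np.array([0.0, 0.0, -10.0]), proj))   # True: on axis
print(in_camera_frustum(np.array([50.0, 0.0, -10.0]), proj))  # False: far off axis
```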

It’s a technically heavy, in-depth piece but well worth taking the time to read and absorb.