Virtual Backgrounds Return Control to the Set

In 2013, Alfonso Cuarón’s Gravity tested the limits of filmmaking technology. To take audiences to space, Cuarón and director of photography Emmanuel Lubezki filmed Sandra Bullock inside a box made of LED panels. Cuarón and Lubezki each took home Academy Awards, among seven total Oscars for the film.

Jon Favreau advanced similar techniques on The Jungle Book and The Lion King. By then, the LED-screen backdrops used to cast interactive light on Gravity had reached the point where they could serve as entire on-camera environments. These virtual production techniques depended on LED panels with a 2.8 mm pixel pitch, down from the 9 mm pitch used on Rogue One: A Star Wars Story, where the backgrounds were later replaced with higher-resolution imagery.
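The effect of a finer pixel pitch is straightforward arithmetic: the number of LED pixels spanning a wall is its width divided by the pitch. A rough sketch (the 20 m wall width here is an assumed figure for illustration, not a production measurement):

```python
def pixels_across(wall_width_m: float, pitch_mm: float) -> int:
    """Number of LED pixels spanning a wall of the given width."""
    return int(wall_width_m * 1000 / pitch_mm)

# A hypothetical 20 m-wide LED wall:
print(pixels_across(20, 9.0))   # 9 mm pitch (Rogue One era)  -> 2222
print(pixels_across(20, 2.8))   # 2.8 mm pitch (Mandalorian era) -> 7142
```

Tripling the pixel density is what pushed the backgrounds from interactive-light sources toward imagery sharp enough to hold up in camera.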

On the production of The Mandalorian, the first live-action streaming series in the Star Wars universe, Favreau teamed with director of photography Greig Fraser (Lion, Snow White and the Huntsman, Zero Dark Thirty, Foxcatcher, Mary Magdalene and Rogue One). Co-shot by Barry “Baz” Idoine, the Disney+ series – sometimes termed a Space Western – depicts a lone bounty hunter who operates far from any authority.

To take virtual production to the next level, ILM worked with Epic Games to adapt its Unreal Engine to enable real-time display at resolutions high enough to make replacement unnecessary. Fittingly, some of ILM’s original assets were brought out of the library and reused, creating a visual link to the earlier Star Wars imagery. Fraser says that one key to working with cutting-edge tools is making sure they don’t become an end in themselves.

“If you base your decision on the technology side of things, that’s the tail wagging the dog,” says Fraser. “The technology is purely there to serve us as filmmakers. These tools often have to go through a process of adaptation. I want to be able to move the camera. I want to choose where the camera goes on the day – even in the moment that we’re shooting. Perhaps an actor does something different, and I do a little tracking to save the shot – and that becomes the magic part of the scene. So, we can’t run it like a robot. It’s not just committing a storyboard to film. It’s an organic process – that’s the exciting part.”

“It’s been brewing for a number of years,” says Fraser. “We did all the testing on Rogue One, and it was very much a conversation – could we do this with a real environment, and not just with ships in space? The answer was ‘not quite yet.’ We had moiré and other issues. Now it’s five years later. It was like a meeting of the minds. Jon was willing to risk writing the show based on the premise that we could shoot almost anything on the LED volume. It was a big step, and everyone put their reputations on the line. I can tell you it was one of the most beautifully stressful shows that I’ve ever worked on, because we were walking into the unknown.”

Fraser says that these decisions affected every aspect of the shoot, beginning on day 1 of prep. “Early on, it occurred to me that we were making 40 or 50 decisions every day that were brand new and groundbreaking,” he says. “Of course, you still have the standard general decisions that have worked their way through a century of filmmaking – What direction are we shooting? How do we stage this? But I was learning so much about the LED screen process, the manufacture, indoor versus outdoor, output, bit rates, dimming. And the LEDs were merely one aspect. There are hundreds of factors that we were navigating daily. As adults, it’s rare that we are learning on such an intense scale. It’s a fantastic feeling. But this was extreme. Every day when I went home, my head literally hurt! We were essentially inventing a new process of shooting.”

Fraser chose to shoot on ARRI ALEXA LF cameras, recording ARRIRAW to Codex media. He chose the Panavision Ultra Vista lenses, which use a 1.65x squeeze to produce a 2.35:1 aspect ratio. The resolution was roughly a wash compared to the ALEXA 65, which would have required cropping the bigger sensor left and right. Digital imaging technician Eduardo Eguia made sure that the workflow wasn’t a distraction. (See accompanying article.)
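The delivered aspect ratio follows from the sensor geometry and the anamorphic squeeze: desqueezed width over height. Using the LF open-gate resolution listed in the credits below, a quick sketch of the arithmetic:

```python
def desqueezed_ratio(width_px: int, height_px: int, squeeze: float) -> float:
    """Aspect ratio after horizontally desqueezing an anamorphic capture."""
    return width_px * squeeze / height_px

# ALEXA LF open gate (4448 x 3096) with the Ultra Vista's 1.65x squeeze:
print(round(desqueezed_ratio(4448, 3096, 1.65), 2))  # -> 2.37
```

The raw math lands near 2.37:1, which in practice is delivered in the conventional 2.35:1 scope family cited above.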

Working with the LED volume brings control back into the hands of the cinematographer, according to Fraser. Since the dawn of the digital revolution, it seems, each step forward diluted the control directors of photography exercised over the image. Here, Fraser was able to work very closely with production designer Andrew Jones on every aspect of the backgrounds, not least the angle, intensity and quality of the light.

“The worst thing about being a cinematographer on a blue screen set is that you have literally no control over what goes on that blue screen,” he says. “You have to trust the visual effects supervisor and the director, and in most cases, of course, you do. But maybe somebody doesn’t understand framing, and they put a light post right behind the main actor’s head. You may not have shot something in a certain way if you had known what the background was going to be. But the LED volume restores visual power back to the cinematographer and the director, on their own set. To me, that’s the most powerful part of this.”

Fraser has been busy since The Mandalorian wrapped, shooting Dune, a feature directed by Denis Villeneuve, also “Captured on Codex” with the ARRI ALEXA LF, and he was working on The Batman with director Matt Reeves when production was halted due to the COVID-19 crisis. The film is still slated for release in June 2021.

Director(s): Dave Filoni, Rick Famuyiwa, Deborah Chow, Bryce Dallas Howard and Taika Waititi
DP(s): Greig Fraser, ASC, ACS (Season 1) and Baz Idoine (Season 1 & 2) with Matthew Jensen, ASC (Season 2)
DIT: Eduardo Eguia
Camera Rental: Panavision
VFX supervisor: Richard Bluff
Camera: ARRI ALEXA LF with ARRI ALEXA Mini LF added in Season 2
Lenses: Panavision Ultra Vista
Resolution: 4448x3096
