“If You’re in the Right Location at the Right Time, You Shouldn’t Need Much Light”: DP Greig Fraser on The Mandalorian
Laboring in a greenscreen expanse for months on end never seemed like a particularly pleasant way of working. Not for the crew, confined to a windowless stage with walls roughly the same hue as green Tropical Skittles. Not for the actors, performing in a world they can’t see. And not for the cinematographer, surrendering control of the background that will ultimately replace the verdant swath of green.
StageCraft, a new technology that employs a vast array of LED video screens, provides an appealing alternative for capturing virtual environments. Created in partnership with Epic Games and Industrial Light & Magic, StageCraft offers photo-real backgrounds that blend seamlessly with partial sets, practical props and actors, while also supplying interactive lighting. Essentially, it’s a tool that allows for in-camera compositing without the need for post-production background replacement. StageCraft’s maiden voyage came on the first season of the Disney+ series The Mandalorian. The set—a 20-foot-tall, 270-degree LED video wall composed of 1,326 individual screens—was dubbed “The Volume.”
With season two set to debut Oct. 30, Mandalorian co-cinematographer Greig Fraser—fresh off an Emmy win for the show—revisited the inaugural season with Filmmaker.
Filmmaker: The Mandalorian was one of the first original shows created for Disney+. Streaming services all have their own quirks in terms of what technical specs they mandate, whether it be HDR, 4K or certain aspect ratios. What were Disney’s parameters?
Fraser: Their mandates weren’t as strong as perhaps what we’ve heard about from the Netflixes of the world, though some of those mandates [are justified]. Netflix is pushing filmmakers to shoot in a resolution that is not going to be seen as obsolete in ten years. I believe the Star Wars prequels were done in 1080p. People at that time thought 1080p was kind of the be-all and end-all and that it was replacing film, but we know now that’s not the resolution that’s going to stick. I’m not necessarily a strong advocate of shooting in the highest possible resolution, otherwise I would’ve shot everything on an 8K Sony or an 8K Red—and I haven’t, so it’s not all about resolution. But I do want to shoot on the highest resolving format that I can that gives me the look that I’m after.
Filmmaker: I can understand the push for 4K – especially from a streaming service like Netflix that can charge a premium for that resolution – but the thing that drives me crazy is streaming’s distaste for 2.40 widescreen ratios. Was there any resistance to shooting that ratio on The Mandalorian?
Fraser: [Executive producer] Jon Favreau and I looked at 16:9 since Disney+ is a streaming service and we wanted the viewers to get the most out of their TV sets, but then we went, “You know what? Star Wars isn’t 16:9. Star Wars is 2.40.” Star Wars has a visual language and we decided to stick with that visual language. Nobody pushed back, from what I understand.
Filmmaker: The show’s look harkens back to classic westerns and samurai films. Were those things you would’ve come across growing up in Australia?
Fraser: Westerns were a thing for sure growing up. They probably weren’t as big a part of the culture as they were here in the U.S., but I definitely saw Westerns as a kid and I’ve also seen the modern westerns like The Assassination of Jesse James [By the Coward Robert Ford].
We watched a number of westerns and Japanese samurai films in prep. We watched Sergio Leone movies like Once Upon a Time in the West. We watched Seven Samurai and a number of other Kurosawa movies. Westerns influenced a lot of filmmakers and they influenced George Lucas. You can tell by the way that he shot A New Hope, it’s very simple coverage and cinematography. There weren’t massive camera moves—they were very basic, very simple. That was definitely something that Jon and I spoke about at length, making it simple and not moving the camera beyond what it needed to do.
Filmmaker: Before we get into working with StageCraft, I want to talk about the previs process. You used a program that allows you to throw on a VR headset and basically enter a virtual version of your set with a virtual camera.
Fraser: It’s a very interesting process. It’s similar to doing it in the real world, except you’re sitting in a warehouse with a VR headset on and you don’t have interaction with your actors because [they have been recorded beforehand via motion capture], so they’re baked in. So you can’t say, “Move that actor a little bit to the left.” But if the action is, “Actor walks up to a door, knocks, someone opens the door and they go inside,” then you can put the [pre-recorded footage of the actors] doing that action on a loop. Then you can choose different camera angles and look at it over and over again [in this virtual space] without having to say to the actors, “Alright, go back to one and do it again please.”
Virtual shooting—actually the virtual world, for that matter—is a very powerful technology. I built a house a few years ago and wish we had that technology, because I could’ve literally walked around the house in VR and said, “I don’t think that bench is high enough,” or “I think that ceiling needs to be a bit higher.” It’s a really powerful tool for filmmakers, for architects, for anybody that wants to spend time in an environment before committing money to build it.
Filmmaker: So, for example, if you have a gunfight scene, the actors would record the action in motion capture beforehand and be put in this VR environment. You can then enter that environment and say, “Okay, show me what a 50mm lens would look like if I stand here.” Then, with the virtual camera, you can move around or crouch down and see what part of the set you’ll see from that vantage point.
Filmmaker: Is it more for set construction, then, as opposed to creating a shot list?
Fraser: It’s for all of that. It’s like you said: you crouch down, look up and say, “Ah man, we’re going to need to build a roof piece. That wasn’t in the original design, but we love this angle.” Or you say, “We can take away this part of the set because we’re never going to see it.” That’s been going on for years, but in a very lo-fi form. People used to do it with lipstick cameras and small models. It’s a really interesting process. We did a couple of episodes with that previs and it really informs decisions.
Filmmaker: In terms of using video screens, tell me about the leap forward made by The Mandalorian. Is the biggest advancement in StageCraft that it allows the screens to adjust perspective, so essentially the camera can move through the virtual set and the background shifts accordingly?
Fraser: Every step along the way with these LED screens has been a jump forward. If you go back, Chivo used them as interactive lighting on Gravity. Then Claudio Miranda used them for rear projection on Oblivion, which was another step forward because it was an entire environment that lit everything. On Rogue One we wrapped our spaceships in LEDs, but the pictures on the screens weren’t final. They were replaced [in post] because they weren’t high-res enough. First Man used screens with better pixel pitch—now they could actually use the screens on camera, so what you’re seeing in the camera [is what you see in the finished film]. That’s when the technology became so good that you could point the camera at the LED screens and never know that it was a screen. But for First Man they didn’t have interactive perspective yet.
On Mandalorian, the thing that actually blew my mind when we first started working was that perspective shift. I was doing a test for the show six months before we shot, the ice world at the beginning of episode one. I’m walking behind Mandalorian on this wooden deck toward the LED screen. My right eye, which is looking through the viewfinder, is seeing the back of Mando, and the village he’s walking towards is a long way away. Then I opened my left eye and my left eye was seeing the wall [of the set] getting closer while my right eye was still seeing the correct perspective in camera [of the village far in the distance]. It was the first time I’d ever experienced it and I started to feel dizzy. I came very close to fainting because my two eyes were seeing two different things and my brain had to make sense of it. It shows you the power and the magic of fooling the viewer into believing that they’re in that environment, but actually it’s all shot in a space that’s only 30 feet wide.
Filmmaker: The benefits of “The Volume” over greenscreen make sense for a show like The Mandalorian: you’re not getting green spill and unwanted reflections from a lead character whose costume is basically a giant metal mirror, the actors have more to play off of than a giant green void, and you’re getting back more control of the frame by having the set extensions done in real time rather than just replacing the backgrounds in post. But what are the limitations of the technology at this point? What do you want to see in Version 2.0?
Fraser: There are a number of things it inherently cannot do and probably never will be able to do. Well, never say never, I guess, but things I don’t think it will be able to do for a long time. Like sunny environments—bright, middle-of-the-day desert sun is very hard to do. Getting the level of the sun from the LEDs is impossible. You could put a 20K on the actors and scientifically get the ratios correct between the backlight and the fill, but when you come out for a wide shot that person won’t be standing in a properly sunlit stage. And if you lit every part of that stage with the same amount of light, you’re going to need multiple sources.
But I’m looking forward to other DPs who are reading this and saying, “I know exactly how I can solve that.” I want to see them solve it so that the next time I use it, I can use those new techniques too. It’s important to keep evolving, to take what we’ve learned from The Mandalorian and keep going and keep improving: getting better quality light from LEDs, making a Volume of a different shape, using LED panels to replace set walls to create different concepts and different ideas.
Filmmaker: How do you approach lighting inside The Volume? I watched the “Making of” show on Disney+ and didn’t see many traditional film lights for scenes done in The Volume. I see that space and I think, “How do you control the light from those screens?” Isn’t it just spilling and bouncing all over the place?
Fraser: Yep. (laughs). But in terms of lighting, what happens when you go to the desert and you want to shoot your actor at sunset? What lights do you use there? Do you bring in an 18K? Probably not, right? You probably aren’t using [film lights] for that situation. If you choose the right time of day to shoot something in the right location, my opinion is that you often need very little lighting. You maybe need an eye light or a bounce card, something to give the actors a bit more fill or something, but if you’re in the right location at the right time, you shouldn’t need much light. Now, that’s my opinion based on the style of this show. It was a natural looking show. If you’re doing a heightened show, then your lighting style is going to be different. But if you create your background in The Volume correctly—the right time of day with the right intensity of sun—you shouldn’t need much light.
Filmmaker: I can see that for an exterior, because you’re basically replicating one giant source in the sun, but when you get into a dimly lit interior like a bar and you’re using The Volume, what do you do if you want to put a backlight hitting just the cheek of the actor? Would you just bring up a small section of LEDs behind the actor or would the audience be able to notice that it’s a different intensity than the surrounding panels?
Fraser: If you’re seeing the panel in that situation, chances are that panel isn’t lighting the subject. Let’s say you’re looking at someone front 3/4 and they’re having a conversation at a table at a bar, like you said. The panel that you’re seeing behind them—unless you’re on a really wide-angle lens and you’re seeing 180 degrees—never influences the light on that character unless it’s a slight, soft back edge light. It’s the panels that are off-screen that are actually doing the lighting effect on your character.
If you’re doing a dimly lit bar scene, you’re 100 percent right—a lot of the light that is going to hit your character is going to come from practical lights in the shot, or maybe you would bring in a small back edge light to augment them. For example, in the bar scene in Chapter 4, which Baz Idoine DP’d, there are lots of flashes of sun throughout the bar. For that, Baz cut holes in the ceiling and brought lights in from above. So that was an exception where you can create slashes of light in The Volume. It’s not an overall rule of “you don’t use lights in The Volume,” but if you’re doing an exterior scene at the right time of day and you wouldn’t use lights ordinarily, then you’re not going to use lights [to shoot a similar scene] in The Volume.
Filmmaker: I was reading the American Cinematographer story about the show and it talks about how The Volume isn’t yet capable of displaying photo-real images over the entirety of all its screens. So, you have to use software that basically takes the lens you’re on and makes sure that the portion of the LED screens that is photo-real falls within that lens’ field of view. What you see within that field of view would be filmable, but the rest of the panels would only be providing interactive light. Does that prevent you from shooting multiple cameras? Because if you were cross-shooting a dialogue scene, you wouldn’t be able to have photo-real content on the screens behind both actors.
Fraser: That’s exactly right. That was a limitation for season one. There are multiple things to think of in that situation. Let’s say that the screen can react to the shifting perspective of one camera—well then, the perspective is going to be off on the second camera, isn’t it? There were times when, if we were on a long enough lens on the B-camera, you wouldn’t really have any concept of what the perspective was doing on the B-camera. So, you could get away with it, which happened often on Mandalorian.
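[The tracking problem Fraser describes comes down to simple lens geometry: the longer the lens, the narrower the slice of the LED wall that must carry the full photo-real render. A rough illustration of that relationship—not ILM’s actual pipeline; the function name and the full-frame sensor width are assumptions for the example:]

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal angle of view for a given lens on a given sensor width.

    Defaults to a 36mm-wide full-frame sensor (an assumption for
    illustration, not the show's actual camera package).
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A longer lens sees a narrower wedge of the 270-degree wall, so less
# of it needs the photo-real render—and a long-lens B-camera is less
# likely to reveal that the perspective is tracking the A-camera.
for focal in (24, 50, 100):
    print(f"{focal}mm lens: {horizontal_fov_deg(focal):.1f} degrees of wall")
```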
Filmmaker: For something like the end of episode three, where the Mandalorian has to fight his way through a town full of bounty hunters, would you have to go out to a traditional set? Because you’re probably shooting four or five cameras to get that scene done on a TV schedule and using that many cameras would create too many problems with shifting perspectives if done in The Volume?
Fraser: That’s exactly what we did on that one. That was a street scene and we shot it outside for the exterior day portion, then moved that set inside the studio to do the end of the scene, which takes place at dusk so we needed there to be a level of consistent light.
We went through different ideas about how to shoot that on The Volume, but we couldn’t really go that way. There’s multiple points of view, there’s people falling, stunts, explosions, gunfire, squibs—there’s every reason under the sun not to shoot it on The Volume. The Volume is very powerful, but it’s not good for everything. I’m a firm believer that this technology is going to become a staple in filmmaking, sooner rather than later. A third of Mandalorian was shot on it, if not more. I don’t think necessarily everybody is going to shoot [that much of a production using StageCraft], but as people become more comfortable with this technology, it’s going to be more and more widely used.
Matt Mulcahey works as a DIT in the Midwest. He also writes about film on his blog Deep Fried Movies.