
Speaking Volumes: The Production Benefits and Challenges of Shooting on Virtual Stages

For Metropolis special effects artist Eugen Schüfftan, a model, a mirror and a sharp-edged tool were all the instruments required to create cinematic wonder in the 1920s. The mirror—placed at a 45-degree angle in front of the camera—reflected the image of a model cityscape located just out of frame. Schüfftan then scraped away sections of the mirror’s reflective layer with the tool, leaving only glass and revealing strategically placed actors in the distance. When the mirror was filmed, the citizens of Metropolis magically appeared to inhabit the colossal urban dystopia.

A century later, virtual production is the latest evolution in cinematic wonder. While the tools may be different—with towering walls of LED panels and game engine technology replacing models and mirrors—the end goal remains the same: to create dazzling in-camera composites on set.

A myriad of earlier techniques strove for that same objective, from glass matte paintings to forced-perspective trickery to rear projection. Those techniques long ago fell out of fashion in favor of chroma keying, in which green or blue screens are used on set alongside practical elements and then replaced in post by either photographic plates or computer-generated imagery.

Virtual production seeks to return more of that compositing process to the principal photography phase, using LED-adorned stages known as volumes with screens capable of displaying photoreal content to create real-time, in-camera finals. The advantages of virtual production extend across the call sheet. Actors can perform inside dynamic environments rather than vast green voids. Editors can cut scenes without waiting months for VFX plates to trickle in from a multitude of vendors. Cinematographers can claw back some of the control over composition and lighting they’ve ceded to post while basking in the volume’s interactive illumination.

Need to shoot scenes on a subway car and a distant alien world? With virtual production, you can spend the morning in the bowels of New York City’s transit system and be on Neptune by lunch. “In a matter of 10 seconds, you can be in another world,” says visual effects veteran and CEO/founder of Stargate Studios Sam Nicholson. “If you can afford to actually go to Paris for a one-page scene, it’s not going to get any better than that. But there are tremendous advantages to being able to change sets, locations and time of day at the push of a button.”

Want to shoot at magic hour without Mother Nature’s mercilessly ticking clock? The volume can provide perpetual dawn and dusk. “The number one enemy of the cinematographer is time. That’s all we’re fighting, all day long,” says Oscar-winning DP Erik Messerschmidt, who used virtual production on Mindhunter, Mank and the recent Korean War fighter pilot epic Devotion. “Every time I’m on a set, I wish I could pause or fast-forward the sun or tell it where to go.”

The formative period of virtual production traces back to 2013, when several projects used components now vital to the process. The first season of Netflix’s House of Cards employed “poor man’s process” for its car work, shooting on stage with green screens surrounding the vehicle. However, rigged overhead, just out of frame, were LED screens playing background plates to provide dynamic interactive lighting on the actors and realistic reflections on the car’s surfaces.

In the post-apocalyptic sci-fi film Oblivion, cinematographer Claudio Miranda wrapped the set for protagonist Tom Cruise’s high-altitude, glass-enclosed outpost in a 500-foot-long piece of muslin. He then front-projected plates that were shot atop a Hawaiian volcano onto the material, allowing for real-time composites and providing the majority of the set’s lighting.

On Gravity, DP Emmanuel Lubezki created a 20-foot cube of LED panels that played previs animation. The screens were not of sufficient quality to serve as final backgrounds and were replaced in post, but the LEDs offered interactive lighting for stranded astronaut Sandra Bullock.

Virtual production’s seismic shift came with the 2019 release of The Mandalorian, the Disney+ series featuring an intergalactic bounty hunter and his tiny charge, affectionately dubbed Baby Yoda. Half of the initial season was filmed on a 20-foot-high, 270-degree LED video wall at Manhattan Beach Studios—the space where the term “the volume” was first used. The show debuted Industrial Light & Magic’s StageCraft virtual production platform, which used Epic Games’ Unreal Engine to “drive” the LED wall content. StageCraft offered crucial new capabilities. The platform dynamically adjusted the LED screen displays as the camera moved, providing proper perspective shifts. StageCraft also offered real-time rendered, photorealistic CG backgrounds that allowed the volume’s virtual environments to serve as “final pixel” in-camera composites.
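The geometry behind those perspective shifts is simple enough to sketch. The Python toy below is illustrative only, not ILM’s StageCraft code; the wall dimensions, pixel pitch and checkerboard backdrop are all invented for the example. Each wall pixel displays whatever the virtual scene looks like along the ray from the tracked camera through that pixel, so the image is correct for that one camera position and shifts with true parallax as the camera moves:

```python
import numpy as np

# Toy illustration of the parallax trick behind camera-tracked LED walls.
# Assumed setup (invented for the example): the wall is the plane z = 0,
# the tracked camera sits at z < 0 facing it, and the virtual scene is a
# checkerboard backdrop on a plane 50 m "behind" the wall.
WALL_W, WALL_H = 10.0, 5.0   # wall size in metres
PITCH = 0.0057               # 5.7 mm pixel pitch
BACKDROP_Z = 50.0            # virtual backdrop distance behind the wall

def backdrop_color(x, y):
    """Toy virtual scene: a 1 m checkerboard on the plane z = BACKDROP_Z."""
    return (np.floor(x) + np.floor(y)) % 2

def render_wall(cam_x, cam_y, cam_z):
    """For every wall pixel, sample the scene along the ray from the tracked
    camera through that pixel, so the wall shows correct perspective for
    that one camera position (and only that one)."""
    nx, ny = int(WALL_W / PITCH), int(WALL_H / PITCH)
    px, py = np.meshgrid(np.linspace(0, WALL_W, nx),
                         np.linspace(0, WALL_H, ny))
    # Extend the camera->pixel ray past the wall to the backdrop plane.
    t = BACKDROP_Z / -cam_z
    return backdrop_color(px + (px - cam_x) * t, py + (py - cam_y) * t)

# Re-rendering per frame as the camera moves shifts the image by true parallax.
frame_a = render_wall(5.0, 1.8, -6.0)
frame_b = render_wall(6.0, 1.8, -6.0)  # camera dollies 1 m to the right
print("fraction of wall pixels that changed:",
      round(float((frame_a != frame_b).mean()), 3))
```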

The LED panels first used in volumes were repurposed from large-scale displays used for live events and billboards. “They used to sell you Coca-Cola on the side of the freeway,” says cinematographer David Klein, who joined the Star Wars streaming universe as the second unit DP on season two of The Mandalorian. He’s since served as main unit DP on episodes of The Book of Boba Fett, Skeleton Crew and season three of The Mandalorian. It’s a circuitous arc for Klein, who began his career with Kevin Smith’s lo-fi Clerks and now finds himself on the vanguard of production techniques. “This technology is so new that it changes minute to minute,” says Klein. “Every time I think I’m going to ask [the “brain bar” that drives the volume] a question that is stupid, the answer is, usually, ‘Nobody has ever asked us that. We don’t know if it can do that or not.’ And if it can’t, generally by the next day it can. We’re constantly changing the way that things are done.”

That perpetual advancement applies to the capabilities of the video walls as well, which have begun to incorporate more production-specific functionality. “The LED manufacturers are now understanding how we’re using their products, and they’re amenable to making adjustments,” says Messerschmidt. “There are all kinds of really technical problems that have been solved, but at this point the screens are still ostensibly repurposed signage.”

When selecting the best panels for a particular project, Nicholson says a multitude of factors must be considered. “You start with asking how big your stage is going to be and how far away you’re going to work from the screen,” says Nicholson. “Then, you have to ask if you’re going to build a big set. What’s your field of view? Are you going to use wide lenses or long lenses? What’s your depth of field? If you’re going to shoot deep, you need higher resolution. What is your camera movement going to be? Is the scene a fight sequence or is it a walk and talk? You need a very high-frequency screen if you’re going to do something like a sword fight.”

A key component of the panels is their pixel pitch—the distance between the centers of neighboring pixels. The smaller the pixel pitch, the higher the panel’s resolution. Klein says the Star Wars shows typically use panels with both 2.8mm and 5.7mm pixel pitches. “The higher the resolution, the more you can get away with,” explains Klein. “Your focus can get closer to the screens before it breaks, and you have less issues with moiré.”
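Those figures are easy to put in perspective with some back-of-the-envelope arithmetic. In the Python sketch below, the 2.8mm and 5.7mm pitches are Klein’s numbers, while the wall size, lens and camera distance are invented for illustration, and the thin-lens magnification it uses is a simplification:

```python
# Back-of-the-envelope pixel-pitch arithmetic. The 2.8 mm and 5.7 mm pitches
# are the figures Klein cites; the wall size, lens and distance are invented.
def wall_resolution(width_m, height_m, pitch_mm):
    """Pixel count across a wall of the given size at the given pitch."""
    return round(width_m * 1000 / pitch_mm), round(height_m * 1000 / pitch_mm)

def wall_pixel_on_sensor_um(pitch_mm, focal_mm, wall_dist_m):
    """Rough image size of one wall pixel on the sensor, using thin-lens
    magnification ~ f/d. When this shrinks toward the camera's photosite
    spacing, the two pixel grids can beat against each other -- one
    simplified way to think about moire risk, not a spec."""
    return pitch_mm * 1000 * focal_mm / (wall_dist_m * 1000)

for pitch in (2.8, 5.7):
    w, h = wall_resolution(16.0, 6.0, pitch)        # hypothetical 16 m x 6 m wall
    spot = wall_pixel_on_sensor_um(pitch, 40, 5.0)  # 40 mm lens, 5 m from the wall
    print(f"{pitch} mm pitch: {w} x {h} px wall; "
          f"one wall pixel ~{spot:.0f} microns on sensor")
```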

For a series like The Mandalorian, creating the virtual content to be displayed on the panels requires an elongated pre-production period. “It’s a five- or six-month process once the virtual art department starts to design it,” says Klein. “David Lowery, our lead storyboard artist, does storyboards for the entire episode, then the previs team will create a full animatic of the entire episode—full virtual environments, motion-captured actors, etc. It’s pretty wild. The director and I will then set up shots virtually within this animatic and also start pre-lighting, which is mostly done in VR on color-accurate monitors.”

Once Klein’s pre-light is complete, Mandalorian gaffer Jeff Webster—a veteran of nearly 40 episodes of the various Star Wars series—begins his work. “I generally come in a couple weeks sooner than I would for a regular prep so I can start to go into the VR space and check out the loads. I have to make sure the light sources David has put in are physically possible in the actual volume space,” says Webster. “Then, I’ll go in and see where they’ve placed the cameras and what lenses they’ve chosen. I need to know if, for example, we’re going to see the whole ceiling for a particular shot and if I might have to talk to VFX and see if they can paint out [some of our gear]. It’s a constant dialogue between all these different departments.”

In addition to three StageCraft volumes in greater Los Angeles, ILM has permanent facilities in Vancouver and at Pinewood Studios outside London. The company also creates custom pop-ups, such as the one at Fox Studios Australia used for Thor: Love and Thunder. The sprawling 700-acre Trilith Studios in Atlanta—home to many a Marvel production—also got into the virtual production game last year with its Prysm Stage. The fully enclosable volume made its maiden voyage with Francis Ford Coppola’s Megalopolis. Not to be outdone, Amazon opened Stage 15 in December, a 34,000-square-foot facility in L.A. that includes a virtual location-scouting volume and a motion-capture volume. The stage’s video wall stands 26 feet tall, just a foot shorter than Pixomondo’s Vancouver volume.

In addition to these cavernous facilities, hundreds of stages and vendors of all shapes and sizes have entered the virtual production fray, making the experienced technicians who drive the setups the scarcest resource at the moment. “There’s no doubt that the operators are the most magical component of the whole thing,” says Nicholson, who’s begun hosting monthly training programs in Los Angeles to bolster their ranks. “Anybody can go out and buy a couple thousand panels and claim to be in virtual production, and it’s very easy for executives to write a check for the technology, but finding the right people to run it is the trick.”

Nicholson’s career began auspiciously in 1979 while he was still a grad student. His first movie job found him creating the kinetic lighting for the Enterprise’s multistory warp core on Star Trek: The Motion Picture. He founded Stargate Studios in 1989 with an emphasis on visual effects, but the company now offers virtual production capabilities as well. That includes working with a 165-foot-long, 30-foot-tall LED wall on the Warner Bros. lot for the pirate comedy Our Flag Means Death. “Every show is different, and we design a volume and create content to fit the show rather than the other way around,” Nicholson says. “We’re not asking productions to go offsite to our volume [so] we can make magic happen. We’re bringing the magic to them on their stage.”

On Station 19, an ABC drama set at a Seattle firehouse, that magic frequently includes process shots. But Nicholson also employs virtual production for scenes atop the fire station’s balcony, where he uses a hybrid approach—green screen for the wide shots, volume loads for the mediums and close-ups. “There are still some things that are better on green screen. Like many other things, a mix is probably the best,” says Nicholson. “Big wide shots, crazy camera angles and fight scenes that require high-speed motion blur are all better on green screen. Dialogue coverage, which is probably at least 50 percent if not more of most visual effects budgets, is great for LED screens.”

The technology does come with limitations. The current incarnations of the screens still work best when shot slightly out of focus, which isn’t always ideal for the storytelling. “The shallower the depth of field and the longer the lens, the less problems you have,” says Messerschmidt. “But that also gives the audience less contextual information. When it’s appropriate for the shot, I want to push the wall and the technology to get it as in-focus as possible.”

Nicholson shares that aspiration. “Our goal is photorealism. That’s the line of demarcation,” he says. “I don’t want to put everything completely out of focus with shallow depth just to say, ‘Oh look, we got a composite.’ I want it as good as you would see on green screen, or really close to it.”

Believable focus fall-off within the virtual loads is also an art that volume walls have yet to master. “In the past, a lot of times the volume would break when virtual content had one thing that’s supposed to be 200 meters away and another one that’s supposed to be a kilometer away and they’re both defocused the same amount,” says Klein.
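The root cause is optical: everything displayed on the wall physically sits at the wall’s distance from the lens, so the camera gives it all the same blur, and any depth-dependent defocus has to be synthesized in the content itself. A simplified thin-lens sketch, with all numbers invented for illustration, makes the mismatch concrete:

```python
# Thin-lens circle of confusion: blur-circle diameter for an object at
# subject_m when the lens is focused at focus_m. Illustrative numbers only.
def coc_mm(subject_m, focus_m, focal_mm=50.0, f_stop=2.0):
    f = focal_mm / 1000.0
    return (abs(subject_m - focus_m) / subject_m
            * f * f / (f_stop * (focus_m - f)) * 1000)

focus, wall = 3.0, 6.0   # actor in focus at 3 m; LED wall 6 m from camera

for d in (wall, 200.0, 1000.0):
    print(f"a real object at {d:>6.0f} m should blur to "
          f"{coc_mm(d, focus):.2f} mm")

# On the wall, content "at" 200 m and content "at" 1 km both physically sit
# at 6 m, so the lens gives both the 6 m figure above -- identical defocus --
# unless the render engine adds matching per-object synthetic blur.
```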

Creating believable day exterior sunlight is also problematic. “You still can’t do open sunlight in the volume. I don’t know if you’ll ever be able to, because you only have so much space,” says Klein. “100 feet by 80 feet gets very small very fast. You can’t get lights back far enough to fill the volume. Also, with that much manufactured sunlight, you would wash out the screens and render them useless. So, anytime we’re doing direct sunlight, we’re on the backlot.”

Webster points out another issue with daylight exteriors: As actors move across the volume, the “sunlight” gets brighter as they approach the walls. “There’s a sweet spot in the middle of the volume, and if you move outside of that you’re just going to keep getting brighter and brighter,” says Webster. “I don’t know if an audience would [consciously] pick up on that, but I think they would feel that something isn’t right.”
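A crude model suggests the scale of the problem. If the patch of wall lighting the actor is treated as a compact source, the level climbs with the inverse square of the distance; a real wall is a large area source and falls off more gently, but the trend is the same:

```python
# Crude falloff model for the volume's "sweet spot": treat the patch of wall
# lighting the actor as a compact source, so level ~ 1/d^2. A real wall is a
# large area source and falls off more gently, but the trend holds.
SWEET_SPOT = 6.0  # metres from the wall, normalized to 1x (invented figure)
for d in (6.0, 4.0, 2.0, 1.0):
    print(f"{d:>3} m from the wall -> ~{(SWEET_SPOT / d) ** 2:.1f}x brighter")
```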

Though he didn’t have to deal with wide day exterior work in the volume on Devotion, Messerschmidt did have to come up with a novel approach for creating sunlight for the film’s virtual cockpit scenes. Messerschmidt and longtime gaffer Danny Gonzalez devised a plan to take a bare Arri T12 bulb, place it within a shadow box and affix it to a Technocrane. Enter electrician Evan Houth, whom Messerschmidt describes as “the sort of guy that travels with a lathe.”

“He’s just this incredible tinkerer,” says Messerschmidt. “He already had sheet metal tools in his hotel room, so he went out and bought a bunch of tin or aluminum sheeting, whatever the material was, and made our shadow box. They put a socket in it, painted it matte black and flew it around the plane for our sunlight.”

Like high-key situations, low light levels can create their own issues in the volume. “Everybody wants to light with a flashlight now. I love low-key lighting, too, but the walls don’t like to burn low,” says Nicholson. “They will literally start strobing very fast, and depending on the frequency of your wall, you’ll start seeing a lot of artifacts.”
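The strobing stems from how LED panels dim: not by lowering a continuous output, but by pulsing on and off faster than the eye can see. The toy model below covers only the synchronization half of the problem, and its 1,920 Hz refresh rate is an illustrative figure, as real panels use more elaborate scan and modulation schemes:

```python
# Toy flicker model: LED panels dim by pulse-width modulation, so a shutter
# that spans a non-integer number of pulse periods catches a varying number
# of pulses from frame to frame. Real panels' refresh/PWM schemes are more
# complex; 1920 Hz is an illustrative figure.
def worst_flicker(shutter_s, pwm_hz=1920):
    periods = shutter_s * pwm_hz
    frac = periods % 1.0
    return min(frac, 1.0 - frac) / periods  # worst-case exposure variation

for denom in (48, 50, 60, 100):
    print(f"1/{denom} s shutter -> "
          f"~{worst_flicker(1.0 / denom) * 100:.2f}% frame-to-frame ripple")
```

The clean results at 1/48 and 1/60 of a second show why shutters synced to the refresh rate stay stable, while odd shutter speeds pick up ripple; at very low output levels the panel has shorter pulses to average over, which compounds the artifacts Nicholson describes.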

Regardless of the luminance level, gaffer Webster says the LED panels not seen on camera play a key role as well. Those “light cards” can be pumped up for extra punch or blacked out for negative fill. “We’re doing some sort of card on every shot. I have my own iPad control where I can spawn a shape and manipulate where I want it to go. So, for example, if I want a little rim light on an actor, I’ll just do a skinny card right over the top of them, right off camera,” explains Webster. “It’s a dream for a gaffer because it’s like you’ve rigged the entire stage with 270 degrees of controllable light.”
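Conceptually, a light card is nothing more than a region of wall pixels overridden on top of the environment render. The sketch below is hypothetical, with apply_card an invented helper rather than any vendor’s actual tooling:

```python
import numpy as np

# Hypothetical sketch of a "light card": a rectangle of wall pixels
# overridden on top of the environment render -- bright for a rim light,
# black for negative fill. (Illustrative only; not ILM's actual tooling.)
def apply_card(wall_rgb, x0, y0, w, h, level):
    """Overwrite a w x h pixel region of the wall image with a flat level.
    level=1.0 -> full-intensity white card; level=0.0 -> negative fill."""
    out = wall_rgb.copy()
    out[y0:y0 + h, x0:x0 + w] = level
    return out

environment = np.random.rand(1080, 3840, 3).astype(np.float32)  # stand-in render
# Skinny card above an actor, just out of frame, for a little rim light:
lit = apply_card(environment, x0=1800, y0=40, w=400, h=120, level=1.0)
# Black card on the opposite side for negative fill:
lit = apply_card(lit, x0=300, y0=400, w=250, h=600, level=0.0)
```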

If not wielded carefully, that level of control runs the risk of creating images so perfect they seem artificial. Messerschmidt was aware of that danger on Devotion, where he mixed cockpit scenes shot practically in period aircraft with those lensed on a virtual stage at Trilith. “When you shoot aerial footage for real, it’s messy. You can’t always control the sun position. You’re not always going to be perfectly backlit. The background isn’t always going to be exactly where you want it,” says Messerschmidt. “Once we got in the volume, we were able to really manicure everything. So, we had to say, ‘Let’s mess it up a little bit,’ because the very nature of real location shooting is that it is imperfect. That’s what we were trying to emulate with the virtual production work.”

Messerschmidt applied a similar philosophy to his volume camera placement. “We didn’t want to do the Chitty Chitty Bang Bang thing where the camera is flying all around,” he says. “We only wanted to put cameras in places where you could really put them if you were going to actually fly the plane.”

Like any production decision, financial as well as artistic considerations must be accounted for when weighing whether virtual production is the appropriate tool. “It’s generally a question of whether it’s going to be more cost effective to build it [as a physical set] or to create it virtually,” says Klein. “It’s not a cheap process to build virtual environments and get them on the screens. I call volume work ‘rich man’s process.’”

While a Mandalorian-sized volume and computer-generated fantastical world to populate the LED walls may be out of the price range of less deep-pocketed productions, Klein says smaller versions of the technology are increasingly scalable for indie budgets. “I think all new tools will eventually be appropriated to different budget ranges in this business,” he says. “I don’t think a full volume with all the content creation that comes along with it is going to work for many low-budget independent films, but smaller, portable walls and pop-up partial volumes could absolutely be valid for any size project.”

Just because virtual production might make fiscal or logistical sense for a production doesn’t mean the technology is ideally suited for every circumstance. “My experience with virtual production has often been that people think it’s a turnkey solution to every problem, then get into problems that the volume doesn’t solve and have to manipulate their storytelling techniques to shoehorn them into a virtual production space,” says Messerschmidt. “That’s a really awkward way to work. I don’t like being put in a situation where the technology is dictating the storytelling. I think the storytelling should dictate the technology.”
