
Reaching the Peak: Exploring Virtual Production on an Independent Budget

On the virtual set of HIGH

Given our tech-driven and communication-obsessed culture, it’s highly likely that you’re reading this article while multitasking on your smartphone. But as with so many commodity industries, data among them, the true cost of all this connectivity often eludes us. This disconnect is what drove us to write the film HIGH, set in the fascinating and rarely seen world of telecom tower climbers. In the aftermath of a tragic accident, team foreman Butch Robbins leads his crew through the brutal Buffalo winter to finish their job on deadline and save the company, all without losing the connection he needs most—to his family back home.

In the summer of 2014, Tisha Robinson-Daly was working in that industry as a project manager when a friend and colleague reported news of a harrowing incident: In the haze of that dense Kentucky summer, Joel Metz (28) and his team worked diligently on the scalding steel to replace an antenna atop a 240-foot tower. Suddenly, a cable snapped. Joel, father of four, was killed instantly, and his body hung in its harness for more than five hours while emergency workers struggled to bring him back down. 

For Tisha, a telecom worker, storyteller and citizen, this was a transformative moment: Joel’s tragic story and the grief his partner and four young sons experienced were deeply affecting. Determined to shed more light on this hidden world, she began researching and soon uncovered a grim reality: climbers regularly faced brutal deaths (OSHA called it the most dangerous job in America), yet their working conditions often remained underreported and shrouded in silence.

Over the next few years, Tisha shared stories through social media and documentary work to advocate for climbers via her nonprofit, “HIGH the Movement.” In 2017, she workshopped an early draft of HIGH’s script at the Sundance Screenwriters Intensive, where she first met fellow Philadelphia-based filmmaker, now long-time collaborator and this article’s co-author, Jonathan Mason. From then on, the two of us shared one goal: to write and direct, together, a gripping, cinematic story about the price of communication and the preservation of the family unit in America today.

On this journey so far, we’ve found invaluable development support from partners like EPs Hannah Weyer and Tony Yang, David Rocchio and the team at Stowe Story Labs, Sundance/Knight Foundation and Tribeca. But when it finally came time to inch the film toward production, one question remained on the lips of every potential collaborator and financier: How on earth are you planning to shoot atop 300-foot towers with an independent film budget?

EXPLORING VIRTUAL PRODUCTION ON AN INDEPENDENT BUDGET

We first heard the term “virtual production” as it related to the many VFX-heavy, world-building shows like The Mandalorian that hit streamers a few years ago. The trade publications and production blogs all raved about the show’s emerging tech in the same incomprehensible language used to describe the perilous sandy Sarlacc Pits of Carkoon, and all of it was above our heads.

Then, in 2021, Tribeca launched a new initiative with Epic Games called “Writing in Unreal,” a month-long virtual production lab described as “pushing the boundaries of storytelling in film.” The idea was to teach screenwriters the Unreal Engine toolset and further examine the ways it could make independent filmmakers rethink their ideation process. We were lucky to make the inaugural cut and it was, simply put, a paradigm shift. 

We’d already played with Matt Workman’s incredible UE-based Cine Tracer app on Steam, a “game” that allows you to test camera and lighting setups in real time. In similar fashion, we worked with the lab’s artist, Phil Donahue, to create a 3D, playable environment of a nighttime tower scene in which we could place, animate and light our sets and characters (“mannequins”). We had the ability to test just about any focal length, depth of field and filmback. A quick visit to the Unreal marketplace (where designers sell and trade assets) yielded a perfect model of a guyed tower for our scene. We found some rural landscapes and trees for free and used the built-in tools to modify the sky and cloud cover. We built a few miles of rolling rural Pennsylvania hills for $40 plus tax.
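For the technically curious, here is a rough sketch of what that kind of previz setup looks like when scripted through Unreal’s optional Python editor plugin rather than clicked together by hand. The asset path, placement numbers and lens values are ours for illustration only, and property and function names can shift between engine versions.

```python
import unreal

# Hypothetical marketplace asset path -- substitute your own tower model.
tower_asset = unreal.EditorAssetLibrary.load_asset("/Game/Marketplace/SM_GuyedTower")
tower = unreal.EditorLevelLibrary.spawn_actor_from_object(tower_asset, unreal.Vector(0, 0, 0))

# Place a cine camera ~300 ft (9,144 cm) back from the tower, at ~240 ft (7,315 cm) up.
cam_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(-9144, 0, 7315), unreal.Rotator(0, 0, 0))
cam = cam_actor.get_cine_camera_component()

# The same knobs we tested by hand: focal length, aperture (depth of field) and filmback.
cam.set_editor_property("current_focal_length", 75.0)  # mm
cam.set_editor_property("current_aperture", 2.8)        # T-stop
filmback = cam.get_editor_property("filmback")
filmback.sensor_width = 36.0   # mm, full-frame-ish test filmback
filmback.sensor_height = 24.0  # mm
cam.set_editor_property("filmback", filmback)
```

Everything above can, of course, be done directly in the viewport; scripting it just makes the lens tests repeatable.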

With our basic world designed and our character blocking in place, we used Unreal Engine’s Sequencer panel (essentially its built-in NLE) to render clips and edit them, resulting in full cut scenes that could be output as movie files. But, unlike with a traditional NLE, we could still go back into a particular shot and change the lens, camera placement or blocking without affecting our edit, then re-render the output. This was a great and inexpensive way to test and share visual ideas. For indie filmmaking, previz is the single greatest asset of virtual production: it costs practically nothing and allows you to visualize and plan for just about anything.
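A minimal sketch of the Sequencer side of that idea, again via the Python plugin and with paths and names of our own invention: the sequence holds a reference (a “possessable”) to the camera rather than baked pixels, which is why you can change the lens afterward and simply re-render the same cut.

```python
import unreal

# Create a Level Sequence asset to hold the edit (asset name and path are illustrative).
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset("Seq_TowerNight", "/Game/Previz",
                                    unreal.LevelSequence, unreal.LevelSequenceFactoryNew())

# Bind the cine camera placed earlier; the sequence references the actor, it doesn't copy it.
level_actors = unreal.EditorLevelLibrary.get_all_level_actors()
cam_actor = next(a for a in level_actors if isinstance(a, unreal.CineCameraActor))
binding = sequence.add_possessable(cam_actor)
print("Bound to edit:", binding.get_display_name())

# Change the lens later and the cut itself is untouched -- you just re-render the output
# (from the Sequencer or Movie Render Queue UI, or with further scripting).
cam_actor.get_cine_camera_component().set_editor_property("current_focal_length", 50.0)
```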

Unreal Engine itself is free. You simply download it from Epic’s website and hope your computer can handle it. Epic recommends a PC with some decent muscle (a quad-core Intel or AMD processor, 2.5 GHz or faster, with at least 8 GB of RAM), though at the previz stage we ran it on an M1 MacBook Pro with acceptable results. As of Unreal Engine’s 5.2 update, Apple Silicon is natively supported.

THE PLAN (IRL)

Because we’d never directed together, we toyed with the relatively traditional idea of shooting a live action short as a proof-of-concept. But that came with its own inherent risks, not to mention a protracted timeline. On the flip side, we felt that the fully Unreal-generated sequence we’d made at the Tribeca lab was a bit underbaked and didn’t quite allow us to demonstrate working with actors. 

So, the idea of shooting a hybrid proof-of-concept in front of a “volume” (an umbrella term that now largely refers to the LED wall itself) seemed to make sense. We figured we could raise about the same amount of money as for a traditional short but get more bang for our buck, visually. In this initial phase, we also explored the idea of shooting drone-captured plate shots, or 2.5D imagery (layered mattes), to mimic parallax. But in crunching those numbers, it didn’t seem like we’d be saving much money by avoiding a volume shoot, given the very specific location demands of our film. Our mission would be twofold: demonstrate the advantages of this specific technology, showing potential investors we could shoot in any condition cheaply, safely and quickly, and showcase our directing in a scene with real actors and its own mini-arc.

A volume, unlike traditional greenscreen, would allow us to show parallax because the backgrounds track the camera and shift in 3D, mimicking the human eye’s perception of depth. We also liked the idea that the wall would be emitting most of the light on set, freeing up a lot of floor space and wrapping quite naturally around our actors—another thing greenscreen can’t do. And unlike a location shoot, if we wanted to, we could have golden hour all day long instead of that one panicked hour everyone is always chasing. 

We identified a three-page sequence from the script, wherein our protagonist ascends a tower in a blizzard, his partner 20 feet behind him under a veil of blinding ice and snow—all very visual and dramatic. Together, our characters attempt to dislodge a large nest made of razor-sharp branches obstructing an antenna. We’d shoot the same scene several times for day and night, and in different weather conditions (rain, sun, snow). 

THE BUDGET 

In theory, a volume was a great solution, but we’d need a team of experienced technicians and artists, or what is known as a “VAD” (virtual art department), as early as pre-production. These humans don’t come cheap, and there is, as of this writing, a scarcity of them kicking around (though more and more private and academic programs are rapidly training experts to bridge that gap). That meant a budget would have to be drawn up accurately, and early, so we could understand which steps would need immediate cash flow.

For this initial vision of our proof, our lead producer at the time (the tireless Gilana Lobel) came back with a very safe rough budget of around $80,000 for our one-week shoot (including COVID prevention costs, safety/stunts and standard contingency). Before you spit out your kombucha, there are a few things worth discussing about a VP budget. 

Preparation: Early on, it became clear that virtual production would not necessarily be a massive bottom-line cost saver. It turns out that’s a common misconception, especially if you’re talking about an LED wall, which is still quite pricey to rent and manage. By contrast, traditional VFX/CGI are time-consuming and expensive. With VP, what you’re getting is better use of time and more on-set control.

Geoff George, who would become our DP, agrees. “In indie productions, we’re often given a much shorter amount of time at any given location or stage because of the budget,” he says. “If we can preview our scene before ever stepping on set, we can really get through shots and setups more efficiently than if we were finding them on the day or just walking onto set with a basic shot list. Virtual production is not all about the volume.”

VP producer Ben Baker concurs: “The LED wall is just the high-end end-point of these workflows. What VP really does is bring a set of new tools and collaboration practices that until recently required you to set up a bespoke 3D render pipeline in something like Maya or equivalent, which requires specialized and less accessible infrastructure to indie filmmakers.”

People in the VP space love to repeat the truism, “Fix it in pre!” A lot of the costs you would typically see in post-production are moved to the pre-production stage, where you’re previzing and tech-vizing and creating all of the assets that will later be shot in camera on the wall (ICVFX). This process also saves money on tech scouts and travel because you can easily scan locations and bring them into Unreal. Department heads can then review sets remotely, place cameras, lights and dressing, and take accurate measurements as needed. In our case, Kourosh Pirnazar used photogrammetry to scan an actual rural tower location in Pennsylvania that we’d found on Google Earth.

Control and Time: We also drew up a practical budget for comparison, taking inspiration from the tower film Fall (2022, dir. Scott Mann), which shot its set pieces on the edge of a real cliff. But for our film, which has more complicated blocking, that approach added a whole host of limitations. Ultimately, if we could pull it off on budget, using a volume seemed to offer us the most creative control and safety, right on set.

Though we’d need cash flow faster than on a traditional shoot, it also meant we could raise the money in chunks, and if for some reason we couldn’t raise enough for principal photography, we’d at least have all of the virtual assets done and saved on our drives.

To give you a general sense, back before we had real vendor numbers, our initial rough top-sheet looked something like this for three days of load-in/testing, two days of principal photography and one of wrap (note that this does not include a stage rental):

At this point, fall 2022, we had about $0 (adjusted for inflation), but this preliminary budget gave us an idea of what we’d have to raise. Quotes for stages at the time ranged from $30,000 to $50,000 per day for rental, so obviously it was out of the question to pay MSRP. It’s often the nature of the indie beast to negotiate discounts and favors, banking on the goodwill of vendors to invest in projects they believe in. 

Ironically, many of the heavy-hitting virtual production artists we pitched to were drawn to the film precisely because it is a photorealistic indie drama. It was a fun challenge for them to problem-solve at this budget level and to create outside the usual sci-fi/fantasy realm. Through Casey Baltes (VP for games and immersive at Tribeca Enterprises), we met Ben Baker, who, with James Blevins, co-founded Mesh. Ben and his partner are consultants and line producers working exclusively in the virtual space. Ben read our script, was intrigued by the premise and saw its value as a use case for lower-budget filmmaking. He assembled a team pretty quickly, composed of virtual production supervisor Nhan Le and virtual art department lead Kourosh Pirnazar. Calling this pocket of the industry cult-like sounds too pejorative and mean-spirited, but boy are all of these wizards excited about their universe and genuinely eager to bring as many people into the fold as possible. The time and effort they afforded us was incredible.

We also applied for a direct grant from Epic Games (they have a rolling program called the MegaGrant, which provides support to any industry using Unreal Engine). While we waited, we raised some seed funding (around $5,000) from Rowan University, where Jonathan teaches film, which allowed us to lock in our VAD for prep.

Ben Baker and our fearless producer, James Yi, began talking to potential stage partners: Virtual Production House Toronto (whose team was helpful and generous with their expertise), as well as Carstage in Long Island City (co-founded by indie VFX veteran Josep White). Both of these stages are at the forefront of virtual production but also incredibly filmmaker- and story-driven. In the end, we were tipped into a decision by our lack of funds and a bit of good luck. Through Ben’s contacts at media server company disguise and panel manufacturer ROE Visuals, we were generously offered the use of the disguise VP Accelerator Volume in Los Angeles for an entire week. Not only that, but Addy Ghani (who has the tongue-twisting title of “VP of VP” at disguise) got down in the mud himself and was instrumental in helping us pull off this proof-of-concept.

THE PREP

Once we had our stage locked in, we doubled our speed. It’s quite important to have real numbers in virtual production, and the fallacy of endless possibilities is a pretty evil siren call. Knowing that our wall was 13 feet tall and 30 feet wide allowed us to map our set in Unreal so that any camera angles we prevized stayed “on the volume.” It also allowed us to determine how tall or wide our set pieces could be when taking into account the height of our actors. 
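To give a concrete sense of what “staying on the volume” means, the check is just field-of-view geometry against the wall’s footprint. The helper below is our own illustration (it isn’t part of any toolset), and the lens, filmback and distance are example numbers:

```python
import math

WALL_W_FT, WALL_H_FT = 30.0, 13.0  # our volume's usable dimensions

def frame_coverage_ft(focal_mm, sensor_w_mm, sensor_h_mm, camera_to_wall_ft):
    """Width/height of wall (in feet) filled by the frame at a given camera-to-wall distance."""
    # 2*d*tan(FOV/2) with FOV = 2*atan(sensor/(2*focal)) simplifies to d*sensor/focal.
    width = camera_to_wall_ft * sensor_w_mm / focal_mm
    height = camera_to_wall_ft * sensor_h_mm / focal_mm
    return width, height

# Example: a 50mm lens on a full-frame (36x24mm) filmback, 12 ft from the wall.
w, h = frame_coverage_ft(50.0, 36.0, 24.0, 12.0)
print(f"Frame sees ~{w:.1f} ft x {h:.1f} ft of wall")   # ~8.6 x ~5.8 ft, safely inside 30 x 13
print("On the volume?", w <= WALL_W_FT and h <= WALL_H_FT)
```

Tilting the camera up eats the wall’s height faster than this flat-on math suggests, which is exactly the kind of limit previz is good at exposing before the day.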

By now, we’d added Geoff George to the team, a Chaldean-American warrior of a DP from Detroit. Geoff jumped right into the previz and infused some new ideas into the blocking. But as more concrete numbers and realities came into focus, it also became very clear that we’d bitten off more than we could chew. The test scene we’d pulled from the feature involved too many moving parts, and the stage we were gifted would not allow any practical weather elements. We’d raised about $25,000 in private equity through our executive producers and a fiscal sponsorship from Stowe Story Labs. But we were also turned down for the MegaGrant, which, in a sense, turned out to be the best thing that could have happened to us. 

With more details now in place, our budget had dropped from $80,000 to around $50,000 for our one-week shoot. Having less than half of that in hand with the machine already running forced us to refocus. So, instead of picking the budget bone-dry to stay the course, we wrote an entirely new script built around shots we knew we could realize, based on what we’d learned so far about virtual production and the actual space we were shooting in. Like any proof or test of new tech, there was a lot of learning along the way.

We turned our scene into much more of a teaser, a collection of atemporal climbing shots that builds its own mini dramatic arc. We simplified, then simplified some more. No more rain, no more snow. No more stunts. We even removed production sound and dialogue, opting instead for a voiceover: the protagonist’s wife leaving him a voicemail (we created a sound mix later using SFX libraries and homemade foley). This guaranteed that no matter what we pulled off on the volume, the VO could be rewritten to adapt to and narratively shape what we captured. Of course, we were disappointed to lose some of our more dramatic shots, but we figured stunts and practical weather weren’t what we were trying to test and prove at this stage.

Again, one of the advantages of this virtual production workflow is that when we re-wrote, we didn’t have to throw away any of our previous assets; we just had to tweak them. We could turn “off” the background snow on the tree canopies, change the season to anything we wanted, then simply reposition our mannequins up and down the tower as the shot list required.

The breakdown of our $50,000 or so budget looked like this: Above the line came in around $10,000, below the line around $35,000, the rest being contingency, COVID and legal. Most of the cost savings came from removing those practical FX and locking in deals on gear and crew. We’d spent $5,000 getting started with our VAD, but we already had something concrete to show for it. So, we reapplied to the Epic grant with a bit more of a fleshed-out plan. We didn’t wait to hear back and kept our foot on the pedal. Either we’d crash into a wall, or we’d pull it off. 

With members in Los Angeles, Detroit, Toronto, New York and Philly, our team met on Zoom regularly and worked through dozens of iterations to get our final shot list locked in. From a director’s perspective, the learning curve was not particularly steep for those with even a basic understanding of Unreal Engine’s filmmaking toolset. As Kourosh told us, “Unreal Engine VP tools were created for filmmakers, indie or not, and once filmmakers have the chance to learn the capabilities and how they can iterate on their vision, it becomes second nature.”

A few months out, we were joined by production designer Rebekah Bukhbinder. Her VP experience (The Mandalorian, The Book of Boba Fett) meant she was able to jump into VAD meetings and be the link between the physical and virtual worlds. But she also understood our budget and was incredibly resourceful in designing set pieces that were beautiful, modular and budget-friendly. 

We decided on a triangular top platform, about four feet off the ground, and dressed each side with the exact same set of props (cables, antennas, junction boxes, etc.). That way, instead of having to move the set, which was difficult on a small stage, we could flip our actors’ placements and rotate the graphics on the wall for coverage. 

With guidance from key grip Amy Snell and a generous assist from the folks at MBS, we settled on using mod truss, which is sturdy enough to support the weight of two humans and is, well, modular. It also doesn’t look like concert truss, which we felt might be too recognizable. The rest was a combination of standard speed rail and set dressing we sourced from specialty prop shops in Los Angeles that deal with aeronautical junk. The overall dimensions of the set pieces had already been decided upon and tested in previz—another benefit of VP.

To cast, we reached out to local climbing gyms, such as Top Out Climbing in Santa Clarita. We found some incredible talent but ultimately cast our two actors, Sharmaarke Purcell and Laura Bellomo, via Backstage. Though neither had climbing experience and Sharmaarke even expressed a slight fear of heights, we felt an instant connection. The beauty of our tech-viz process was that we could actually show our actors a rendered video of our Unreal Engine MetaHuman mannequins in action. The tower platform would be low off the ground, but they’d be able to see the horizon in the distance and react to their environment in real time. They later told us that it gave them the headspace to prepare and gain familiarity with the location.

Much of our gear, including a practically free Sony VENICE, came through the incredibly supportive team at BECiNE. For lensing, Geoff George suggested Cooke’s Anamorphic/i FF smart lenses because “the bokeh and lens aberrations [of the anamorphics] add a patina that blends the foreground and background better than with sharper, spherical lenses.” To supplement this idea, we’d also use a handheld glass prism to add more of that somewhat disorienting feeling of being at vertiginous heights. For the aspect ratio, we settled on 2.39:1 to privilege the width of the frame and protect us from the relatively short height of the wall (13 ft). This wasn’t a sacrifice. We’d always discussed the idea of shooting wide so that the home scenes, drawing on the geometry of the claustrophobic spaces, could play out in contrast with the vastness of the tower landscapes.
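As a rough illustration of why the wide ratio protects you, here is our own back-of-the-envelope arithmetic, treating the frame as if it spanned the wall’s full width (a simplification of actual coverage):

```python
WALL_W_FT, WALL_H_FT = 30.0, 13.0  # wall width and height in feet

for aspect in (2.39, 1.85):
    frame_h = WALL_W_FT / aspect
    print(f"{aspect}:1 across the full {WALL_W_FT:.0f} ft wall needs "
          f"{frame_h:.1f} ft of height (wall is {WALL_H_FT:.0f} ft tall)")
# 2.39:1 -> ~12.6 ft, just inside the panels; 1.85:1 -> ~16.2 ft, off the top of the wall.
```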

The ball was rolling, and about two weeks before shooting, we received the news that our HIGH proof would be supported with a $30,000 MegaGrant from Unreal Engine. Disbursement, however, would likely take several months. Our producer scrambled to secure a gap loan to tide us over, and, in the end, EP Hannah Weyer stepped up and provided it. 

By the time we’d put out that fire, we learned that the Sony VENICE wasn’t going to work out. It required a team to come in and calibrate the camera and lenses to the LED wall, and because the package was being gifted, our dates and times couldn’t be given priority. We ended up using disguise’s already calibrated in-house workhorse, the RED V-Raptor 8K VV, which we paired with our Cooke Anamorphic/i FF lenses (1.8x squeeze) in 50mm, 75mm and 100mm.

THE SHOOT

The true beauty of virtual production is that with all of the prep work you’ve done, there are far fewer question marks on set. By the time we loaded in, we knew exactly where to place the camera, at what height and distance from the set piece, where to stage the tower itself and more or less what additional lighting would be needed. Because the wall provides the environment and most of the lighting, we pretty much relied on a couple of SkyPanels and some tubes, as well as one practical FAA beacon to motivate some closer, moodier lighting. The wall casts very broad, soft light, so Geoff and our gaffer Jeremy Graham mostly worked on shaping, bouncing and cutting. We also overexposed by ⅔ of a stop to protect the shadows and brought everything back down in the grade. We knew ahead of time that planning any hard daylight would work against us in terms of time and firepower, so we leaned into soft-light scenarios. That meant our lighting setups and resets were extremely quick. That’s not to say that hard light is impossible to achieve on a volume; it just wasn’t a good idea on a two-day, low-budget shoot where we were trying to get 24 shots on two different sets.

Stage work is always more controlled, but we had a first-day confidence and energy that felt new. On set was the “brain bar,” which added Carlos Perez from disguise to our VAD team. And, because we were shooting on disguise’s VP accelerator stage, we had access to their workflow expertise, which was key to calibrating the lenses, the screen, practical lighting levels and the color profile we’d pre-determined (color on set was run by Dane Brehm, a legendary DIT and technologist). 

As we hit last looks for our first shot, Tisha received a message through her “HIGH the Movement” Facebook page from veteran climber Mike Flenz. “All of us who have climbed remember the first time we stood at the base of a 1,000-foot tower. […] As we stood inside looking up, the top plate seemed a mile away. […] We took that first step onto the bottom rung, with a thousand steps ahead of us. But we eventually got there, one step at a time.” Tisha read the message to James, who suggested we share it with the rest of the crew. It was a sweet moment, and a reminder of the story behind all of this noise on set. When she finished reading Mike’s words, there were a few gruff tears in the crew’s eyes: “You’ve just taken your first step on that ladder, and I can’t wait to see you standing on top of that plate.”

Picture was up, and our actors were harnessed in gear on loan from technical advisor Bill Butler, a retired climber in Arkansas and our biggest ally in the climbing community. We moved quickly through our shot list. The environments we had seen and tested in prep looked as expected, and we tried to be disciplined about not tweaking them too much on set. We made small changes here and there: moving a background tower, changing the speed of blinking city lights or adding more clouds or stars to a specific part of the frame in a night scene. All of these changes could happen in relative real time. Several of our setups were shot as a series, capturing the busy-work of climbing (hands, boots, harnesses), and we were able to switch from day to night to dusk from one setup to the next simply by toggling a few settings (maybe the “brain bar” would disagree). On our first day, we pulled off the construction of the tower top and our 12 scheduled shots under two lighting conditions: five for night, seven lit for golden hour. We made our day with a few minutes to spare.

Day two meant a set breakdown and the assembly of our tower leg. This was a simpler build than day one because the set was smaller and featured far less set dressing. We had 12 shots scheduled, this time in three different lighting scenarios (overcast day, dusk and night). A few hours in, we ran into a computer glitch (an issue with a texture not pushing through, which appeared on the wall as transparent) that shut us down for nearly two hours. We’d been warned that pushing any data-hungry program causes crashes, so we had budgeted some time for this just in case. It was our only real crash, and we used the downtime to rehearse with our actors. The mood on set was quite serene. We had also planned our schedule to put all “bonus” shots at the end of the day, and by the time the system was back up, we hit our stride again quickly.

Our actors were really struck by how helpful it was to see the world around them. You could see it in their eyes. Our camera team benefited as well, especially in terms of operating. We could see the background parallax shifts, and it made for a much more organic and present choreography. 

We made our second day, cutting one shot but sneaking in another before taillights, one that involved some improvised practical set pieces (a couple of tree trunks held up off-screen by our producer). We shot that against blue screen with tracking, and because we already had our virtual world built in Unreal Engine, it was a cinch to slot in later.

Two days. 57 total takes. 24 shots. Two set builds. One VFX shot.

The two of us flew home the very next morning with a portable drive containing our proxies. Dane Brehm had already done a color pass on set and, save one shot, we didn’t have to wait for any VFX work because everything was done in camera. By the time we landed in South Philly, we already had an assembly. 

We were very happy with the footage and can confidently say we achieved what we set out to do: demonstrate safe and repeatable ways of shooting the tower scenes on a reasonable budget while retaining as much creative control as possible.

Virtual production, especially volume shooting, is not for everything. But it can be for everyone. In our specific case, we’re now confident it’s the right tool to visually translate “the high,” as our climbers call it with perhaps a touch of tongue-in-cheek ambivalence. But VP is not a magic wand. After our experience, we wouldn’t claim it will necessarily save you money. It will, however, reallocate your use of time and make it more focused and productive.

We now view VP as a new set of tools: some will apply, some won’t. Being judicious about its use is ultimately what will decide whether it’s appropriate for your film. Otherwise, VP and LED volumes just become the Auto-Tune of the VFX world, overused and underconsidered. And even T-Pain knows that to write a good song, you still have to be able to sing a cappella.

TAKEAWAYS

A few general rules we think might be useful to filmmakers considering this workflow:

1) Think about why your project specifically benefits from a volume wall as opposed to camera tracking, plate shots or greenscreen, and make that your rallying cry when you appeal to potential partners.

2) Invest in the very best VAD you can possibly afford as early as possible.

3) If you don’t know your stage dimensions from the jump, source real ones as a placeholder. Those constraints will keep you in check. 

4) If you’re going to shoot deep background elements and have very little UE-generated content close to the camera or actors, you might want to consider traditional mattes/plates on your wall. Building out 3D elements that appear to be miles away will choke your system and look flat anyway because you won’t benefit from parallax. 

5) Avoid hard light scenarios if you can. Play into the benefits of the LED wall or know its limitations and use those creatively. 

6) Most affordable panels aren’t high-res enough to shoot from close up or with deep depth of field. You’ll want to plan for background shots that are always slightly soft, or shoot from further away if you can. 

7) Don’t make too many environment changes on set. That’s what will slow you down. The temptation will be there, but remember instead: 

8) Fix it in pre

9) Fix it in pre

10) Fix it in pre
