
NAB 2023: New Gear That’s Making Virtual Production More Affordable and Accessible

VIVE Mars Camera Tracker on display. With a toy parrot.

At NAB I had a mission: find out which new products are making virtual production more affordable, more accessible, and usable without a master's degree in Unreal.

Fortunately, I found a few options. Plus some new tools that are blending generative AI with virtual production.

Let’s dig into what’s new in VP from NAB.

Budget-Friendly Virtual Production

First is the VIVE Mars CamTrack—a complete package for creating an affordable virtual volume.

HTC’s VIVE has been in the virtual reality space for a while, with some of the top VR headsets. They also make the VIVE Tracker, a palm-sized puck that can track an object or limb in virtual space. Filmmakers realized you could attach a tracker to a camera and boom—instant volume to track a camera in a virtual scene.

Well, it's actually not so boom.

I personally went down this rabbit hole, spending hours on YouTube and in Facebook groups trying to get Unreal to recognize the VIVE Tracker and set up my makeshift virtual space. You essentially have to jailbreak SteamVR, because the tracker normally only works with a headset connected, and you don't need a headset for virtual production.
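For the curious, the workaround that circulates in those groups boils down to two SteamVR config edits: forcing the null (headless) driver and telling SteamVR it doesn't need an HMD. The exact paths and keys below come from community guides and can shift between SteamVR versions, so treat them as a starting point rather than gospel.

In Steam/config/steamvr.vrsettings, under the "steamvr" section:

    "steamvr": {
        "requireHmd": false,
        "forcedDriver": "null",
        "activateMultipleDrivers": true
    }

And in SteamVR/drivers/null/resources/settings/default.vrsettings:

    "driver_null": {
        "enable": true
    }

With that in place, SteamVR starts with a dummy headset and happily streams tracker poses to whatever is listening.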

Fortunately, HTC realized the VP potential of their trackers and built a new product specifically for this use: the VIVE Mars CamTrack.

It uses the same trackers and base stations, plus some new hardware. For $5,000, the Mars CamTrack comes with two trackers and two base stations, which can cover an area of about 10 x 10 meters.

Each tracker connects to a rover (the tracker-and-rover combo mounts to your camera, or whatever you want to track). The set comes with three rovers (that's one extra rover, just in case you want to buy another tracker). Ethernet cables run from the rovers all the way to the brains of the operation, the Mars box. The box runs all the tracking operations and sends the positioning data over to the computer that's running Unreal or Aximmetry.
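For context on what that positioning data actually is: as I understand the setup, the Mars box streams the broadcast-standard FreeD tracking protocol over the network, which is what Unreal's Live Link FreeD plugin (or Aximmetry's tracking input) listens for. Here's a minimal Python sketch that decodes a FreeD "D1" camera packet; the port number is an assumption (use whatever the Mars box is configured to send on), and the byte layout follows the published FreeD spec.

    import socket

    def to_signed24(b):
        """Interpret 3 big-endian bytes as a signed 24-bit integer."""
        val = (b[0] << 16) | (b[1] << 8) | b[2]
        return val - (1 << 24) if val & 0x800000 else val

    PORT = 6301  # assumption: match this to the port configured on the Mars box

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))

    while True:
        data, _ = sock.recvfrom(64)
        # A FreeD "D1" packet is 29 bytes: type, camera ID, pan/tilt/roll,
        # X/Y/Z position, zoom, focus, spare bytes, checksum.
        if len(data) < 29 or data[0] != 0xD1:
            continue
        pan  = to_signed24(data[2:5])  / 32768.0   # degrees (angle * 32768 per the spec)
        tilt = to_signed24(data[5:8])  / 32768.0
        roll = to_signed24(data[8:11]) / 32768.0
        x = to_signed24(data[11:14]) / 64.0        # millimeters (position * 64 per the spec)
        y = to_signed24(data[14:17]) / 64.0
        z = to_signed24(data[17:20]) / 64.0
        print(f"pan={pan:.2f} tilt={tilt:.2f} roll={roll:.2f} "
              f"pos=({x:.0f}, {y:.0f}, {z:.0f}) mm")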

The advantage of this setup is that all the tracking and processing runs separately from the computer that's generating the virtual set. The Mars box's large touchscreen also makes setting up the tracking space a breeze. And the whole kit is portable enough to fit in a Pelican case.

Previewed at NAB but not yet released is FIZTrack, a lens encoder that can track focus, iris, or zoom and send that data to the Mars box alongside the position tracking.

So we’ve got an affordable volume—what about loading a virtual scene without being an Unreal expert?

Enter ARwall and their upcoming ARFX StudioBox, a $4,200 virtual studio in a box. It's a portable computer that can send a 4K feed to an LED wall, TV, or projector for real-time compositing. Plus you'll be able to use your phone as a tracker: just mount it on your camera.

But the computer isn't the most interesting part. Their ARFX app is designed to make virtual production extremely easy. Load a scene from their library, navigate with a mouse or Xbox controller, adjust the lighting and weather, and you're good to go.

As ARwall’s CEO Rene Amador describes it, “It’s like you’re running a video game…but the game you’re playing is making a movie.”

Now we’ve covered some budget options, but if you’re looking to level up your skillset for larger virtual productions, Mo-Sys has some options and new updates.

First up is their Mo-Sys Academy, a three- or five-day course in LA or London that offers hands-on training in virtual production.

They also released a new, more portable version of their popular StarTracker camera tracker. StarTracker works from a star map: a series of randomly placed tracker dots on the floor or ceiling that gets mapped out once to define the tracking space, which the StarTracker sensor, attached to the camera, then reads. The advantage of this configuration is that once a star map is created, you don't have to recalibrate every time you boot up.
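Under the hood, this kind of marker-map tracking is a classic perspective-n-point problem: once the mapping pass has recorded where every dot sits in 3D, detecting a handful of them in the sensor image is enough to solve for the camera's position and rotation. Here's a toy Python/OpenCV sketch of that idea (generic PnP math with made-up numbers, not Mo-Sys's actual implementation):

    import numpy as np
    import cv2

    # 3D dot positions from the one-time mapping pass (meters, stage coordinates).
    # All values here are made up for illustration.
    star_map = np.array([
        [0.0, 0.0, 3.0],
        [1.2, 0.4, 3.0],
        [0.3, 1.5, 3.0],
        [1.8, 1.1, 3.0],
    ], dtype=np.float64)

    # Where those same dots were detected in the current sensor frame (pixels).
    observed = np.array([
        [612.0, 388.0],
        [884.0, 421.0],
        [655.0, 702.0],
        [947.0, 661.0],
    ], dtype=np.float64)

    # Intrinsics of the tracking sensor (also illustrative numbers).
    K = np.array([[800.0,   0.0, 640.0],
                  [  0.0, 800.0, 360.0],
                  [  0.0,   0.0,   1.0]])

    ok, rvec, tvec = cv2.solvePnP(star_map, observed, K, None)
    if ok:
        R, _ = cv2.Rodrigues(rvec)       # rotation matrix from rotation vector
        cam_pos = (-R.T @ tvec).ravel()  # camera position in stage coordinates
        print("camera position (m):", cam_pos)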

AI Meets Virtual Production

2023 has been the year of generative AI going mainstream, and it’s already finding uses in virtual production.

Vū announced Virtual Studio by Vū, a series of upcoming tools for virtual production. One of the tools, Cuebric, uses generative AI to create 2.5D scenes that can be used as virtual backgrounds with parallax in a volume. It's similar to a tool from Blockade Labs that's been making the rounds on Twitter, which lets you draw out a rough diagram and turn it into a virtual space.
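The "2.5D" part is worth unpacking: the generated image is sliced into depth layers, and as the tracked camera moves, near layers shift across the frame more than far ones, which is what sells the parallax. A toy version of that math (my own simplification, not Cuebric's pipeline):

    def parallax_shift_px(camera_shift_m, layer_depth_m, focal_px=800.0):
        """On-screen shift of a background layer for a sideways camera move.

        Closer layers (small depth) shift more than distant ones, which is
        what creates the parallax in a 2.5D scene. Numbers are illustrative.
        """
        return focal_px * camera_shift_m / layer_depth_m

    # A 0.5 m dolly move against layers cut at 2 m, 10 m, and 50 m:
    for depth in (2.0, 10.0, 50.0):
        print(f"layer at {depth:>4} m shifts {parallax_shift_px(0.5, depth):6.1f} px")

Run that and the near layer slides 200 pixels while the far one barely moves 8, which is exactly the depth cue a flat backdrop can't give you.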

AI isn't just about text-to-image generation. Disguise announced a partnership with move.ai, which uses a series of cameras and AI to track people in real time, without body suits or sensors.

And Puget Systems, a custom computer builder that was powering a lot of the LED walls on display at the show, was showcasing a new use for their powerful, custom-built systems: generative AI. The use case on display was how Digital Corridor used their systems to run Stable Diffusion to create their AI-animated short film.

Seeing Isn’t Believing

Lastly, a peek at what the future holds for virtual production was on display from GhostFrame.

To the naked eye, the LED wall powered by GhostFrame looks like a regular scene. But at a sub-frame level, the wall (and the connected lighting) is cycling through three (or more) different scenes, with each camera isolating a single look through a synced frame rate and shutter angle. The result is three independent outputs (or realities) with completely different lighting and backgrounds.
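Some illustrative numbers make this clearer (my own example figures, not GhostFrame's published spec): to multiplex three scenes, the wall refreshes at three times the delivery frame rate, and each camera is genlocked so its shutter only opens during its assigned sub-frame.

    DELIVERY_FPS = 24        # illustrative numbers, not GhostFrame's spec
    SCENES = 3               # e.g., a day look, a night look, and a green screen
    wall_hz = DELIVERY_FPS * SCENES  # the wall must refresh at 72 Hz

    # Each camera exposes for one sub-frame out of every three: a 1/3 duty
    # cycle, i.e., a 120-degree shutter angle at 24 fps.
    subframe_ms = 1000 / wall_hz
    for cam in range(SCENES):
        print(f"camera {cam} captures sub-frames "
              f"{cam}, {cam + SCENES}, {cam + 2 * SCENES}, ... "
              f"({subframe_ms:.1f} ms each)")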

It’s not the easiest to explain in text, so check out the video demo to see it in action:

What are the use cases for this? On display in a private demo room was a process shot driving through Las Vegas where daytime and nighttime versions were being recorded simultaneously. You could also record your virtual set in camera while capturing a second feed where the wall is a green screen, plus a third with hidden tracking markers, just in case you want to replace the set later in post.

Also on demo, at the Disguise booth, was a live broadcast example using GhostFrame, in which each camera recorded only its corresponding angle of the virtual set on the same LED wall.

Future of Virtual

We've seen virtual production take off with large volumes creating fantastical worlds, whether it's an alien planet in The Mandalorian or an ocean liner in 1899. But the future uses of VP are going to be much smaller…and more practical.

Jamie Clemens from Vū sees virtual production replacing practical locations because it provides what he calls the three C’s of virtual production: creativity, confidence, and control.

Nick Rivero from Disguise says they’ve been seeing an increase in smaller, 10-foot LED wall builds: “I think people are starting to realize that virtual production doesn’t just mean large volume, it means tool set.”


Joey Daoud is a media producer and founder of the agency New Territory Media. He also runs the free newsletter VP Land, covering virtual production and new video tech.
