“Each Episode is Like Its Own Movie”: DP Benji Bakshi on Star Trek: Strange New Worlds
In December of 1964, principal photography finished on the pilot of Star Trek, featuring Captain Christopher Pike (played by The Searchers’ Jeffrey Hunter) as the commander of the Enterprise. When the show’s first episode finally aired almost two years later, Pike was nowhere to be found. The initial pilot had been scrapped and re-shot, with William Shatner’s James T. Kirk taking the helm and a different crew boldly going where no man had gone before.
However, that wasn’t the end of Christopher Pike.
The character returned as a mentor to Kirk in the J.J. Abrams-directed reboot films. Now, with the Paramount+ series Star Trek: Strange New Worlds, Pike (played by Hell on Wheels’ Anson Mount) is back in command of the Enterprise for a prequel set roughly ten years before the original series.
The show harkens back to the episodic nature of that original voyage, meaning shifts not only in genre (everything from horror to fairytale to courtroom drama), but also big swings like a partially animated Lower Decks crossover and a musical episode.
That song-and-dance Trek—along with three other episodes—was shot by Benji Bakshi. The cinematographer, whose previous efforts include S. Craig Zahler’s Bone Tomahawk, Brawl in Cell Block 99 and Dragged Across Concrete, spoke to Filmmaker about his work on Strange New Worlds’ recently concluded second season.
Filmmaker: It’s been a while since I’ve watched a narrative show with commercial breaks, but when I went through Strange New Worlds on Paramount+ there were brief pauses and cuts to black where the commercials would normally go. How do you come up with a variety of ways to visually get into and out of those act breaks?
Bakshi: On other shows I’ve done that were broadcast on network, those moments were a lot more heavily weighted. On [Strange New Worlds], I think we’re conscious that this is the end of an act, but I don’t think we’re planning too heavily around it or making a big deal out of it. A lot of the time those moments are editorial choices and it’s really the sensibility of the edit that determines how hard they want to hit that. Like I said, in other shows we would be like, “Here is our act [break]!,” because you’re trying to create a cliffhanger or some momentum that carries through the ads.
Filmmaker: Tell me about your approach to lighting the bridge of the Enterprise. Like all of the ship’s spaces on the show, it’s full of environmental lighting built into the set.
Bakshi: The bridge was actually repurposed from a preexisting set and modified to be a closer match to the original Enterprise. So, it wasn’t custom built entirely for this show like all the other sets, which have embedded lighting that is a lot more intuitive for cinematographers. We nicknamed that set The Nightly News, because we had a bunch of preset lights aimed in certain positions for certain marks. That was the only set in the entire show that had that. And Captain Pike’s seat, as you can see [in the photo above], doesn’t sit in the center of that donut overtop of it. In fact, it sits in a place that’s especially difficult for any of those [preset] lights to hit. All the rest of the sets have more environmental ambient zones that characters can play in and out of. Because of that, the bridge is a contrastier space and the actors have to find the pockets of light. So, we find ourselves setting up a lot of lights on stands and having to definitively light each shot because of where everybody is in that particular shot.
On the bridge we also have these pin spots that we mostly use to flare the lens, but we also use them for other things like adding contrast or for backlights. They can be a challenge, because when they’re on camera and blasting throughout the set, they can do things you don’t want them to do. If the camera is moving, we’re often turning those specials off or dimming them. If they’re off camera, we’ll usually just leave them off the whole time. We spend a lot of time on those specials, aiming them in different spots because they articulate. They are MR16s, so they get really hot. So, we like to wait until we get close to shooting to turn them on, then we’ve got like four seconds to aim them before they start burning our hands. [laughs] It’s kind of a silly process, but necessary, because those specials are really like the secret sauce for the bridge.
Filmmaker: A lot of the sets have hard ceilings, which are seen in these beautiful wide field-of-view shots that show off the spaces in deep focus. You shot with the Arri Alexa LF and Cooke Anamorphic/i Full Frame Plus Special Flare lenses. What focal lengths did you tend to use for those wide shots?
Bakshi: On the full frame anamorphics the 32mm is the widest, then the next widest is the 40mm, which tended to be the go-to for us. The 32mm has some really beautiful rectilinear corrections on the edge, but the perspective is exaggerated. If something is close in the foreground, it really feels gigantic and people disappear really fast [if they are moving away from the camera].
Filmmaker: When you’re seeing those sets in wide shots, there really aren’t many places to hide movie lights. How much are you leaning on that built-in environmental lighting?
Bakshi: A lot of those sets are lit almost entirely “by the ship.” That becomes your baseline and it’s very easy to light your actors with something off camera, which sometimes means using something for the eyes or to create a little more contrast. It’s a joy to be able to just use what’s around you and let it be. I like to joke about it being perfect for the show and for cinematography to have surfaces that are illuminated, but, like, practically in real life, I don’t think you would really want to have bright lights in your face all the time. [laughs] But it looks great on camera.
Filmmaker: In an interview, you talked about how on the second season of the show there was a new capability for the virtual production volumes you shot on in Toronto, where you could put virtual lighting sources on the same DMX controls as your practical sources. So, in the engineering room for example, you can isolate a digital light source in the virtual set extension and control the brightness or color of that virtual source just like a physical light.
Bakshi: There are really too many digital sources for every one of them to automatically be given a DMX ID, so we would decide per episode what digital sources we wanted to have IDs and from there the Unreal team [which controls the imagery on the volume walls] gives them over, if you will, to the dimmer board operator. For example, the lights in the tilted vertical energy tubes in engineering—I think there’s an official word for them that escapes me at the moment—our dimmer board op could adjust the speed of those, because they sort of pulse. We’re in contact a lot with the Unreal team on set. I’m talking to the on-set gaffer, I’m talking to the dimmer board op and I’m also talking to the Unreal team about overall tone and levels and color, especially during the blend process [where the physical set is blended into the virtual background on the volume’s LED walls]. It’s really convenient to have that capability with the dimmer board op, because then when we get into finessing the lighting, we’re doing it all together instead of talking to different teams and having multiple conversations.
Filmmaker: Something else you said in an interview about working on the volume that makes intuitive sense, but I’d never really considered, is that if you’re shooting multicam, it’s hard to shoot different sizes of the same angle, because the background perspective on the walls can only properly display one perspective. So, one of the cameras would be off.
Bakshi: We have successfully done two cameras side-by-side in the same direction, but pretty much only when one of the lenses is really long and essentially the background doesn’t move in relation to that specific camera. However, you’re losing some of the effect and it’s kind of a cheat, so it’s rarely done. We might try to find a place to put a second camera, and sometimes we do and it’s worthwhile, but the majority of the time on the volume we find ourselves just using one camera. The technology is evolving even within the season. I heard that in season one they were experiencing things like stuttering if the camera was moving too fast, or if the shot was too wide the computers had to do that much more rendering on the fly. Those sorts of things definitely were smoothed out on season two and I rarely experienced them.
Filmmaker: The crew quarters on the Enterprise have windows that face out into space. How did you think about the light that comes through those windows and what the motivating sources might be?
Bakshi: It was always a conceptual discussion. What do we want it to feel like, and then, what source is shining through? In episode 202, it’s the Earth’s sunlight, because the ship is orbiting Earth. For episode 204, the ship is near Rigel VII, so the light in the crew quarters has a bluer, cooler feel. One thing that is common in all those rooms is that the sources are generally aiming down through the windows. That doesn’t necessarily [make logical sense on the Enterprise]—a hard source could be coming from below [if, for example, the ship was passing over top of a celestial light source]—but because we live on Earth and mainly experience lighting coming through windows from above, it feels artificial to have the source from below, the same way that a double shadow might. It just doesn’t feel natural to us.
Filmmaker: For a volume-shot set like the bar in episode 202, how do you control the color separation? You have warm light on the actors’ faces from the light built into the table, but you also have a cool backlight and hair light. How do you keep that from all bleeding together?
Bakshi: You use off-camera shapes on the volume walls. You can create different shapes, stretch them, move them around, adjust the intensity up or down and change the color. It’s a whole new universe and it’s really invigorating. To me, the greatest thing about this technology is that it is still in camera. The instincts that cinematographers have honed for a hundred-plus years are still the primary skill set and it’s not this feeling of things being run by computers. We’re also still setting up flags. We’re setting up off-camera lights. So, going back to your question, the cool hair light and shoulder light is not actually spill from the Earth itself [on the LED wall]. It’s from a blue shape up over the camera that’s mimicking the color of the Earth, but it’s way more intense. So, that’s how you control it. The background isn’t actually overpowering the lighting. It creates a base level, but you’re deciding how to sculpt it using these shapes.
Filmmaker: I’ve heard those shapes referred to as lighting cards. Is the vernacular of working in virtual production spaces still evolving?
Bakshi: You can call them light cards. When they’re rectangular, they do look like a bounce card, but I usually call them shapes. Like, “Hey, let’s put a shape over here.” We’ll use erratic shapes that have broken-up patterns. There was something about those shapes that felt more natural, where the shape wasn’t so flat and continuous. They might look like an amoeba or a tree dapple or something.
Filmmaker: Let’s break down some of your specific episodes. Your first episode, 202, is essentially a courtroom drama. Did you look at any particular films in prep to figure out how to blend that genre with Star Trek?
Bakshi: We looked at To Kill a Mockingbird and A Few Good Men. One of the things I loved about working on the show is that each episode is its own genre and the creative approach, in a lot of ways, was “anything goes.” We were encouraged to do that by the show’s producing director Chris Fisher and the showrunners Akiva Goldsman and Henry Alonso Myers. Each episode is like its own movie. There is a certain continuity that had to make sense—the bridge can’t all of a sudden be a rainbow circus for no reason—but for every episode, on every set, I would start from scratch in prep and set new looks for everything.
Filmmaker: I see a telescopic Scorpio 45’ crane with a stabilized head in a lot of the behind-the-scenes photos. Did you use that frequently on the bigger sets?
Bakshi: We carried that crane with a Matrix head for the run of show. We absolutely loved it. Even though I came to the show in season two, I learned a lot from what happened on season one. They used an older stabilized head for the first season that had a hard time getting through some of the spaces, particularly the bridge, where you really had to thread some needles. With the Matrix head, we could extend it straight out and minimize the needed height for the camera to pass through. We could also arm it out over a tabletop where we would normally need to be offset, because the head could reach out a few feet. That setup was definitely our workhorse. I would say we were on the crane at some point almost every other day.
Filmmaker: You mentioned Rigel VII before, which is the setting for episode 204. The exterior scenes when the away team first arrives on the planet seem like a perfect example of the advantages virtual production can offer. Unlike a practical location, you have an infinite dusk to shoot in. And unlike greenscreen, you can adjust the topography of the planet to get the exact composition you want.
Bakshi: Absolutely. There are some elements that are still locked in—if you have practical mounds of rock or a six-foot high snow drift on the set, you’re still not moving those—but we were definitely rotating the digital worlds to place the actors in between the mountains so they wouldn’t get lost against the background. Even on those sets we were still sometimes setting physical bottomers with flags in front of those large [LED wall] sources, so they didn’t spill over the ground, which was kind of interesting. You’ve got a digital source and you’re still setting up fabric for it, but lighting is lighting and the physics are what they are.
It takes about three to four months of lead time to prepare [those virtual sets before you shoot them]. That’s in conjunction with designing physical sets that need to be built that will seamlessly blend into the virtual ones. Along the way, we’re getting multiple chances to preview it on the wall and experience in person what the scale is like, and we’re always making adjustments. At first, you’re looking at it in a wide shot on your computer screen, assuming that the scale will translate a certain way, and then when you walk into the space, it feels a little different. So, we get the chance to really dial it in. We have a process called the virtual art department, or VAD, which is the cinematographers, the production designer and the entire art department, and the visual effects department, which is heavily involved in making these worlds. That process also includes the Unreal team, which is led by Pixomondo and their artists and producers. All of those people—and the showrunners, writers and episode directors—get the chance to have their voices heard during that process.
Filmmaker: In terms of differentiating the episodes, the style of camera movement also plays a role. Some episodes are more on the stabilized head, while others lean into handheld.
Bakshi: We had great operators—François Daignault, Mathew Cree and Peter Sweeney. I can’t say enough good things about them. They all have their own eye and their own ideas on shots, know how to pull them off and all frame really beautifully. One of the tricks for the handheld aesthetic for us was that the Matrix head has a handheld mode, which I didn’t know about before. There are five or six different sliders you can dial in that affect different things. You can adjust how far it might oscillate left or right, up and down. You can adjust the roll axis. But it really has a bit of a life of its own and sometimes it would drift, and the operator literally had to pan it back into proper framing. In a way that was a good thing, because those adjustments are what makes handheld feel human. It’s definitely not the same thing as a person with the camera on their shoulder—we preferred to always have that—but it was an interesting tool for certain situations. An example [in episode 204] is a shot where [Enterprise pilot] Erica Ortegas is in her quarters and asteroids are hitting the ship and all is sort of lost. That whole sequence is handheld. That was also a stylistic choice throughout the episode. As the crew—both on Rigel VII and on the ship—lose their memories, we become more handheld and chaotic. Then, as things solidify and their memory returns, we’re back to stability. There’s a specific moment where Ortegas leaves her quarters and is walking down the hallway saying, “My name is Erica Ortegas. I fly the ship.” During that shot we go from some pretty wild handheld on the Matrix head into stability. We basically fade it from one to the other. That was an exciting creative moment for us.
Filmmaker: Let’s finish up with episode 209, where the crew ends up in a subspace fold that causes them to periodically break into song when their emotions become heightened. Is there a musical number that you’d particularly like to talk about?
Bakshi: I would say Spock’s song, just because I haven’t highlighted that one yet [in other interviews].
Filmmaker: The number in engineering where he sings about being the “ex”?
Bakshi: Yeah, that’s the one. I really loved that song. We got to shoot engineering in a new way where we were off to the side of the railing, which was technically tricky because we were very close to the wall and as you get closer it can moiré and distort. We were definitely pushing the limits. That whole song is only maybe four or five shots, and they piece together seamlessly with Spock’s movement. There was a lot of discussion in prep about whether we wanted to go really stylized. The way he’s singing this ballad with this 80s synth vibe had a Depeche Mode feel. But, overall, with the musical episode, the feeling was that these are not music videos. They’re scenes, and the singing happens when the emotion reaches a certain level that, in the rules of musicals, singing just sort of happens. At the end of the song, he’s standing there breaking down into tears and there was this happy accident that me and the director Dermott Downs really fell in love with. The light from the console spread vertically up and down his face and bisected it, and he’s this half human/half Vulcan who’s constantly wrestling with that. We thought that bisecting light was really poetic, and we just went with it.