The Week In Cameras
Latest News About the Tools We Use by Michael Murie
The Jaunt ONE — Development of a VR Camera
The past year has seen a lot of development in VR cameras – cameras that can shoot 360-degree video. Offerings range from inexpensive rigs for mounting multiple GoPros to custom units that cost many thousands of dollars. It’s an interesting field for sports and experiences, though it remains to be seen how this will impact narrative filmmaking.
We recently spoke to Koji Gardiner, Jaunt’s Director of Hardware Engineering, about their Jaunt ONE camera. This is a single unit that contains 24 camera modules. After two years of development the unit is now being used for a variety of production applications, though the units are only available to rent or lease.
In this interview Gardiner talks about the development of the camera, as well as the challenges in shooting and editing VR video.
Filmmaker: How was the camera developed?
Gardiner: When we started, we were using off-the-shelf cameras, and while that worked well to prove the concept, they were lacking in a number of areas. There are three key things that led us to what we have now. First, the cameras in those earlier systems were not synchronized, and without frame sync, if you have movement across frames — either in a fast-moving scene or in anything with a lot of detail — you will see a ton of errors that need to be cleaned up in post, and it’s a very painful process. The cameras are fully synced in the Jaunt ONE; they’re frame-locked and use a global shutter, so any movement is perfectly locked between all the cameras.
The second was the image quality you get from off-the-shelf cameras. In low-light conditions you see a lot of noise and graininess. The narrow dynamic range results in areas that are blown out when, for example, you’re indoors and pointing the camera out a brightly lit window. You also have very little control over things like exposure and white balance. The image sensor that we are using is about four times the array size of a GoPro sensor, so we get much better low-light performance, much better dynamic range.
We didn’t realize this at first, but low-light performance is important for VR. When you have a camera that looks in all directions, lighting the scene becomes very difficult. In traditional filmmaking you can have lights behind the viewpoint of the lens, but when you have a VR scene you can’t hide lighting the way you can in traditional movies. You end up having to shoot a lot of scenes in ambient lighting and a large sensor is really important for that.
Finally, how you manage the data and how the files are named are things that you only think about when you’re sucking a ton of data off these cameras.
Filmmaker: What’s most important: sensitivity, resolution or dynamic range?
Gardiner: The big benefits we get are in low-light sensitivity and dynamic range. The resolution is also improved, but it’s not as critical at this point in time, because the resolution of the viewing devices for VR content is the limiting factor. The Oculus Rift, the HTC Vive, and any of the handheld phones that can be used as a VR device right now are the bottleneck in terms of resolution.
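A quick way to see why the headset is the bottleneck is to compare pixels per degree on the source side and on the display side. The panel width and field of view in the sketch below are rough assumptions about a circa-2016 headset, not any manufacturer’s spec:

```python
# Rough pixels-per-degree comparison; the headset figures are assumptions.
PANORAMA_WIDTH_PX = 3840      # a 4K equirectangular frame spans 360 degrees
HEADSET_EYE_WIDTH_PX = 1080   # assumed per-eye panel width
HEADSET_FOV_DEG = 100         # assumed horizontal field of view

source_ppd = PANORAMA_WIDTH_PX / 360
display_ppd = HEADSET_EYE_WIDTH_PX / HEADSET_FOV_DEG

print(f"4K panorama: {source_ppd:.1f} pixels per degree")
print(f"Headset:     {display_ppd:.1f} pixels per degree")
# The two are roughly comparable, so finer source resolution is mostly
# headroom for future displays.
```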
Filmmaker: What stage are you at in development?
Gardiner: We’ve built the first units and are working with our partners to film content with those, and we’re also beginning to ramp up production of the camera.
Filmmaker: If I want to make a movie, what are the steps?
Gardiner: Jaunt Studios is a branch of the company that acts as a resource for people who want to create cinematic virtual reality content. That would be the arm of the company that you would contact if you wanted to create content.
They would provide the camera as well as access to our cloud rendering pipeline, which takes the 24 individual video files from the camera, stitches them all together and provides you with fully rendered output that you can edit however you want.
Filmmaker: How do you edit the content?
Gardiner: We’re supporting a number of standard film-industry tools. Internally we use tools like Final Cut Pro and Adobe Premiere and After Effects and Nuke for compositing, so the output from our cloud pipeline is something that you can easily integrate into professional workflows.
The output of the camera is 24 files, but we upload them as a set to our cloud-rendering pipeline and those 24 videos are merged into a single output that has a left-eye and a right-eye panoramic video. From there you get the combined rendered output and you can edit that. You can open it up and start playing with it. The output is essentially a ProRes video file that can be read by any video editing software.
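Because each take comes off the camera as 24 separate clips, a basic completeness check before uploading a take to the rendering pipeline can save a round trip. The sketch below is purely illustrative; the folder layout and file-naming pattern are assumptions, not Jaunt’s actual conventions:

```python
from pathlib import Path

EXPECTED_CLIPS = 24  # one clip per camera module, per the interview

def check_take(take_dir, pattern="cam??.mov"):
    """Report whether a take folder holds a complete set of camera clips.

    The naming scheme (cam00.mov .. cam23.mov) is made up for illustration.
    """
    clips = sorted(Path(take_dir).glob(pattern))
    if len(clips) != EXPECTED_CLIPS:
        print(f"{take_dir}: found {len(clips)} clips, expected {EXPECTED_CLIPS}")
        return False
    sizes = [clip.stat().st_size for clip in clips]
    # Clips from a frame-locked rig should be roughly the same length; a clip
    # far smaller than its siblings usually means a camera dropped out.
    if min(sizes) < 0.5 * max(sizes):
        print(f"{take_dir}: clip sizes vary widely, check for a dropped camera")
        return False
    print(f"{take_dir}: complete set of {EXPECTED_CLIPS} clips")
    return True
```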
Filmmaker: The output file contains a spherical image in the frame. When you look at the image in an editing application it must be hard to figure out what you are looking at.
Gardiner: That is kind of tricky. We are working on tools that make it easier to edit and view in real time what’s going on. There is some interesting work being done in viewing in a VR device like the Oculus while you are doing editing so you can actually see it in a fully spherical environment as opposed to on a 2D display. These are things that are being worked on by the industry in general.
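Under the hood, a 360 preview tool of the kind Gardiner describes is doing a reprojection: pick a viewing direction, then resample a flat, rectilinear image out of the equirectangular frame. Here is a minimal sketch of that resampling with NumPy and OpenCV; the function and its parameters are illustrative and not part of any Jaunt tooling:

```python
import numpy as np
import cv2  # opencv-python

def equirect_to_rectilinear(frame, yaw_deg=0.0, pitch_deg=0.0,
                            fov_deg=90.0, out_w=1280, out_h=720):
    """Resample a flat, 'normal lens' view out of an equirectangular frame.

    yaw_deg/pitch_deg pan and tilt the virtual camera; fov_deg is its
    horizontal field of view. This mirrors what a 360 preview window does
    for a single viewing direction.
    """
    src_h, src_w = frame.shape[:2]
    focal = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2.0)

    # Build a ray direction for every output pixel (camera looks down +Z).
    xs = np.arange(out_w) - out_w / 2.0
    ys = np.arange(out_h) - out_h / 2.0
    xv, yv = np.meshgrid(xs, ys)
    rays = np.stack([xv, yv, np.full_like(xv, focal)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays by pitch (about X) and then yaw (about Y).
    pitch, yaw = np.radians(pitch_deg), np.radians(yaw_deg)
    rot_x = np.array([[1, 0, 0],
                      [0, np.cos(pitch), -np.sin(pitch)],
                      [0, np.sin(pitch),  np.cos(pitch)]])
    rot_y = np.array([[np.cos(yaw),  0, np.sin(yaw)],
                      [0,            1, 0],
                      [-np.sin(yaw), 0, np.cos(yaw)]])
    rays = rays @ (rot_y @ rot_x).T

    # Convert each ray to longitude/latitude, then to source pixel coordinates.
    lon = np.arctan2(rays[..., 0], rays[..., 2])        # -pi .. pi
    lat = np.arcsin(np.clip(rays[..., 1], -1.0, 1.0))   # -pi/2 .. pi/2
    map_x = ((lon / (2 * np.pi)) + 0.5) * (src_w - 1)
    map_y = ((lat / np.pi) + 0.5) * (src_h - 1)

    return cv2.remap(frame, map_x.astype(np.float32),
                     map_y.astype(np.float32), cv2.INTER_LINEAR)

# Example: view = equirect_to_rectilinear(cv2.imread("pano_frame.png"), yaw_deg=45)
```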
Filmmaker: What is the frame size of that video?
Gardiner: With the current camera we could output up to 8K, but we output at 4K currently due to codec limitations.
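Some back-of-the-envelope arithmetic shows why the codec, rather than the camera, sets the ceiling. The stereo layout, frame rate, and bit depth below are assumptions chosen only to make the comparison concrete:

```python
def raw_rate_gbps(width, height, eyes=2, fps=30, bits_per_pixel=24):
    """Uncompressed data rate, in gigabits per second, for a stereo pair."""
    return width * height * eyes * fps * bits_per_pixel / 1e9

# Per-eye equirectangular frames, assuming a 2:1 aspect ratio.
for label, w in [("4K", 3840), ("8K", 7680)]:
    print(f"{label}: ~{raw_rate_gbps(w, w // 2):.0f} Gb/s uncompressed")
# 8K carries four times the pixel payload of 4K, which is where codec and
# file-format limits start to bite.
```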
Filmmaker: How are you mounting and moving the camera?
Gardiner: We have a set of standard tripod mount points on the bottom of the camera, so you can mount it to any tripod you want.
In terms of movement, one of the things we found early on was that if you’re not very careful with how you move the camera, you can make the user nauseous very quickly, because there’s a disconnect between your brain knowing your body is static and seeing the entire world around you moving. Whenever we do movement we make sure that the camera is very well stabilized. We’ve done some drone shoots that can really be amazing but for the most part we’re doing static shots right now.
Filmmaker: When operating the camera, do you wire it to a monitor? Are you able to see what’s coming out of it?
Gardiner: There are two different modes for running the camera. The first is the simple plug-and-play mode; you set the camera up on a tripod, connect one cable for power, and then you hit the record button and you walk away and it’s recording.
Alternatively, there’s a USB port on the camera that you can use to connect to a laptop and we have a companion application that allows you to preview what the camera is seeing. You can do things like check the status of the camera to see if storage is filling up or the camera is overheating.
Filmmaker: Does it have an internal battery?
Gardiner: Power is external, but we support any of the standard film-industry batteries; Anton Bauer batteries are what we typically use.
Filmmaker: How long a sequence can you record?
Gardiner: It really depends on what size of SD storage you are using. With a 64GB card you can get over 2 hours of recording, but if you have larger cards it’s essentially limitless.
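Those two figures, roughly 64 GB for about two hours, imply an aggregate data rate, and from that you can estimate run times for other card sizes. A small sketch of that arithmetic, treating the 64 GB as a single aggregate store (the interview doesn’t say whether storage is shared or per module):

```python
CARD_GB = 64        # card size quoted in the interview
RECORD_HOURS = 2    # "over 2 hours"; use 2 as a conservative figure

# Aggregate write rate implied by the quoted figures (megabits per second).
rate_mbps = CARD_GB * 8 * 1000 / (RECORD_HOURS * 3600)
print(f"Implied write rate: ~{rate_mbps:.0f} Mb/s")

# Estimated record times for larger cards at the same rate.
for card_gb in (128, 256, 512):
    hours = card_gb * 8 * 1000 / rate_mbps / 3600
    print(f"{card_gb} GB card: roughly {hours:.0f} hours")
```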
Filmmaker: The unit has many cameras around the center, but only four top and bottom. Why is that?
Gardiner: There are 16 cameras on the equator, plus 4 facing up and 4 facing down. We get a complete 360-degree view, and the reason we have more around the equator is that the extra overlap between the different fields of view gives us much better stereoacuity. At the top and bottom, stereo is somewhat arbitrary, so we have enough data there to fill in the sphere, but as you look up the world kind of fades to mono at the very top.
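To put numbers on the overlap argument: with 16 lenses around the equator, adjacent modules sit 22.5 degrees apart, so any lens wider than that sees the same part of the scene as several of its neighbors, which is what stereo reconstruction needs. The lens field of view below is an assumed value, not a published spec:

```python
EQUATOR_CAMERAS = 16
LENS_HFOV_DEG = 120   # assumed horizontal field of view per lens

spacing_deg = 360 / EQUATOR_CAMERAS         # angle between adjacent modules
coverage = LENS_HFOV_DEG / spacing_deg      # cameras seeing a horizon point
yaws = [i * spacing_deg for i in range(EQUATOR_CAMERAS)]

print(f"Adjacent equatorial cameras are {spacing_deg:.1f} degrees apart")
print(f"A {LENS_HFOV_DEG} degree lens puts ~{coverage:.0f} cameras on every "
      f"horizon direction")
print("Module headings (degrees):", yaws)
```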
Filmmaker: What do most people do about the tripod at the bottom in the image?
Gardiner: In most of our content we just have a small placeholder that’s our logo or the logo of whoever is creating the content. In a scene with a patternless background you can paint it out in post, so there are a number of things you can do.
Filmmaker: What are the distribution options for VR?
Gardiner: Today, the widely available platform is YouTube 360, which is currently monoscopic. There’s also Facebook 360, which supports embedded 360 videos as well. Those are the widely used environments we can distribute on now, but they are unfortunately monoscopic.
We also have our apps that you can run on your PC for the Oculus and other tethered headsets. There are also a number of VR accessories for smartphones starting to become available. If you have one of these fairly inexpensive accessories, basically a phone case with a couple of lenses in front of it, you’ve immediately got a VR viewer that you can use to watch any of our content.
Filmmaker: What sort of applications are the cameras being used for?
Gardiner: There’s a number of different ones. Music is a huge one, and it’s one that we targeted initially. One of our first pieces of content was the Paul McCartney concert at Candlestick Park. That experience is very immersive and it immediately shows you the power of this viewing platform because you really feel like you’re on stage with the band.
Another industry is sports. We’re having a lot of discussions with big players in the sports industry about filming not only games but also behind-the-scenes and narrative content that shows you the players in a way that you can’t see normally.
Finally, we are working with a number of directors and creatives in Hollywood who have a lot of interesting ideas about creating purely narrative and story-driven content. That’s really exciting because it’s a brand new medium and the language for how you tell stories in this medium is unwritten right now.
Filmmaker: What about cutting between one shot and another in the same location?
Gardiner: Just to go back to that Paul McCartney concert, we had three cameras on stage and when we created the final edit, we had a few cuts between the locations on the stage and it works very well, but as a viewer if you’re not expecting that cut it can feel a little bit strange.
We think it’s something similar to the early days of cinema when people were confused by cuts and different viewpoints. It’s something that I think people will start to understand and feel comfortable with, but these are the sorts of things that we’re working with creatives to understand.
See also:
Jaunt: Jaunt Studios and RYOT Announce Production Deal And Release First Two Episodes Of Virtual Reality Documentary Series
ABC News: Take a Virtual Reality Tour of New York City During the Most Wonderful Time of the Year