Fremont, CA – Blackmagic Design today announced that the newest virtual reality (VR) film from filmmaker Jonathan Griffith, featuring world renowned rock climber Alex Honnold (“Free Solo”), was edited in DaVinci Resolve Studio editing, color grading, visual effects (VFX) and audio post production software. Additionally, 2D and 3D compositing, VR and motion graphics software Fusion Studio was used to process the multi camera, stereoscopic VR imagery.
The two part VR experience “Alex Honnold: The Soloist VR,” which premiered on March 3, follows the California native on solo ascents across two continents, documenting free ascents in some of the most remote and wild locations possible. The film, from Jonathan Griffith Productions, is available on the Oculus TV app via all Meta Quest VR headsets.
After his successful film “Everest VR,” Griffith was eager to make a film about solo climbing, or soloing, and Honnold was the most logical person to approach. However, convincing the renowned climber to join the VR world was the first step. “Part of the challenge in VR is convincing people that really good VR is a really, really amazing experience,” said Griffith. “But I knew that as long as I got Alex to watch ‘Everest VR’ that I would totally have him. So, I sent him a headset, he watched ‘Everest,’ and he was completely hooked by the possibilities of creating an amazing soloing VR experience.”
The challenge of producing a high quality VR climbing experience is more than finding the right person to feature. Complex technical planning was required to allow Griffith to capture the adventure as realistically as possible without a large crew or extensive gear. “When you're shooting traditional media, you have the classic climbing shot from above the climber looking down, and the climber’s roped in with a big drop underneath them,” said Griffith. “This doesn’t work in VR. We need to shoot a 360 degree panorama around our subject, so we have to get the camera out from the cliff as far as we possibly can. Our type of production doesn’t have huge rigging systems; it’s just me with what I can carry. I have to climb these mountains as well, and I have to put the rigging system together myself. It’s very challenging.”
Griffith used a VR camera with eight lenses designed to capture a 360 degree view, producing eight clips at 3840 x 2880 resolution each. These angles were then stitched together to create a full 360 degree, stereoscopic image. Editor Matt DeJohn used DaVinci Resolve Studio, creating a timeline at 7680 x 7680 resolution, with the added complexity of stereoscopic left and right eye images in top/bottom orientation. Once setup was complete, including syncing spatial audio and mic tracks, the editorial process could begin.
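The arithmetic behind that timeline follows from the top/bottom layout: each eye is a 2:1 equirectangular image, and stacking the two eyes vertically yields a square frame. A minimal sketch of that packing, using NumPy and a hypothetical `pack_top_bottom` helper (not part of Resolve), with small stand-in arrays in place of full resolution frames:

```python
import numpy as np

def pack_top_bottom(left_eye: np.ndarray, right_eye: np.ndarray) -> np.ndarray:
    """Stack two identically sized per-eye equirectangular frames
    vertically: left eye on top, right eye on the bottom."""
    assert left_eye.shape == right_eye.shape
    return np.vstack([left_eye, right_eye])

# At full resolution each eye would be a 2:1 equirectangular image of
# 7680 x 3840 pixels, so the packed frame is square: 7680 x 7680.
left = np.zeros((384, 768, 3), dtype=np.uint8)      # stand-in left eye
right = np.full((384, 768, 3), 255, dtype=np.uint8)  # stand-in right eye
packed = pack_top_bottom(left, right)
print(packed.shape)  # (768, 768, 3)
```

This is why the timeline resolution is square even though each eye's view is twice as wide as it is tall.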
Constant considerations for VR made the process anything but normal, and the ability to access Fusion functionality within the DaVinci Resolve Studio edit page was critical. “Orientation is very important in VR, and often needs to be adjusted on a per shot basis,” said DeJohn. “The main goal is to put the subject of the shot in front of where the viewer is likely to be looking. For instance, in shot A, if a character walks from in front of the viewer to 90 degrees on the right, shot B's subject should be positioned at that same point, 90 degrees off center, on the right as well. This orientation adjustment was handled using a Fusion FX template I created that I could access from Resolve’s edit page. This template worked like a custom made plugin.”
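The reorientation DeJohn describes has a simple geometric basis: in an equirectangular projection, longitude maps linearly to the horizontal axis, so rotating the viewer's starting orientation (yaw) amounts to a horizontal pixel roll. A minimal NumPy sketch of the idea, with a hypothetical `reorient_equirect` function standing in for the Fusion template described above:

```python
import numpy as np

def reorient_equirect(frame: np.ndarray, yaw_degrees: float) -> np.ndarray:
    """Rotate an equirectangular frame around the vertical axis.

    Longitude maps linearly to x, so a yaw rotation is a horizontal
    roll. A positive yaw brings content that was to the viewer's
    right around to the front (image center).
    """
    height, width = frame.shape[:2]
    shift = int(round((yaw_degrees / 360.0) * width))
    return np.roll(frame, -shift, axis=1)

# A subject at +90 degrees sits three quarters of the way across the
# frame (center of the image is 0 degrees); a 90 degree yaw centers it.
frame = np.zeros((64, 128, 3), dtype=np.uint8)
frame[:, 96] = 255  # subject column at +90 degrees
centered = reorient_equirect(frame, 90.0)
```

Matching shot B's starting yaw to where shot A left the viewer looking, as in DeJohn's example, reduces to picking the right `yaw_degrees` per cut.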
For DeJohn, the flexibility of DaVinci Resolve Studio made the complex workflow much more manageable. “Working only in Resolve for most of post was ideal because it minimized the number of hand offs,” added DeJohn. “Resolve has a huge amount of capability across disciplines: editing, compositing, visual effects, color grading and sound. I could make editorial adjustments even while the grade was happening. I would never be able to do that using an older style editing solution. I was still tweaking cuts, adjusting visual effects, making VR changes, and working on stereoscopic conversion while other artists were working on other areas, and none of us ever had to leave the same timeline.”
Stitching the eight camera feeds together presented its own difficulties, which Fusion Studio was able to handle. VR Post Production VFX artists James Donald and Keith Kolod processed the base stitch and then brought the elements into Fusion Studio for cleanup and stereo work. “First, I went through and fixed any stereo vertical misalignments and stereo depth issues using grid warps and Fusion's disparity stereo alignment tools,” said Kolod.
He and Donald found Fusion Studio’s capabilities in visualizing stereo for the artist to be a welcome bonus. “Having an interlaced stereo monitor really made this process go faster, since Fusion supports interlaced stereo viewing, which meant I didn't have to jump back and forth into a headset while working. Fusion also supports headset viewing, which allowed for a quick stereo alignment review before moving on to the next task,” added Kolod.
Donald also found the stereo functionality in Fusion Studio valuable. “Fusion’s stereo aware tools, in combination with its VR tools, offered a powerful set of features out of the box that were more stable and reliable than other packages, not to mention great value for the price for individual artists,” said Donald. “In particular I relied heavily on the Oculus VR headset integration for real time previews when manipulating splines and masks in stereo, as well as finessing stereo alignments and paint work.”
While both Donald and Kolod worked extensively in Fusion Studio, Donald appreciated that Fusion was also a component of DaVinci Resolve Studio. “I preferred the view customization that Fusion Studio offered over the Fusion page in Resolve, but I also turned to Resolve when I needed the full power of GPU accelerated Resolve FX, which were an indispensable part of my time lapse workflow. I also found the built in Revival FX tools, Deflicker and Dead Pixel Fixer, helpful and fast in Resolve.”
Cleanup in VR footage can often be considerable, but Fusion Studio provided the right tools to manage the work. “I painted out tripod rigging and wires with the paint node and then converted that work from one eye into the other to match the natural stereo in the area,” said Kolod. “Then I ramped down any extreme stereo depth in the nadir or zenith pole areas, which can put major strain on the viewer's eyes in headsets. After that, I did more traditional rotoscope and clean plate cleanup work around any subjects that still crossed through the 360 camera's seams.”
With such extensive levels of work, DeJohn and Griffith were pleased that DaVinci Resolve Studio never made them commit to finalizing any stage until all stages were complete. “Beyond how it expanded the scope of what I could personally handle, the collaborative nature of Resolve also opened up the timing of various processes,” said DeJohn. “For instance, we had a specific date scheduled for color correction, but we were still waiting on some final stitches, last clearance approvals, title cards and credits. Traditionally, I would probably need to delay color correction, but because the colorist could color correct directly on my active edit timeline, I could keep the color grading date and have him complete his work before I was even done. Then his color correction lived on in my edit timeline, and when I got my last VFX, stitching and title cards, I could just drop those in where needed and, with a button, deliver.”
For Griffith the key to any post production tool is having the ability to see a unique VR experience throughout post, rather than just at the end. “I love that I could see the film growing at every stage, as opposed to having to imagine things because the edit hadn’t locked or the color hadn’t been touched yet,” said Griffith. “Resolve is so capable that I can see a project as it is being built, rather than wait for screenings to view the film only to discover we have to go back and change something critical.”
Despite the technology, what Griffith hopes to achieve in the end is reality through the experience of a VR headset. “I love VR because it's so real, raw and honest, and it’s very hard to fake things,” concluded Griffith. “I can't tilt the camera to make the climb look steeper. I can’t over saturate a sky for one angle because it won’t feel real. VR is about being present, not about fancy shots or quick cuts. You can’t fake being there, you have to build that into the film. I love it because of that. If you want to go shoot a really cool solo climbing film, you actually have to go do some really cool solo climbing and find a way to capture it. There's no way around that.”