There’s new Resolve magic in the air

Blackmagic Design keeps pace with new industry developments.

Karen Moltenbrey

Blackmagic Design has a lot of Resolve. On the AI side, the company introduced the DaVinci Neural Engine in its DaVinci Resolve 19 beta release three months ago, bringing AI capabilities to its video editing software. Now, the firm has announced it is fine-tuning the software to support Qualcomm's Snapdragon X Elite processor for significant performance gains. On the AR side, Blackmagic Design has teamed with Apple on an end-to-end workflow that will enable professionals to edit content for the Apple Vision Pro spatial computing device/headset.

Blackmagic Design UI
(Source: Blackmagic Design)

For the most part, technology evolves at a very fast pace, but generative AI is developing faster still. From processors to systems to apps, companies and users are working hard just to keep up with new developments.

Just a scant three weeks after announcing its DaVinci Resolve 19 beta 3 release, Blackmagic Design fine-tuned the video editing software to support Qualcomm's Snapdragon X Elite. An all-in-one processor integrating a Qualcomm Oryon CPU, Adreno GPU, and Hexagon NPU, the Snapdragon X Elite received a lot of attention at the Microsoft Copilot+ debut in late May. Blackmagic Design says the chip boosts the performance of its DaVinci Neural Engine AI by up to 4.7× on Windows computers. It also claims a 2× performance increase for smart reframe on computers using the Snapdragon processor.

Snapdragon processor
(Source: Qualcomm)

Blackmagic Design released the first public beta of Resolve 19 in April, which included the company's new DaVinci Neural Engine AI tools and more than 100 feature upgrades, all of which are fully supported by the Snapdragon X Elite processor.

DaVinci Resolve 19 public beta 3 supporting Snapdragon X Elite (for Windows 11 for Arm) is available for download now from the Blackmagic Design website.

While GenAI is getting most of the attention nowadays in the tech world, there are still developments happening in the AR realm. Last week, Apple Intelligence, Apple's brand-new AI solution, stole the spotlight. However, Apple did reserve a few minutes at its Worldwide Developers Conference (WWDC) to provide a rundown of all its products, from Apple TV to iPads, iPhones, Macs, and more. That includes updates to its Vision Pro mixed reality headset/spatial computing device, which was released four months ago.

At launch, Apple introduced Immersive Video, an entertainment format exclusive to Vision Pro that combines 8K, 3D video and a 180-degree field of view with Spatial Audio. Users can now create their own Immersive Videos thanks to a partnership with Blackmagic Design, which has produced an end-to-end production workflow comprising the new Blackmagic Ursa Cine Immersive camera, updates to DaVinci Resolve Studio, and Apple Compressor, giving professional filmmakers the ability to edit Apple Immersive Video for the first time.

Apple’s Vision Pro.
Blackmagic Design’s Ursa Cine Immersive camera, part of the platform for creating video on Apple’s Vision Pro. (Source: Blackmagic Design)

According to Grant Petty, Blackmagic Design CEO, Ursa Cine Immersive is built on the new Ursa Cine platform and features a fixed, custom, stereoscopic 3D lens system with dual 8K image sensors that can capture 16 stops of dynamic range. “With this innovative system, filmmakers can record remarkable moments like action-packed scenes, unique perspectives, stunning landscapes, intimate performances, and more, all with incredible fidelity, offering viewers an unparalleled sense of realism and immersion,” he says.

Blackmagic Ursa Cine Immersive's lens system is pre-installed on the camera body and designed specifically for Apple Immersive Video. The sensor delivers 8160×7200 resolution per eye with pixel-level synchronization, and cinematographers can shoot stereoscopic 3D immersive cinema content at 90 fps to a single file, according to Blackmagic Design. The company noted that the custom lens system is matched to Ursa Cine's large-format image sensor, with extremely accurate positional data that's read and stored at the time it's generated. This immersive lens data, which is mapped, calibrated, and stored per eye, then travels through postproduction in the Blackmagic RAW file.
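To put those sensor figures in perspective, a quick back-of-the-envelope calculation shows the raw pixel throughput the camera has to handle. This is a sketch using only the numbers quoted above; the actual recorded bitrate will be lower, since Blackmagic RAW is compressed and the bit depth is not specified here.

```python
# Rough pixel-throughput estimate for Ursa Cine Immersive,
# based on the figures cited in the article:
# 8160x7200 per eye, stereoscopic (2 eyes), 90 fps.
# Recorded data rates depend on Blackmagic RAW compression
# and bit depth, which are assumptions outside this sketch.

WIDTH, HEIGHT = 8160, 7200   # resolution per eye
EYES = 2                     # stereoscopic 3D
FPS = 90                     # frames per second

pixels_per_eye = WIDTH * HEIGHT            # pixels in one eye's frame
pixels_per_frame = pixels_per_eye * EYES   # both eyes, one frame
pixels_per_second = pixels_per_frame * FPS # sensor throughput

print(f"{pixels_per_eye / 1e6:.1f} MP per eye")        # ~58.8 MP
print(f"{pixels_per_second / 1e9:.2f} Gpixels/s total") # ~10.58 Gpixels/s
```

That is roughly 58.8 megapixels per eye and over 10 billion pixels per second across both sensors, which helps explain the camera's built-in high-performance storage.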

Apple Immersive Video shot on the Ursa Cine Immersive camera can be edited with DaVinci Resolve. A new immersive video viewer will let editors pan, tilt, and roll clips for viewing on 2D monitors or on the Apple Vision Pro. Editors can bypass transitions rendered by Apple Vision Pro using FCP XML metadata to produce clean master files, and export presets will enable quick output into a package that can be viewed directly on Apple Vision Pro.

Blackmagic Ursa Cine Immersive comes with 8TB of high-performance network storage built in. It records directly to the included Blackmagic Media Module and can be synced to Blackmagic Cloud and DaVinci Resolve media bins in real time.

Blackmagic Ursa Cine Immersive and the new version of DaVinci Resolve that supports Apple Immersive Video for Apple Vision Pro will be released later this year.