A game engine alternative to virtual production

Chaos’ Project Arena brings real-time ray tracing and original production data to virtual production.

Karen Moltenbrey

Chaos Innovation Lab has unveiled Project Arena, which eliminates the need for game engines, long a mainstay of virtual production. Project Arena leverages the Chaos Vantage renderer to bring real-time ray tracing to virtual production. The toolset enables artists to move their 3D assets and animations directly from their DCC software, including 3ds Max, Maya, and Houdini, onto an LED screen without any conversion process. Project Arena is still in development and will undergo real-world testing by independent users this year before it becomes more widely available in 2025.

What do we think? Chaos has been exploring virtual production for a while now. In 2015, the company worked with Kevin Margo on a project called “Construct,” which was rendered entirely on the GPU. It was the company’s first attempt at real-time ray tracing, bringing the technique into a motion-capture setting, which was unheard of at the time for a production-quality short film. Of course, GPU hardware was not as advanced then as it is today, and virtual production was in its earlier stages.

Game engines certainly changed the game for virtual production, providing much-needed real-time feedback by linking the physical and virtual cameras. With game engines, though, the CG content requires some type of pre-processing, such as baking or decimation, before it can be used on the LED screen. The breakthrough here is that Arena uses a pure ray tracer, so original data can be used directly on the screen during virtual production. That is a game-changer: it maintains data integrity and lets artists work in a pipeline they are already familiar with. There are clear advantages to this workflow, and Chaos appears determined to work out the kinks and bring it to market.

Chaos Project Arena: Streamlining virtual production

Virtual production and game engines have always been inherently linked. Virtual production blends traditional filmmaking techniques with digital technology, uniting the real, physical world with the digital world by leveraging real-time technology. For years, this has been done using game engines. However, Chaos Innovation Lab has devised a solution that will eliminate the need for game engines in virtual production.

Chaos Innovation Lab tests its Project Arena on a virtual set. (Source: Chaos)

Code-named Project Arena, the toolset provides a faster, less expensive, and more efficient alternative to game engines, says Chaos. This is because studios can go directly from their DCC tools of choice to the virtual production stage without any processing, baking, or simplification of the data. Arena works with a facility’s camera tracking system to coordinate the camera within the virtual scene to match the physical camera photographing the actors. The resulting camera view is rendered in real time across machines that send their output to the LED walls.
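The per-frame loop described here, in which the tracked pose of the physical camera is copied onto the virtual camera and that view is rendered for the wall, can be sketched in generic terms. The following is an illustrative sketch only; all class and function names (`StubTracker`, `StubRenderer`, `run_sync_loop`, `Pose`) are hypothetical stand-ins, not the Project Arena API.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Position (meters) and orientation (Euler angles, degrees) of a camera."""
    position: tuple
    rotation: tuple

class StubTracker:
    """Stand-in for a stage camera-tracking system (optical, encoder-based, etc.)."""
    def __init__(self, poses):
        self._poses = iter(poses)

    def read_pose(self):
        # A real tracker streams the physical camera's pose each frame.
        return next(self._poses)

class StubRenderer:
    """Stand-in for the real-time ray tracer driving the LED wall."""
    def __init__(self):
        self.frames = []

    def render(self, pose):
        # A real renderer would trace the scene from this camera view and
        # push the result to the LED processors; here we just log the pose.
        self.frames.append(pose)
        return f"frame@{pose.position}"

def run_sync_loop(tracker, renderer, n_frames):
    """Each frame: read the physical camera's tracked pose, apply it to the
    virtual camera, and render that view for the wall."""
    out = []
    for _ in range(n_frames):
        pose = tracker.read_pose()         # physical camera pose from the stage
        out.append(renderer.render(pose))  # virtual camera matches it exactly
    return out

# Simulate three frames of a camera dollying forward at head height.
poses = [Pose((0.0, 1.7, float(i)), (0.0, 0.0, 0.0)) for i in range(3)]
renderer = StubRenderer()
frames = run_sync_loop(StubTracker(poses), renderer, 3)
print(len(frames))  # prints 3, one rendered frame per tracked pose
```

In a production setup this loop would also be synchronized to the LED wall's refresh and split across multiple render machines, as the article notes.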

According to Chaos, with Project Arena, artists can move Chaos V-Ray assets and animations to LED walls in approximately 10 minutes and access production-quality ray tracing using pipelines they are familiar with.

“Game engines helped kick-start a revolution, but many in the VFX industry still can’t access it,” says James Blevins, co-founder of MESH and former postproduction supervisor of The Mandalorian, which spotlighted virtual production. “Project Arena takes an essential part of the VFX toolkit, ray tracing, and makes it available in a virtual production volume, straight from Maya, Houdini, or 3ds Max. No faking, no baking—just something that puts an artist’s work directly on the wall.”

Chaos began working on Project Arena about two years ago, when it saw the pieces of the solution coming together, says Phillip Miller, VP of product management at Chaos.


Arena leverages a modified version of the Chaos Vantage renderer to power the LED walls of a virtual production stage. (Vantage lets users explore and present V-Ray scenes in real time within a 100% ray-traced environment simply by dragging and dropping or live-linking V-Ray scenes.) Since Vantage supports V-Ray-authored content, any creation tool, or combination of tools, supported by V-Ray can be used to deliver content to Arena. Unlike game engine approaches to virtual production, Arena works directly with the actual 3D creation tools and their content, even at film production standards, without any special preparation to make it run in real time.

“The same artists making VFX can use their same tools, knowledge, and assets to reach virtual production,” Miller says. “Arena’s support for massive geometry (trillions of triangles) and high-fidelity conversion of V-Ray materials mean that virtual production can use the same pipeline that is producing final-frame production rendering, making virtual production far less costly.”

Miller points out that because Arena uses the Vantage renderer, its output is 100% path-traced, dynamically delivering effects such as exact reflections and global illumination with no need for baking, which greatly increases the realism of what the LED walls display. And because Arena uses a pure ray tracer, it is not hindered by geometric complexity, so original data can be used in virtual production.

“Unlike a game engine, what you see in the likes of Maya or Houdini is what you get on your LED volume,” says Miller.

(Left) Mihail Sergeev, chief technology officer, and (right) Vlado Koylazov, head of innovation, discuss Chaos’ Project Arena.

To move 3D scenes from the creation tools onto LED screens, Nvidia RTX-class GPUs are required. Most facilities use two top-end GPUs per machine to deliver the maximum performance with the most memory, notes Miller. In addition to the RTX GPUs, Arena also uses Nvidia Quadro Sync cards. Arena uses DirectX Raytracing (DXR) as its rendering API and employs DLSS with ray reconstruction for denoising.

According to Chaos, Project Arena has proven itself capable of handling large amounts of geometry. Blevins is part of a group that is currently putting Project Arena through its paces on a short film produced by Chaos. So far, the group has been able to run a quarter-trillion polygons at 60 fps on a single GPU. Chaos says it hopes to improve that performance with the addition of more shader types.

Richard Crudo, cinematographer on the short film, says Project Arena enables cinematographers to do their job more creatively, quickly, and efficiently. He adds that the toolkit delivers a more precise method of accomplishing what, up to this point, has been a generally cumbersome task.

Chaos is in the process of fine-tuning the Project Arena workflow and adding more camera tracking protocols. The company is working with early adopters this year on real-world productions and plans to make Project Arena more widely available in 2025.