Time to get hyped!

Esports and other virtual events have a loyal fan base, just as live events like the NBA or NFL have. To bring the excitement levels up a notch or two for fans and audiences within live events, sports, or esports broadcasts, producers are increasingly turning to real-time options to enhance their graphics and image content.

Karen Moltenbrey
(Source: Capacity Studios)


And Epic is showing them how they can step up their game.

Through sample projects, Epic illustrates how creators of all kinds can up the visual ante using its technology. And seeing is believing: Epic recently released the Hype Chamber, a free downloadable sample that provides insight into designing, developing, and playing out real-time animation with Unreal Engine during live events. Based on a real-world workflow from game developer Psyonix, this in-depth case study walks viewers through the entire process, from building motion graphics, animation, and a full virtual environment, to reimagining what a sports broadcast might look like.

Psyonix, which was acquired by Epic in 2019, needed a way to up its game for the annual Rocket League Championship Series (RLCS), the esports tournament built around the developer's popular vehicular soccer game. Psyonix settled on a real-time approach, which aligned well with its experience delivering real-time gaming action.

For the 10th season of RLCS in 2020, Psyonix radically altered the tournament format, moving away from league play and twice-yearly seasons to an open, event-based circuit culminating in an annual championship. Not only was the tournament growing in scope and popularity each year, but so was the prize money, and Psyonix believed the production value of the broadcasts had to grow as well.

“Working with Lamborghini, Ford, Verizon, Pele, X-Games, and others, we didn’t want to just slap a logo on the broadcast and call it a day,” said Cory Lanier, esports product manager at Psyonix. “We rebuilt each broadcast package for all of these shows and were running into workflow issues on creating new assets about every two weeks.”

With the new tournament format, there would be various sponsors and themes, and new teams would be competing weekly. Using a traditional pipeline, this would have meant a lot of additional work and rendering each time a new team was added. Not so with the real-time solution using Unreal Engine.

“With the way we’ve set up the Hype Chamber in Unreal Engine, we can quickly swap out logos and modify color palettes, and instantly have new high-quality assets ready to go for that weekend’s broadcast,” said Ellerey Gave, executive creative director at creative agency Capacity Studios, which along with esports company ESL Gaming collaborated with Psyonix and the Unreal Engine team on the new solution.

In fact, anyone on the team, graphic designer or not, can download the build, transition between more than 1,000 combinations of assets, and create high-fidelity broadcasts. Prior to implementation of the real-time system, every graphic element had to be hard-baked, which was far less efficient, according to Lanier.
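The combinatorics behind a figure like that are easy to sketch. Below is a minimal, hypothetical Python illustration of a data-driven theming table; the team names, palette names, and counts are invented for the example, not Psyonix's actual asset lists, but they show how a modest set of swappable elements multiplies into 1,000-plus ready-to-air looks.

```python
from itertools import product

# Hypothetical theming axes. In the real package these choices are
# data-driven inside Unreal Engine; the lists here only illustrate
# how a few swappable elements compound into many combinations.
teams = [f"team_{i:02d}" for i in range(16)]         # 16 team identities
color_palettes = ["primary", "alternate", "mono"]    # 3 palette variants
sponsor_skins = ["none", "sponsor_a", "sponsor_b",
                 "sponsor_c", "sponsor_d"]           # 5 sponsor treatments
intro_layouts = ["versus", "spotlight", "bracket",
                 "stat_card", "recap"]               # 5 intro layouts

# Every valid (team, palette, sponsor, layout) tuple is a broadcast-ready look.
combinations = list(product(teams, color_palettes, sponsor_skins, intro_layouts))
print(len(combinations))  # 16 * 3 * 5 * 5 = 1200
```

Because each axis is just data, adding a new team for the weekend's broadcast means appending one entry, not re-rendering a new set of baked graphics.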

Start the game engine

The Hype Chamber utilizes several Unreal Engine features, from the Blueprint visual scripting system for creating gameplay elements from within the Unreal Editor to Datasmith for importing Maxon Cinema 4D design data files natively. Sequencer, Unreal Engine’s built-in nonlinear animation editor, aided in the design of each layout and animation, while the Remote Control API allows control applications to drive the content live during the broadcasts.
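To make the last of those concrete: Unreal Engine's Remote Control feature exposes scene objects over HTTP (and WebSocket), so an external control application can change properties during a live broadcast. The sketch below builds such a request in Python; the endpoint shape follows the Remote Control HTTP conventions, but the object path and property name are hypothetical stand-ins, not the actual Hype Chamber scene.

```python
import json
from urllib import request

# Hypothetical example of a control app driving a scene property live.
# The objectPath and propertyName are invented for illustration -- a real
# project exposes its own actors and Remote Control presets.
payload = {
    "objectPath": "/Game/HypeChamber/Main.Main:PersistentLevel.TeamBanner_1",
    "access": "WRITE_ACCESS",
    "propertyName": "TeamColor",
    "propertyValue": {"TeamColor": {"R": 0.05, "G": 0.35, "B": 0.9, "A": 1.0}},
}
body = json.dumps(payload).encode("utf-8")
req = request.Request(
    "http://localhost:30010/remote/object/property",  # default Remote Control port
    data=body,
    method="PUT",
    headers={"Content-Type": "application/json"},
)
# request.urlopen(req) would apply the change in a running editor session;
# here we only construct the request a control application would send.
print(req.get_method(), req.full_url)
```

A rundown or control surface issuing calls like this is what lets operators swap team colors and logos between matches without touching the engine directly.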

The Hype Chamber started out as a motion graphics package used to introduce the RLCS teams and matches. Soon, though, the Rocket League esports team wanted to build the Hype Chamber for real, so teams could play their matches inside the virtual environment. This led to the development of a physical studio space that uses real LED screens, fed by outputs from the Unreal Engine scene, to re-create the environment on stage. Teams are stationed on either side in front of a life-sized replica of the game's default Octane vehicle body, often sporting a team skin, with the hallway between them extending out to the field.

The Hype Chamber evolved from a virtual concept to one with a physical component. (Source: Capacity Studios)


The group then had to marry the virtual Hype Chamber created in Unreal Engine and the physical set. “Since we developed an entire scene that exists in 360 degrees, we were able to map the portions of the space we wanted to feature onto a set of LED screens, creating a dynamic backdrop for the live event, which already had much of the functionality for team customization built in,” said Benji Thiem, creative director at Capacity Studios. “We further expanded on this package by including custom graphics, as well as a toolkit of video loops that could drive other smaller screens in the space.”
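The mapping idea Thiem describes can be sketched in miniature: if the virtual environment is rendered as a 360-degree cylindrical panorama, each physical LED wall displays the slice of that panorama covering its angular span. The numbers below are illustrative only; the production setup feeds the walls directly from Unreal Engine scene captures rather than from a static texture.

```python
# Toy sketch: which pixel columns of a 360-degree panorama feed one LED wall.
PANORAMA_WIDTH = 8192  # assumed pixel width of the full 360-degree texture

def wall_slice(start_deg: float, end_deg: float, width: int = PANORAMA_WIDTH):
    """Return the (start, end) pixel columns for a wall spanning the given arc."""
    to_px = lambda deg: int(round(deg / 360 * width))
    return to_px(start_deg), to_px(end_deg)

# Two side-by-side walls, each covering a 90-degree arc behind the teams:
left_wall = wall_slice(180, 270)
right_wall = wall_slice(270, 360)
print(left_wall, right_wall)  # (4096, 6144) (6144, 8192)
```

The point of the exercise is that because the whole scene exists in 360 degrees, choosing which portions to feature on the physical screens is a cropping decision, not a rebuild.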


The side-by-side curved screens are skinned with their respective team colors, creating an environmental takeover of team branding. This element flows right into the start of the game, with the camera rotating 180 degrees around to fly out the tunnel that leads to the arena. 

Sponsor branding can be incorporated as well. “Rather than the typical sponsor logo sitting over a generic background, which is prevalent across all sports broadcasts, we’re able to flood the LED walls with sponsor colors and skin the iconic Rocket League Octane car with sponsor decals,” Gave said. “Or, when it’s an automotive sponsor, we’re able to swap the actual feature vehicles into the Hype Chamber scene, which, either way, creates an impactful co-branded moment without breaking the high-quality, immersive flow of the broadcast.”

What’s more, the team is able to scale content from on-air graphics, to video walls, to complex XR stages, all within one software package. The virtual Hype Chamber can be used outside the broadcast, too, such as to create immersive team spaces—a virtual showroom of sorts for team car decals available for purchase.  

Now others can take the RLCS setup for a test drive “and take a look under the hood at how everything works,” said Gave, who believes that real-time workflow is the future. “We’re hoping that seeing it in action will help demystify the process for others.”

(Source: Capacity Studios)