In Profile: Unreal Engine

With Unreal Engine having made serious inroads into the live events industry throughout 2020, Epic Games’ Ben Lumsden and Patrick Wambold have a bold vision for the technology’s future in live entertainment. TPi’s Stew Hume reports…

Prior to COVID-19 and the hiatus on live touring, Unreal Engine was already a name that TPi were very familiar with, although usually only in the context of the real-time rendering options the system offered for content creators on live tours. However, since March last year, Epic Games’ technology has been mentioned in our pages for a number of other functions. From its use within the world of xR studios to fully virtual performances within Fortnite, Unreal Engine is making serious inroads into the live events sector. So, as we enter the second quarter of 2021, TPi thought it was only right to speak to some of those within the organisation who are actively involved in shaping what the future of live events may look like.

“Tim Sweeney, our Founder and CEO, has stated on the record that he wants to build the Metaverse,” began Ben Lumsden, Business Development Lead for Media and Entertainment, who joined TPi virtually from London, with Patrick Wambold, Solutions Architect, also calling in from the US.

“This is not just Tim’s wish, but it is our overall goal as a company. To get to a place where the virtual and real experience can mesh together seamlessly.”

With concerts and large gatherings having been impossible for over 12 months, it is not surprising that Unreal has carved out a niche for creating online worlds that simulate the live experience. Early in the pandemic, the company made headlines with Travis Scott’s in-game performance in Fortnite; then, in the summer of last year, the company’s engine was used to recreate Tomorrowland.

Yet, while the platform has been at the heart of creating spaces for fans to continue to enjoy live music, these temporary alternatives to live events are just the start of the company’s plan for the future.

“For a long time, before COVID-19, we’d already been developing the tools and solutions for this sector,” explained Lumsden, who noted DMX integration with Unreal as a prime example. “We knew if we were to be embraced by the events sector, we needed to make the system operational with a lighting console so that the fixtures that exist in the virtual world would operate in the same way as they would in a rig.”

In Unreal Engine 4.25, the company introduced initial support for connecting the engine to external controllers and devices that use the DMX protocol. Since that release, Epic has been working with a number of end users, including multimedia studio Moment Factory, to further improve Unreal’s new DMX plug-in.
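To make the plumbing concrete, here is a minimal sketch of what a console-style DMX frame looks like on the wire, assuming Art-Net (a common transport for DMX over IP, and one of the protocols Unreal’s DMX plug-in can listen for). The packet layout follows the public Art-Net specification; everything else, names included, is illustrative rather than Epic’s implementation.

```python
import socket

ARTNET_PORT = 6454  # standard UDP port for Art-Net


def artdmx_packet(universe: int, channels: bytes) -> bytes:
    """Build one ArtDMX packet carrying a single DMX512 frame."""
    if not 2 <= len(channels) <= 512:
        raise ValueError("DMX payload must be 2-512 bytes")
    if len(channels) % 2:  # the spec requires an even data length
        channels += b"\x00"
    return (
        b"Art-Net\x00"                      # fixed 8-byte packet ID
        + (0x5000).to_bytes(2, "little")    # OpCode: ArtDMX
        + (14).to_bytes(2, "big")           # protocol version 14
        + bytes([0, 0])                     # sequence (0 = unused), physical port
        + universe.to_bytes(2, "little")    # 15-bit port-address (SubUni, Net)
        + len(channels).to_bytes(2, "big")  # data length
        + channels
    )


# Set channels 1-3 of universe 0 to full and broadcast the frame;
# any Art-Net listener on the LAN can pick it up.
frame = bytearray(512)
frame[0:3] = b"\xff\xff\xff"
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.sendto(artdmx_packet(0, bytes(frame)), ("255.255.255.255", ARTNET_PORT))
```

In principle, any Art-Net-aware receiver on the same network, whether a media server or an Unreal session with the DMX plug-in enabled, can pick up such a frame and map its channels to virtual fixtures.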

Epic has also been working with a number of other partners to push the DMX integration further, such as James Simpson of Copper Candle, whose plug-in provides creative manipulation of objects and actors via a DMX lighting desk.

“With our protocol via DMX, we have been able to control 70 objects within Unreal Engine,” explained Simpson in a separate phone call.

“There is no reason you couldn’t be operating thousands of objects. This is so important because an entire industry of people who are currently out of work, but who understand the process of programming a lighting desk and cueing, can now put their skills into Unreal without even having to know how to use the Engine.”
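Purely as a hypothetical sketch of the pattern Simpson describes, and emphatically not Copper Candle’s actual plug-in code, the idea boils down to patching each virtual object to a block of DMX channels and letting every incoming frame drive its parameters; scaling from 70 objects to thousands is then just more patching:

```python
from dataclasses import dataclass


@dataclass
class DMXObject:
    """A virtual object patched to a block of DMX channels (hypothetical)."""
    name: str
    start_channel: int  # first DMX channel (1-based) within the universe

    def apply(self, frame: bytes) -> None:
        # Three consecutive 8-bit channels drive a normalised X/Y/Z position.
        x, y, z = (frame[self.start_channel - 1 + i] / 255.0 for i in range(3))
        print(f"{self.name}: pos=({x:.2f}, {y:.2f}, {z:.2f})")


# Patch 70 objects at 3 channels each: 210 of a universe's 512 channels.
objects = [DMXObject(f"object_{i + 1}", 1 + i * 3) for i in range(70)]


def on_dmx_frame(frame: bytes) -> None:
    """Called for every incoming DMX frame from the console."""
    for obj in objects:
        obj.apply(frame)


on_dmx_frame(bytes(512))  # a blackout frame snaps every object to the origin
```

In a real deployment the patching and attribute mapping would presumably live engine-side, so the programmer at the desk only ever sees ordinary fixtures and channels.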

“Along with looking for a tool for end users, more recently we have also begun to employ people like Patrick [Wambold], who have an events background, to aid Unreal as we move into this sector,” stated Lumsden. Wambold has worked in the live events and broadcast industry for the past 15 years, wearing a number of hats including automated lighting programmer, media server specialist, production manager and content producer, and is one of a number of new faces brought in at Epic to help bridge the gap.

“One of our main focuses when it comes to live events and Unreal is to make everything very familiar,” began Wambold, describing how he and the team were attempting to bridge the gap for live production crews who, especially now, are working in a virtual environment.

“COVID-19 hit so fast and suddenly many people in the industry were out of a job. The last thing we at Epic would want to do is now ask someone to become an expert in our system because, frankly, that is a huge task. However, we can provide the tool that makes it a familiar environment for LDs, where they are able to manipulate a digital rig in the same way they would in real life.”

What was clear during our chat with team Unreal, however, was that they foresee the system being much more integrated into the live events environment than just a one-off tool for creating a virtual concert. “What’s always perplexed me is when a tour will go through three or more stages in the design process, using different software tools to reach the final vision. The design created for the pitch concept is often discarded and redrawn for previs in CAD, Vectorworks or WYSIWYG, which then has to be converted when additional versions are built for media servers, Unreal, etc. That’s too much unnecessary duplication of work,” stated Wambold. “What I propose is to do your previs work in Unreal so that the content for the show can live on in a number of different formats.”

Last year, Creative Works used Unreal Engine to create a digital music video for The Dead Daisies’ Bustle And Flow, then took the assets from the video and created a bespoke first-person shooter called Daisy’s Revenge.

“I’d been looking at real-time rendering for a long time with the idea that it could really improve the service to our clients,” stated Dan Potter of Creative Works. “Working with bands like Guns N’ Roses in the arena setting helps to improve our knowledge of Unreal’s real-time rendering. From there we have really specialised in creating the hyper visuals that can then be reused in any format.”

Potter went on to explain how doing all the foundational work of visual creation made the move into the gaming world for The Dead Daisies project seamless. “We were still managing the visuals but just collaborating with different artists to help us execute the ideas for the game,” he stated.

Lumsden added that using Unreal as a previs tool also meant that artists and management would be able to jump into the virtual space and see the design of an upcoming tour replicated, giving final notes and suggesting tweaks without the need for a powerful computer or expensive previs software, with changes made to the designs in real time.

“We already have a number of clients, from the EDM world especially, who have embraced this workflow,” he enthused.

Prior to this point in our chat, many of the examples the Unreal team had discussed were projects that existed purely in a digital realm, but Lumsden explained how some Unreal users had already begun to embrace a hybrid style of performance – specifically the Royal Shakespeare Company’s latest adaptation of A Midsummer Night’s Dream, titled simply Dream.

The online event saw actors in a state-of-the-art studio have their movements captured and translated into the virtual world, watched by an enraptured global audience.

To close, TPi asked both Wambold and Lumsden what role Unreal Engine will play within the events sphere in the next 12 months. “One recurring comment I always hear from clients is that there is never enough time when building a show,” mused Wambold. “I think Unreal’s ability to work collaboratively on a virtual session, in real time, is almost the holy grail of show design.”

Agreeing with Wambold, Lumsden added that Unreal’s roadmap into this space means “things will only get better”. He explained how collaborations with the likes of disguise have helped secure Unreal’s place in the market.

disguise recently announced new features for its Extended Reality (xR) software, an upgrade which enhances its integration of Unreal Engine and addresses the continued demand for immersive content of the highest quality, detail and frame rate across xR and Virtual Production.

All we can say from TPi’s perspective is that it’s exciting to see how Unreal Engine’s wealth of knowledge from the gaming world will manifest in the performances of the future – both live and virtual. Game on!

This article originally appeared in issue #260 of TPi, which you can read here.

www.unrealengine.com