Behind The World’s First XR Jazz Show

Animatrik founder Brett Ineson reveals how his team at Shocap Entertainment harnesses mocap and XR technology to produce a unique virtual performance.

Vancouver-based Animatrik is one of the largest dedicated performance capture and virtual production studios in the world – facilitating motion capture shoots for huge video games, such as Gears of War, and blockbuster movies, such as Avengers: Endgame. Over the past year, the studio has pivoted towards virtual production and live virtual performances, with founder Brett Ineson co-creating Shocap Entertainment to produce original XR music shows, including the world’s first XR jazz show on BBC Click.

Produced by Shocap Entertainment, the unique virtual performance saw famed jazz singer Jill Barber perform a Christmas setlist in an entirely virtual Palomar Supper Club with a digital band. The venue was digitally resurrected in CG to house the band and facilitate a transatlantic interview between BBC presenter Paul Carter and Barber.

Ineson oversaw the deployment of mocap for the band. This involved placing motion sensors on instruments, rearranging the band setup, synchronising motion data with audio recordings and CG, and handling finer details, such as capturing the drums with cameras while avoiding their reflective sheen. This data then needed to be synchronised and presented live, all while ensuring the musicians were comfortable and able to perform properly.
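The article doesn’t go into the sync pipeline itself, but the core idea – lining up mocap frames and audio against a shared clock – can be sketched in a few lines. The frame rates, timecode values and function names below are illustrative assumptions, not details from the production:

```python
# Minimal sketch of timecode-based alignment between mocap frames and an
# audio recording. All names, rates and values are assumptions for
# illustration, not details from the Shocap production.

from dataclasses import dataclass

MOCAP_FPS = 120            # assumed mocap capture rate
AUDIO_SAMPLE_RATE = 48000  # assumed audio sample rate

@dataclass
class Timecode:
    hours: int
    minutes: int
    seconds: int
    frames: int
    fps: int

    def to_seconds(self) -> float:
        whole = self.hours * 3600 + self.minutes * 60 + self.seconds
        return whole + self.frames / self.fps

def mocap_frame_to_audio_sample(frame_index: int,
                                mocap_start: Timecode,
                                audio_start: Timecode) -> int:
    """Map a mocap frame index to the matching audio sample index."""
    # Absolute time of this mocap frame on the shared timecode clock.
    t = mocap_start.to_seconds() + frame_index / MOCAP_FPS
    # Offset into the audio file, then convert to samples.
    offset = t - audio_start.to_seconds()
    return round(offset * AUDIO_SAMPLE_RATE)

# Example: mocap started at 10:00:05:00, audio rolled at 10:00:00:00.
mocap_tc = Timecode(10, 0, 5, 0, fps=30)
audio_tc = Timecode(10, 0, 0, 0, fps=30)
print(mocap_frame_to_audio_sample(0, mocap_tc, audio_tc))  # -> 240000
```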

Animatrik’s Vancouver studio was set up with 70 OptiTrack cameras by NaturalPoint to capture the musicians, who were all wearing motion capture suits. Unreal Engine programming was harnessed to create a cueing system, managed by Shocap Entertainment’s Athomas Goldberg. The show required both full motion capture performers and simulcam XR, with Ncam technology used for the simulcam.
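The cueing system itself was built in Unreal Engine and isn’t described in detail in the article; as a loose stand-in, the sketch below shows the general shape of a cue list fired against a show clock. The cue names, timings and the Python implementation are all assumptions for illustration:

```python
# Conceptual sketch of a show cue list: timed scene events fired against a
# running show clock. The real system was built in Unreal Engine; this
# Python stand-in only illustrates the idea, and every cue is invented.

import time
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass(order=True)
class Cue:
    at_seconds: float                       # when to fire, relative to show start
    name: str = field(compare=False)
    action: Callable[[], None] = field(compare=False)

def run_cue_list(cues: List[Cue]) -> None:
    """Fire each cue when the show clock reaches its timestamp."""
    start = time.monotonic()
    for cue in sorted(cues):
        delay = cue.at_seconds - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        print(f"[{cue.at_seconds:6.1f}s] {cue.name}")
        cue.action()

cues = [
    Cue(0.0, "House lights down", lambda: None),
    Cue(2.5, "Spawn band avatars", lambda: None),
    Cue(5.0, "Start song one playback", lambda: None),
]
run_cue_list(cues)
```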

“We used GIANT software to retarget the characters in real time. There were also several static cameras recording the performance, alongside one close-up camera using Ncam to track all of Jill’s movements,” Ineson said, explaining how accurate compositing kept Jill Barber’s image synchronised with the virtual plate at all times during moving camera shots.
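To illustrate why the camera tracking matters for this kind of compositing: if the tracked pose of the physical camera drives a virtual camera with matching intrinsics, the CG plate projects to the same pixels the real lens sees, so the keyed live image stays registered. The pinhole model, matrix values and point positions below are invented for the example and are not taken from the production:

```python
# Minimal sketch of simulcam registration: tracked camera extrinsics plus
# matching intrinsics project virtual set geometry to the pixels the real
# lens would see. All numbers here are made up for illustration.

import numpy as np

def project(points_world: np.ndarray,
            K: np.ndarray,
            R: np.ndarray,
            t: np.ndarray) -> np.ndarray:
    """Project Nx3 world points to pixel coordinates with a pinhole model."""
    cam = R @ points_world.T + t.reshape(3, 1)   # world -> camera space
    px = K @ cam                                 # camera -> image plane
    return (px[:2] / px[2]).T                    # perspective divide

# Assumed intrinsics (focal length / principal point in pixels).
K = np.array([[1800.0, 0.0, 960.0],
              [0.0, 1800.0, 540.0],
              [0.0, 0.0, 1.0]])

# Tracked extrinsics for one frame: no rotation, camera 4 m back.
R = np.eye(3)
t = np.array([0.0, 0.0, 4.0])

# A corner of the virtual supper-club stage, one metre left of centre.
stage_point = np.array([[-1.0, 0.0, 0.0]])
print(project(stage_point, K, R, t))  # pixel where the CG renderer must draw it
```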

Because Barber was recorded live and the musicians were rendered as CG assets, the team could offset the musicians in the space and set things up in a way that would be counterintuitive to any natural live setup. “The musicians were placed on the other side of the room facing Jill, rather than behind her, as their CG assets appear in the CG performance,” Ineson said, highlighting that this made it far easier to key Jill out against the green screen during the performance. “It had a unique effect on the dynamics of the performance – the band could take direct cues from her in ways that would have been impossible if the musicians were physically positioned behind her.”
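Conceptually, this kind of spatial offset is just a rigid transform applied to the captured data before it drives the CG assets. The sketch below shows the idea – a yaw rotation plus a translation on each musician’s captured root position – with all numbers and names invented for illustration:

```python
# Rough illustration of repositioning captured performers in the virtual
# venue: a 180-degree turn about the vertical axis plus a translation maps
# each musician's captured root position to where their CG counterpart
# should stand behind the singer. Positions and offsets are invented.

import numpy as np

def reposition(captured_xyz: np.ndarray,
               yaw_degrees: float,
               offset_xyz: np.ndarray) -> np.ndarray:
    """Apply a yaw rotation and translation to captured positions (Nx3, Y up)."""
    yaw = np.radians(yaw_degrees)
    rot = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                    [0.0,          1.0, 0.0],
                    [-np.sin(yaw), 0.0, np.cos(yaw)]])
    return captured_xyz @ rot.T + offset_xyz

# Captured band positions on the physical stage, facing the singer.
band = np.array([[1.0, 0.0, 2.0],
                 [-1.0, 0.0, 2.0]])

# Turn them around and place them 3 m behind the singer in the CG venue.
print(reposition(band, yaw_degrees=180.0, offset_xyz=np.array([0.0, 0.0, -3.0])))
```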

The instruments also had to be motion tracked, with special considerations made when tracking the drum kit in particular. “The cymbals, for instance, have a strong reflection, which can potentially disrupt the motion capture cameras’ ability to track the movement, so we needed to mask the particularly reflective parts,” Ineson said, noting that marker positioning was key so as not to significantly change the sound of the kit during recording. “Recording visuals, sound and motion capture simultaneously requires extra consideration,” he remarked.

Despite the success of the venture, Ineson believes that while livestreaming has its place for the here and now, performers will still desire an in-person audience when it is safe to do so. “The technological adaptations using mixed reality and livestreaming are definitely here to stay – they will likely expand what’s possible, rather than replace the live experience every musician and fan knows and loves,” Ineson theorised. “It’s about finding a way to bridge the gap – there’s the potential to have virtual hybrids, using the technology to enhance the in-person experience while providing remote experiences simultaneously.”

THE GAMIFICATION OF LIVE EVENTS

Notwithstanding the COVID-19 pandemic, Animatrik’s work is and always has been live – the only difference is the presence of an in-person audience. “Real-time visualisation is an important tool in the production of films and games. In those cases, the audience is only around 50 behind-the-scenes people at a time. Live events are business as usual in many ways, though the cost of failure is infinitely greater with a large consumer audience on the other side,” Ineson said, noting the event sector’s newfound focus on the same real-time technology used for video games and movies, now applied to live and virtual experiences. “It’s a trend that was definitely growing ahead of the COVID-19 pandemic but has since exploded in its use.”

Not necessarily married to any brand name or piece of kit, Animatrik constantly seeks out and evaluates new and emerging technologies. At the time of writing, OptiTrack cameras and Lightstorm Entertainment software feature on both the team’s LA and Vancouver stages. “We have further developed our own technology to manage the production and utilisation of this hardware and software,” Ineson remarked.

This year, Animatrik added a large LED wall from Promosa – something that is commonplace for virtual production shoots in broadcast and live events, Ineson assures. “We hosted CBC’s NYE Countdown to 2021 performance with Canadian singer-songwriter Lights. The artist performed on stage with Felix Cartal and generated a range of visual effects and lights using the LED wall behind her,” Ineson said, adding that the same wall was also used to generate live CG visuals for a mixed reality circus performance.

“A lot depends on how the entertainment industry takes shape in the coming months,” Ineson said, responding to whether he believes the focus on live music XR experiences will continue post-pandemic. “If there are further restrictions on live entertainment, there could be more opportunities to conduct virtual concerts.”

However, as things open up again, Ineson expects a higher chance of hybrid models that take aspects of virtual setups and integrate them into in-person performances. For Animatrik, 2021 will see the continuation of online productions.

“There’s some pent-up demand that we are getting ready to engage. We have some more simulcam-style shows to develop, as well as many straight-up performance capture projects, across film, live performance, and video games,” Ineson concluded.

“It’s great to be back on set on a regular basis and we’re excited to share news on a lot more projects soon.”

This article originally appeared in issue #265 of TPi, which you can read here.

www.animatrik.com