When internationally renowned composer Hans Zimmer was looking to create the visual elements of his world tour, it was clear that the production required a brand new approach. Zimmer called upon award-winning theatrical projection designer Peter Nigrini to create something unique for the tour, and asked events specialist Pixway to help them bring it to life with disguise-powered technology.
Alongside his career as perhaps the most prolific and recognised film composer working today, Hans Zimmer is also known for his immensely popular global tours, which offer an accessible way for music fans to enjoy his compositions performed by a full orchestra. For his 2022 European tour spanning 24 cities, Zimmer was interested in creating an experience for his audiences that had a distinctly theatrical sensibility.
“I think the reason why Hans was drawn to work with stage designer Derek McLane, lighting designer John Featherstone, and me, was that he recognised the need to build his audience a visual world,” said Nigrini. “Providing each piece of music with a visual identity was first and foremost, but we also needed to meet this challenge while integrating live cameras to serve the audiences in the arena-sized venues.”
Nigrini was completely won over by the possibilities disguise’s tech opened up to his production. “Anytime someone gives me a chance to do something new, when someone says they haven’t attempted something before, that sounds like a reason to charge ahead. I’ll try anything if I know I have partners like those at Pixway and disguise to bring my vision to life.”
Creating the look of the production, Nigrini was adamant that the live video footage should look nothing like that seen at other touring concerts. He worked with Berlin-based creative studio Pixway, who are specialists in the disguise workflow, to find a solution that would differentiate their show from others. One of the ideas that the team settled on involved processing and treating all of the live camera outputs so that footage for different suites felt aesthetically linked to the films the music originated from.
Part of this entailed colour grading the footage live as it was being shot, as well as applying other Notch processing built by Emery Martin. This allowed the live camera footage for each track to be harmonised with the colour palette of its respective film, so that audiences could subconsciously link each piece to films they knew and recognised.
It was this process that led to Pixway, who have a long history of partnering with disguise on a wide range of projects from live and corporate events to xR production, to suggest using disguise’s RenderStream workflow powered by two disguise rx real-time render nodes and two disguise vx 4 media servers to bring the vision to life.
With disguise’s RenderStream workflow previously used only for in-studio filming and broadcast, Pixway knew it was a bold choice to deploy it in a live concert setting – let alone a worldwide arena tour. The risk paid off for the team, delivering an amazing experience.
“There were multiple ways we could have achieved the solution we needed for these cameras,” said Nigrini. “RenderStream is a tremendous asset at this level of production. And because we had expert help from Pixway and disguise, we could look at the half dozen different ways to get there and pick the most appropriate one for the problem in front of us.”
The power and reliability of disguise’s tech was as important to keeping the workflow simple as RenderStream itself, says Nevil Jeremias of Pixway. The team needed minimal equipment to run the entire show. They were able to use disguise vx and rx machines to power all of the live processing and carry the show.
“Fewer machines are much more efficient,” said Jeremias. “They allow for more space behind the stage, a less complex setup, and other conveniences for touring. Being able to use fewer machines just makes the setup more reliable, essentially.”
“Being able to offload the Notch rendering to the rx machines kept enough processing power on the vx machines for other parts of the show, such as automation and video playback. Workflow-wise, the processed Notch IMAGs could be used as easily as the other video inputs, which sped up programming immensely. With this technology we could send all 14 cameras to the render node to process them in packs of four. Texture sharing is extremely fast and only adds a few frames to the round trip, so it integrates seamlessly into existing environments,” Jeremias added.
The team also used the project to beta test the latest disguise r21 software release, launched earlier this year. The release brought significant improvements to the RenderStream UI and new tools for colour management, following feedback and disguise’s close collaboration with its beta testers and Live and Colour Insider groups.