MIYAVI Virtual Level 5.0: Synthesis

Japanese rockstar MIYAVI teams up with Amazon Music for an exclusive free performance broadcast on the Amazon Music app. TPi’s Jacob Waite discovers how the creative team merged ground-breaking technology and artistry to create a unique virtual live performance series…

After premiering on 28 December 2020 on Amazon Music’s Twitch channel, MIYAVI Virtual Level 5.0: Synthesis was re-released on Amazon Music on 26 March, available to stream as individual clips from the show or as a playlist, similar to a visual album. An evolution of the traditional live format, the show reflects the eagerness of MIYAVI and his creative production team to continue exploring the ways that technology and music coexist to create what may become the future norm for live performances.

Driven by the opportunity to redefine live production around a mindset of sustainability, the creative production team pushed the bounds of collaboration, with almost all of the personnel involved working in tandem from their homes in the United States, Japan, and Italy. For MIYAVI, the show provided an opportunity to speak out on the biggest issues facing the world today.

He stated: “This year, the unprecedented spread of diseases has transformed the global economy we had become comfortable in – as a result, it has become a year for musicians to reconsider the message we convey in our music. Climate change, refugee issues, hunger, poverty, inequality, and pandemic: look around. The world is on fire. How can we deal with the global problems we face? How can we commit to the future of the planet through music and art? There may be a limit to what we can do, but we may find a new way of life by fusing with technology.”

MIYAVI worked in partnership with Director Annie Stoll; Pyramid3 EP and Creative Director Dylan Jong; Japanese production companies Mothership Tokyo and CyberHuman; and American virtual production studio Pyramid3. The performance took place within computer-generated sets developed in Unreal Engine by artist and technologist David Cihelna of Pyramid3, who previously created MIYAVI’s music video, Need for Speed.

“Need for Speed was made entirely in a game engine, so fans expected to be immersed in similar surreal worlds during Virtual Level 5.0: Synthesis, with MIYAVI’s signature energy as a performer,” Cihelna said of the artist. “MIYAVI can make any room come to life with his unique movements and guitar style. Getting fans together in a single space to watch and share their fandom is always an incredible experience.”

Cihelna handled virtual production and environment design – curating technical direction with the Japanese virtual studio team and directors. “I worked via the company I founded, Pyramid3, to deliver all the Unreal Engine environments, visual effects, brainwave EEG interpretation with our EEG artist and technical direction,” he clarified.

Cihelna established the design studio at the height of lockdown in early 2020, and Pyramid3 is built to run remote, international projects. “We’ve actually never been in the same room together. Because we use game engines, much of our production can be done remotely,” Cihelna noted. “The biggest impact has been the amount of time spent on Zoom or Google Meet – even directing on set is remote, which felt more like an airport control room than a physical production.”

Cihelna believes the connectivity tools we use today – from Slack to Zoom – were built for a world prior to the outbreak of COVID-19: as tools for remote meetings, rather than as an alternative to in-person work. “As such, they lack spontaneity and don’t integrate into the way we work at all,” he said. “We spent a lot of time sharing screens while working and allowing various members to remote-control screens to solve small issues like picking specific colours.”

Differing time zones added another layer of complexity to proceedings. “The livestream lasted until early morning US Pacific time. By the end of the project, everyone on the team was used to working across cultures and countries, which made things a lot easier,” he remarked. “Most constraints in this type of production come in the form of timeline – it’s important to note that the more time you have, the more you can test out new innovations in performance styles and methods.”

The event ran in Unreal Engine in real time at above 24fps on set. A virtual studio in Japan featured MIYAVI on a green-screen set with three Blackmagic Micro Studio 4K cameras – A, B and C – and stage lighting that matched the team’s project in Unreal Engine 4.24. “We received an accurate 1:1 3D scan of the studio space beforehand and built our environments with it in mind. This meant we knew exactly where trusses existed in real life and could place virtual lights in the right positions,” Cihelna explained. “MIYAVI was keyed and tracked in real-time on set and the two images composited together in real-time, then fed into Twitch as a standard livestream.”
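For readers curious what that real-time step looks like, below is a minimal Python sketch of green-screen keying and compositing, using OpenCV as a stand-in for Zero Density’s real-time keyer. The file names and colour thresholds are illustrative assumptions, not values from the production.

```python
# A minimal sketch of keying a performer off a green screen and compositing
# them over a CG render, as the team describes. OpenCV stands in for the
# real-time keyer; thresholds and file names are assumptions.
import cv2
import numpy as np

def key_and_composite(camera_frame: np.ndarray, virtual_set: np.ndarray) -> np.ndarray:
    """Replace green-screen pixels in the camera frame with the virtual set."""
    hsv = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2HSV)
    # Strongly green pixels (hue ~60 in OpenCV's 0-179 range) become background.
    green_mask = cv2.inRange(hsv, (35, 80, 80), (85, 255, 255))
    # Soften the matte edge so the blend avoids a hard cut-out look.
    matte = cv2.GaussianBlur(green_mask, (5, 5), 0).astype(np.float32) / 255.0
    matte = matte[..., None]  # broadcast the single-channel matte over BGR
    composite = virtual_set * matte + camera_frame * (1.0 - matte)
    return composite.astype(np.uint8)

if __name__ == "__main__":
    # Assumed inputs: one camera frame and one Unreal render of the same size.
    cam = cv2.imread("performer_greenscreen.png")
    cg = cv2.imread("unreal_set_render.png")
    cv2.imwrite("composite.png", key_and_composite(cam, cg))
```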

The event was built entirely around Unreal Engine in two pieces: the virtual ‘sets’ and the live production – both rendered in real-time like a video game. The sets were built in Unreal Engine as game levels. “We had scouting sessions to pick suitable spots for MIYAVI and tested out different camera positions, colour and environments before reaching the studio,” Cihelna recalled, explaining that the team prototyped 12 different environments before settling on the final select few.

The second step involved a custom build of Unreal Engine by Zero Density called ‘Reality Engine’. “Since both are built on the same version of Unreal, we could easily switch between the on-set version and release version to update sets on demand,” he said. “The studio in Japan used Reality Engine to track and blend virtual and real sets – from real-time camera tracking and light matching to keying MIYAVI on a green-screen stage.”

Every scene and element in the engine needed to be performance-optimised to account for the rendering, VFX, post-processing, keying, camera tracking and compositing all being done in real-time.
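A back-of-envelope calculation shows why: at 24fps, all of those stages share roughly 42 milliseconds per frame. The per-stage costs in this Python sketch are assumptions for illustration, not measurements from the show.

```python
# Rough frame-budget arithmetic for a real-time virtual production pipeline.
# Every stage must finish within one frame's time to hold the target rate.
TARGET_FPS = 24
frame_budget_ms = 1000.0 / TARGET_FPS  # ~41.7 ms per frame

stage_costs_ms = {  # hypothetical per-stage costs, not measured values
    "rendering": 22.0,
    "vfx_and_post": 8.0,
    "keying": 4.0,
    "camera_tracking": 2.0,
    "compositing": 3.0,
}

total_ms = sum(stage_costs_ms.values())
print(f"budget: {frame_budget_ms:.1f} ms, spent: {total_ms:.1f} ms, "
      f"headroom: {frame_budget_ms - total_ms:.1f} ms")
# With numbers like these, any extra feature has to fit in a few milliseconds
# of headroom - one reason heavier techniques can push a pipeline over budget.
```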

“We initially set out to use ray tracing in Unreal Engine to improve render quality. After some initial tests, we discovered that we needed way more computing power than we had available at the time. I think it’s definitely possible with some of the newer GPUs and CPUs that just came out, so we’re looking forward to using them next time,” he explained.

While a virtual audience doesn’t replace the energy and excitement of a ‘traditional’ live event, the possibilities available to clients for creating bespoke interactive experiences are improving each day. “I think younger audiences and fans are just as excited about virtual interactive and livestream events as they are about ‘traditional’ live events,” Cihelna theorised. “Interactive performances are a great way to stimulate our imaginations and find new ways to provide fans with unique digital moments and connections. Sharing events like this one is critical in a time when people yearn for social connection.”

Cihelna believes that growing up with instant messaging, video games and endless virtual content means that the lines between in-person and online are completely blurred. Chats on Twitch are just as filled with energy and life as crowds at festivals and concerts.

“The biggest realisation clients can make in this era is that they aren’t constrained by physical limitations when doing virtual events – anything is possible. Want your artist to float around a giant moonscape while fans from around the world are joining to watch on TikTok? Sure! That opens up an entire new world of interactivity, fun and social engagement – worlds we’ve never seen before in live entertainment,” he added. “This space is more exciting than ever and we’re just scratching the surface of what can be created.”

‘MUSIC CONNECTS US BEYOND TIME, SPACE, LANGUAGE AND EXPERIENCE’

“We developed the concept of the show around MIYAVI’s wish to highlight the urgency of climate change, and worked with MIYAVI to create a setlist and corresponding visual worlds that take him through a narrative that begins in an apocalyptic burning city and moves through a solitary reckoning, internal transformation, and then outward healing,” Stoll and Jong explained.

All of this was created without the entire team being in the same location. “I was on the East Coast and everyone else was West Coast or Japan, so I operated on Japanese time for a week in order to be on call for all the rehearsals and the event itself,” Stoll reported. “The project was a great creative challenge, which the entire team embraced. I think we need constraints as they help us to solve problems and be creative beyond our limits. This leads to innovation.”

Jong’s biggest challenge was the sheer number of moving parts, as well as a new workflow for the creative team and the artist. “There were voices from the client end at Amazon, MIYAVI and his label, our producers at Mothership, the team at CyberHuman Studios where the physical production took place, our team at Pyramid3 who handled the virtual art department, sound and music, and language, time and geographical barriers to boot,” he recalled. “We had to get everyone on the same page for every step, and Mothership handled it like champs. Needless to say, there were a lot of weekly late night video calls!”

Despite the litany of logistical challenges, the directorial duo believe that it is integral to create interactive performances and keep music alive during this difficult time. “Music connects us beyond time, space, language, and experience. We need to be connected to each other now more than ever – the virtual world allows us to remain connected in new ways and stay safe during the pandemic. Music also gives us a lot of comfort and inspiration and during these times, that’s a lot,” Stoll explained.

“My favourite moment of this project was witnessing the fans interact with MIYAVI and engage with and embrace the technology. It was really inspiring to see cross-cultural teams working together around the world to connect music with fans. I hope we all keep learning and evolving as technology progresses.”

Jong, who plans to build a WebGL virtual album listening experience and make progress on a vinyl art project amid the lockdown, added: “Artists and clients have been more receptive to new ideas and virtual mediums during this time, and the adaptability and resilience is inspiring.”

Stoll and Jong also collaborated with EEG artist Bora Aydıntuğ to collect and interpret MIYAVI’s brain waves while he meditated on the future of the world. These visualised brain waves appear during the performance, where MIYAVI calls upon his fans to join him in healing the world with their united wills. By creating space for this conversation within an interactive medium, the performance highlights the ability of humans to empower each other to take actions within our community that can expand into a worldwide movement.

“As professionals, we can sometimes get too caught up in execution, camera movements, colour, transitions, et cetera, and it’s always a wonder to see the final piece come together through the eyes of the fans,” Jong commented. “I was thrilled to be able to bring on Bora, the EEG artist who we collaborated with to interpret MIYAVI’s brain waves.”

Cihelna was equally excited by the integration of EEG brainwaves into the live performance. “It took a lot of R&D to figure out how to turn raw brain waves into usable data in 3D space, but we developed a method using formulas that turn the raw data into animation data that ultimately controlled a particle system in Unreal Engine. The result looked great and brought with it an intriguing story and conceptual exploration.”
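The production’s exact formulas aren’t public, but a plausible sketch of the approach Cihelna describes is to measure band power in short windows of raw EEG samples and smooth it into a control signal a particle system can read. The sample rate, frequency band and mapping in this Python sketch are assumptions.

```python
# A hedged sketch: raw EEG samples -> relative alpha-band power -> smoothed
# animation value that could drive a particle emitter. All constants are
# assumptions; this is not the production's actual mapping.
import numpy as np

SAMPLE_RATE = 256          # Hz; a typical consumer EEG headset rate (assumed)
ALPHA_BAND = (8.0, 12.0)   # alpha waves, commonly linked to calm, meditative states

def alpha_power(window: np.ndarray) -> float:
    """Fraction of the window's spectral power inside the alpha band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    in_band = (freqs >= ALPHA_BAND[0]) & (freqs <= ALPHA_BAND[1])
    return float(spectrum[in_band].sum() / spectrum.sum())

def smooth(values, k=0.2):
    """Exponential smoothing so the driven particle system doesn't flicker."""
    state, out = values[0], []
    for v in values:
        state = k * v + (1.0 - k) * state
        out.append(state)
    return out

# Demo on synthetic noise standing in for one-second windows of raw EEG.
rng = np.random.default_rng(0)
windows = [rng.normal(size=SAMPLE_RATE) for _ in range(10)]
emission_rates = smooth([alpha_power(w) * 5000 for w in windows])  # particles/sec
print([round(r) for r in emission_rates])
```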

‘CONCEPTUAL EXPLORATION’

As Founder and Technical Director of Pyramid3, Cihelna strives to push artists and clients into using up-and-coming technologies to create new experiences for their fanbase. “We’re going to continue to work on projects that blend virtual and real using game engines, machine learning, DIY motion capture and more.”

Having successfully merged ground-breaking technology and music to create a virtual live performance series, Cihelna is confident that the skills and expertise pioneered on this project will be harnessed in the future, even once live shows return to the masses.

“I don’t think this type of performance will stop when in-person events return. I think the market will boom over the next 10 years. The benefit is both creative and commercial – you can design any world or interaction you like, but can also get anyone around the world to join in at any time,” he said, noting that virtual shows expand the market for live performance, providing access to fans in all corners of the globe. “We’d love to bring more interaction into the performance by bringing game mechanics into the show – rather than live-streaming video, we could create an actual multiplayer game for the events that fans can join as players or characters.”

This article originally appeared in issue #260 of TPi, which you can read here.

Photos: Nagisa Kamiya

www.myv382tokyo.com

www.pyramid3.studio