Jean-Michel Jarre: Welcome to the Other Side

The city of Paris commissioned electro-pioneer Jean-Michel Jarre to produce a virtual show to welcome in 2021. With the event viewed by 75 million people via VR headsets, TV coverage and various social streaming platforms, TPi’s Stew Hume speaks to the creative team behind this pioneering project…

It’s a few weeks into 2021 and TPi is chatting over Zoom to a key creative involved in one of the largest virtual gigs ever assembled, speaking from the comfort of his home in the Scottish Highlands. Instead of merely discussing Jean-Michel Jarre’s latest pioneering stunt, Lighting and Show Designer Jvan Morandi begins by discussing an online video game, Blitz War, which he and his friends had been playing into the early hours of the previous morning. There is a very good reason for this: according to Morandi, there is something that industry peers, promoters and artists can learn from all-night gaming sessions.

“When it comes to virtual shows, it’s not the amazing effects or impressive CGI that keep an audience’s attention – the show has to be engaging, otherwise you’ll switch off after 15 minutes,” he commented. “There is a reason why my friends and I stay up for hours playing a rather rudimentary 8-bit video game. It’s because we are participating in something.”

Let’s backtrack for a second. Last year, instead of putting on its traditional New Year’s Eve celebration, the city of Paris commissioned Jean-Michel Jarre to create a virtual event to welcome in 2021, with viewers able to watch from the safety of their home. According to Sony Music International, the event has now garnered more than 75 million views across social media platforms including Facebook, VRC, Weibo and TikTok.

The concert lasted 55 minutes and took viewers into a virtual experience built around a digital replica of Notre Dame Cathedral. The performance could be viewed via a VR headset, with gig-goers able to move around the virtual Notre Dame and interact with the surroundings. Alternatively, viewers could tune into a live cut of the virtual show, which spliced real-time footage of the artist in his studio with an animated avatar of Jarre performing within the 3D model of Notre Dame.

“This entire show came from the mind of Jean-Michel,” explained Morandi. “It began as an experiment, discovering how a virtual show could be more dynamic and engaging.”

With COVID-19 dampening any prospect of a traditional New Year’s Eve celebration, the pressure was on for both the artist and a small team of creatives to turn this experiment into ‘the’ national celebration. Jarre approached Morandi in the early stages, aware that the LD had already invested a great deal of time honing his skills with gaming engines.

“I have been getting my head around the technology throughout 2020,” he reported. “Jean-Michel knew I was working on this and asked if I could bring the speed and efficiency of the live events industry to help develop this show so it wouldn’t just look like a video game.”


Before working out how to create a show within the gaming engine, Morandi and the team went through a number of conceptual ideas, all of which he designed in Capture. Morandi joined forces with Producer Louis Cacciuttolo of VRroom and 3D Animation Artist Vincent Masson of VMD. 3D Modelling and Unity Game Engine specialists Lapo Germasi and Victor Pukhov from Manifattura Italiana Design, alongside Unity Scripting and Tech Supporter Antony Vitillo, were also key to the creation of the show.

“The team from VRroom have a huge amount of experience when it comes to VR and were incredibly helpful converting the ideas we had in Capture and bringing them to the Unity Engine,” stated Morandi.

Despite working in this new format, Morandi was able to use some typical tools of the trade in the creation of the show – namely a ChamSys MQ80. “All the cue lists came from ChamSys and were translated into a set of Unity animation triggers on a timeline. When I say ‘translated,’ I mean that Victor Pukhov used the visualised lighting cues to create ‘shader animations’ that were then triggered by custom Unity scripts written by Antony Vitillo,” he recalled.
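Conceptually, the translation Morandi describes turns console cues into time-stamped triggers that a game-engine timeline can fire. As a loose illustration only – the names and data shapes here are hypothetical, not the actual production pipeline – the idea might be sketched like this:

```python
# Hypothetical sketch: translating exported lighting cues into
# time-ordered trigger events for a game-engine timeline to fire.
from dataclasses import dataclass

@dataclass
class LightingCue:
    time: float        # seconds into the show
    fixture: str       # e.g. "beam_group_1"
    action: str        # e.g. "fade_in", "strobe"

def cues_to_triggers(cues):
    """Sort cues by time and emit (time, trigger-name) pairs,
    the shape a timeline scheduler could consume."""
    return [(c.time, f"{c.fixture}.{c.action}")
            for c in sorted(cues, key=lambda c: c.time)]

show = [
    LightingCue(12.0, "beam_group_1", "fade_in"),
    LightingCue(4.5, "laser_array", "sweep"),
]
print(cues_to_triggers(show))
```

In the real show the trigger side was handled by custom Unity scripts rather than anything like the above; the sketch only shows why a console cue list maps so naturally onto a timeline of events.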

He was quick to point out that the basic principles of stage and show design still apply when creating a show in the virtual realm, along with some notable differences – specifically in how you have to treat effects lighting. “In a gaming engine, stage light beams are not real light but volumetric shaders: 3D objects with a texture.”
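One way to picture how a textured cone mesh can mimic a real moving light is to convert a fixture’s pan and tilt values into the direction the mesh should point. The following is a hypothetical sketch – the angle convention and function name are assumptions for illustration, not the show’s actual maths:

```python
import math

def beam_direction(pan_deg, tilt_deg):
    """Convert a moving-head's pan/tilt (in degrees) into a unit vector
    along which a volumetric cone mesh could be oriented.
    Assumed convention: tilt 0 points straight down at the stage."""
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    return (math.sin(tilt) * math.cos(pan),
            math.sin(tilt) * math.sin(pan),
            -math.cos(tilt))

# A beam panned 90 degrees and tilted 45 degrees leans along the y axis:
print(beam_direction(90.0, 45.0))
```

Animating such a mesh frame by frame is what replaces the physics of a real beam – hence the need, described below, to hand-mimic every programmed move inside the engine.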

This meant that to create any big lighting looks, Pukhov of Manifattura Italiana Design had to mimic the moves Morandi had programmed on the ChamSys desk. “This whole project has hinged on finding ways of clear communication, which is not easy because of the often different workflow styles of show business and game development,” he commented, recalling the many months it took to bring the show to life. “A great deal of credit has to go to Louis Cacciuttolo from VRroom, our French VR Producer, and Antony Vitillo of NTW, the Italian developer who looked after all the scripting and game engine functionality. Jonathan Klahr did an amazing job on the 2D video content mapped onto the interior walls. Stephan and Jeroen from Amsterdam-based LaserImage programmed the initial laser sequences,” he added.

“[360° VR Camera Director] Georgy Molodtsov, [VR Camera Director and Editor] Maud Clavier and [Live Broadcast Director] David Montagne did a great job filming the show all in VR.”

If having a number of creatives dotted around the UK and Europe wasn’t impressive enough, one of the camera directors, Georgy Molodtsov, operated from Moscow, while Maud Clavier worked from Paris. Each director oversaw up to eight remote VR cameras and drones within the virtual world.

Responsible for inserting Jarre into the virtual world was TV Studio Gabriel in Paris, where the artist’s performance was captured live with motion capture technology and mirrored by an in-show avatar created by Freddy Kone and Mo Marouene of SoWhen.

Morandi believes that if this type of show were to exist alongside, say, a tour, the speed at which a VR show is created would have to match that of a live tour.

“One of the interesting differences we have in live touring compared to those working within the gaming industry is timescales,” he told TPi. “A regular show might take between three and six months to design, whereas video games by their very nature can take up to seven years to design.”

Morandi shared strong opinions on the importance of the entire live events industry embracing the wave of virtual events. “The simple fact is that the companies behind the major gaming engines do not know the world of live events,” he commented. “Right now, we are seeing a mirroring of what happened with the record industry decades ago. You look now and the major streaming companies – Spotify and Apple – have not come from the music industry.”

To close, he emphasised that, as a sector, the events industry must ensure a similar situation does not happen and that show creators, lighting designers and video creatives continue to be part of the conversation.

This article originally appeared in issue #258 of TPi.

Photos: Vincent Masson and SoWhen