Behind the screens of Familia: Welcome to The Family

XR Studios harnesses immersive technology tools by Megapixel VR to create a ‘Hispanic Alice in Wonderland’-inspired virtual concert to promote Camila Cabello’s latest studio album.

On the eve of the release of her third studio album, Familia, Camila Cabello announced the launch of an online event, Familia: Welcome to The Family. The singer-songwriter collaborated with TikTok and Epic Records to produce the concert with creative direction by Charlotte Rutherford and XR Studios.

XR Studios used a suite of immersive technology tools, created in collaboration with Megapixel VR, to choreograph animated backgrounds captured in real time in XR Studios’ LED environment. The wider creative team included Creative Director, Paul Caslin; Camera Director, Sam Wrench; and Silent Partners Studio.

“We found a way to explore the themes of Camila’s Familia album in a visual way that would not have been possible in a standard live performance to create the effect of a Hispanic Alice in Wonderland,” XR Studios President, J.T. Rooney explained.

XR Studios used the HELIOS LED processing platform to manage virtual content in real time during the shoot at its Hollywood studio. This enabled Rutherford and Wrench to use multi-camera setups to capture performances across six virtual set pieces, occasionally incorporating dancers.

Production Designer, Liam Moore conceptualised each of the six songs. Silent Partners Studio designed and created 3D graphic set extensions and effects, which XR Studios incorporated into XR environments.

“Our stage and technology are based around what we call ‘set extension’,” explained XR Studios Chief Technology Officer, Scott Millar. “Whenever the edge of the LED comes into the shot, we fill that in with virtual content.”

TikTok’s portrait-style mobile app format set the proscenium at a 9:16 aspect ratio, which XR Studios captured with 6K RED Komodo cameras recording in a native 2:1 format.
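As a rough illustration of the framing arithmetic involved in delivering 9:16 portrait from a 2:1 capture: only a tall central slice of each frame survives the crop. The pixel dimensions below are assumptions for the sketch, not figures from the production.

```python
def portrait_crop(sensor_w: int, sensor_h: int, out_ar: float = 9 / 16) -> tuple[int, int]:
    """Largest centred crop of the target aspect ratio that fits the sensor."""
    crop_h = sensor_h                    # a portrait crop uses the full sensor height
    crop_w = round(crop_h * out_ar)      # width follows from the 9:16 aspect ratio
    if crop_w > sensor_w:                # cannot happen for a 2:1 sensor, but be safe
        crop_w, crop_h = sensor_w, round(sensor_w / out_ar)
    return crop_w, crop_h

# Hypothetical 6K frame in a 2:1 format, 6144 x 3072 pixels:
w, h = portrait_crop(6144, 3072)
print(w, h)  # -> 1728 3072
```

Less than a third of the sensor width ends up in the delivered portrait frame, which is why the LED wall and its virtual set extensions must hold up under tight vertical framing.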

Physical props included the corridor of doors seen in Bam Bam, where Cabello pushed open a door into a cantina setting. As the camera pulled away from the seated Cabello, XR Studios projected an animation of crumbling walls, a glowing floor, and a vertiginous sky. Another prop served as a linking device in Psychofreak, where a giant pair of red lips offered Cabello a portal into an abstract realm of dancing geometric shapes.

Silent Partners Studio spent the two months before the shoot digitally modelling in Unreal Engine and Notch. XR Studios then ingested the material into disguise media servers in preparation for real-time playback on set via HELIOS. This workflow allowed XR Studios, using Megapixel VR technology, to feed XR playback to the LED wall with GhostFrame, generating correct background perspectives for multiple cameras without the visual anomalies associated with frame remapping.

“GhostFrame allows us to select which imagery we want to hide, and then through a patented process we dynamically create an inversion of that feed, so people on set only see one source playing,” explained Megapixel VR Product and Project Manager, Scott Blair.
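The principle Blair describes — several feeds time-multiplexed into the LED refresh, with each camera shuttered to capture only its own slot while the eye blends them together — can be sketched in miniature. The feed names and the four-slot schedule below are invented for the illustration; they are not GhostFrame’s actual internals.

```python
# Illustrative subframe scheduler: the LED wall refreshes several times per
# camera frame, and each slot in the cycle carries a different source. A camera
# genlocked to slot N samples only that slot's content, while all the slots
# blend together for people standing on set.
SCHEDULE = ["visible", "cam_A", "cam_B", "visible_inverse"]  # repeats every 4 subframes

def feed_for_subframe(subframe_index: int) -> str:
    """Which source the LED wall shows during a given subframe."""
    return SCHEDULE[subframe_index % len(SCHEDULE)]

def feeds_seen_by_camera(slot: int, n_subframes: int) -> set[str]:
    """A shuttered camera samples only its own slot, frame after frame."""
    return {feed_for_subframe(i) for i in range(slot, n_subframes, len(SCHEDULE))}

print(feeds_seen_by_camera(1, 16))  # -> {'cam_A'}
```

The “inversion” Blair mentions plays the same role as the `visible_inverse` slot here: it cancels the hidden feeds out of what the naked eye perceives, so only the visible source appears to play.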

“HELIOS processor has a very consistent web-based, multi-user tool for editing on the fly to get the LED processor up and running,” added Millar.

HELIOS also provided dexterous solutions for managing on-set imagery. “This specialised workflow wouldn’t be possible without HELIOS’ pedigree in colour accuracy,” commented Jeremy Hochman, Founder of Megapixel VR.

XR Studios used Stype RedSpy camera tracking to calculate camera positions, delivering virtual backgrounds to four cameras – including Technocrane and Steadicam setups – within XR’s 54ft-wide, 35ft-deep, 20ft-tall LED volume. XR imagery streamed to the wall and floor from 8K outputs, which HELIOS divided into manageable segments.
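Splitting an oversized canvas into processor-sized segments comes down to simple tiling arithmetic. The sketch below assumes an 8K UHD canvas cut into 4K UHD quadrants; the specific segment size is an assumption, not a detail from the production.

```python
import math

def tile_canvas(canvas_w: int, canvas_h: int, seg_w: int, seg_h: int) -> list:
    """Split a large pixel canvas into a grid of segment rectangles (x, y, w, h)."""
    tiles = []
    for row in range(math.ceil(canvas_h / seg_h)):
        for col in range(math.ceil(canvas_w / seg_w)):
            x, y = col * seg_w, row * seg_h
            # Edge tiles shrink so the grid never overruns the canvas:
            tiles.append((x, y, min(seg_w, canvas_w - x), min(seg_h, canvas_h - y)))
    return tiles

# An 8K UHD canvas (7680 x 4320) cut into 4K UHD (3840 x 2160) segments:
print(len(tile_canvas(7680, 4320, 3840, 2160)))  # -> 4
```

Each segment can then be driven as an independent output, which is what makes an 8K feed manageable for downstream LED processing.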

XR Studios used HELIOS’ Application Programming Interface (API) to enable GhostFrame inside the frustum – the field of view – of each camera.

“This allows for a comfortable experience in person while enabling crossfades between multiple cameras,” added Hochman.

Taking cues from background imagery, directors and their lighting teams made use of ARRI SkyPanels, moving lights, and ultraviolet sources – occasionally revealing Día de Muertos-style UV pigment makeup on performers in La Buena Vida.

“We had to make sure that the spotlight on the talent was never hitting the back wall,” said Millar. “When we had 20 dancers on set, that also dictated the scale of the stage.”

Psychofreak – which featured the performer in a lemon-yellow catsuit with four black-and-white-striped dancers in a kaleidoscope of shifting geometric forms – gave the XR Studios and Silent Partners Studio teams their most creative freedom.

“Set extensions only work when the talent is surrounded: there must be LED underneath the talent on the floor and behind them. We used that to do a trick where the dancers were on stage and we put augmented reality on top of them, leaving Cabello open. As Cabello was dancing, we placed content over the top of the LED content. When we removed that content, we revealed the dancers, who were on stage with her the whole time,” Millar explained.
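The reveal Millar describes is, at heart, layered compositing: live talent in front of the LED wall, with an augmented-reality layer drawn over part of the broadcast frame until the moment of the reveal. The toy model below uses a painter’s-algorithm stack with invented layer and region names; it illustrates the idea only, not the production pipeline.

```python
def composite(layers: list) -> dict:
    """Painter's algorithm: later layers cover earlier ones where they are opaque."""
    frame = {}
    for layer in layers:
        frame.update(layer)  # opaque regions overwrite whatever is below them
    return frame

# Hypothetical scene regions: the AR layer covers the dancers but not Cabello.
led_wall = {"left": "geometry", "centre": "cabello_bg", "right": "geometry"}
dancers  = {"left": "dancer", "right": "dancer"}
ar_layer = {"left": "ar_shape", "right": "ar_shape"}

with_ar  = composite([led_wall, dancers, ar_layer])  # broadcast before the reveal
revealed = composite([led_wall, dancers])            # AR removed: dancers appear
print(with_ar["left"], revealed["left"])  # -> ar_shape dancer
```

Because the dancers are physically on stage the whole time, dropping the AR layer is all it takes to “reveal” them – no cut, no edit.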

Dexterity was also a key feature of the production for the filmmakers and the XR Studios team. “We’re working in a new world now where the camera is affected by the LEDs,” Millar affirmed. “Everything is connected in a way it hasn’t been before.”

The connections were made possible by close cooperation with technical support. “Megapixel VR understood from the start what we needed, they were honest and proactive in development and responded to our requests for both suitable features and the support we needed. The flexibility of the system, the API integration, and the robustness have allowed XR Studios to build our tools to always integrate with HELIOS with confidence and support.”

This article originally appeared in issue #276 of TPi.