Astro Spatial Audio Debuts BlackTrax Integration at InfoComm 2019

The brand-independent, true object-based audio company appeared on partners’ booths at the Orlando exhibition with BlackTrax integration and more.

Astro Spatial Audio debuted full integration between its industry-leading true object-based audio technology and the renowned BlackTrax real-time tracking technology at InfoComm 2019.

Based on a ground-breaking new algorithm that seamlessly converts Real-Time Tracking Protocol (RTTrP) signals to Open Sound Control (OSC) commands, the BlackTrax integration formed the centrepiece of a busy show for Astro Spatial Audio. InfoComm visitors had numerous opportunities to experience true object-based audio in demonstrations taking place across partner manufacturers’ booths.
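To make the RTTrP-to-OSC bridge concrete, the sketch below hand-encodes a minimal OSC message carrying a tracked position. The `/object/<id>/xyz` address scheme, the `tracker_to_osc` helper, and the idea of mapping one trackable to one audio object are illustrative assumptions, not Astro Spatial Audio's actual command set; only the OSC wire format (null-padded strings, type tags, big-endian float32 arguments) follows the OSC 1.0 specification.

```python
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message whose arguments are all float32."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)

    type_tags = "," + "f" * len(floats)          # e.g. ",fff" for x, y, z
    payload = pad(address.encode()) + pad(type_tags.encode())
    payload += b"".join(struct.pack(">f", f) for f in floats)
    return payload

def tracker_to_osc(trackable_id: int, x: float, y: float, z: float) -> bytes:
    """Map one tracked position to an OSC position command for an audio object.
    The address scheme '/object/<id>/xyz' is a hypothetical example, not ASA's API."""
    return osc_message(f"/object/{trackable_id}/xyz", x, y, z)
```

In a real deployment the resulting bytes would be sent over UDP to the rendering engine each time the tracking system reports a new position.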

The BlackTrax demonstration was featured in the demo room of Martin Audio, whose Sound Adventures immersive audio system has the Astro Spatial Audio SARA II Premium Rendering Engine at its heart.

Meanwhile, the Alcons Audio demo room featured the Pro Ribbon Immersive Experience with Astro Spatial Audio. As well as numerous listening opportunities, visitors were treated to special presentations by legendary front-of-house engineers Buford Jones (Pink Floyd, David Bowie) and Robert Scovill (Tom Petty, Prince).

Developed in response to high market demand, the arrival of BlackTrax integration further reinforces Astro Spatial Audio’s entirely brand-independent philosophy, which has already seen it become the trusted choice of engineers, sound designers and partner manufacturers alike. The solution is designed to give audio professionals the freedom to choose the loudspeakers, consoles and third-party brands they want to use; with Astro Spatial Audio, the only limit is the imagination.

The heart of the Astro Spatial Audio solution is the award-winning SARA II Premium Rendering Engine. Measuring just 3U but delivering up to 128 MADI or 128 Dante configurable network pathways, the SARA II Premium Rendering Engine converts audio signals into audio objects and uses extensive metadata to precisely calculate each object’s position within virtual 3D space over 160,000 times per second, as well as that object’s acoustic effect on the virtual space around it.