Mix Blog Live: I’m Surrounded

Last week at InfoComm in Las Vegas there was a noticeable presence of product geared toward “live immersive audio.” The point of immersive live audio is to move beyond the usual left-right stereo speaker model and enable an engineer to more precisely match a performer’s location in the mix to their location on stage—in three dimensions. It’s kind of like Dolby Atmos® for live sound.

Of particular interest was an announcement from Avid that several of its development partners, including Flux::, d&b audiotechnik and L-Acoustics, have introduced AAX plug-ins that run natively on the Avid VENUE | S6L. The plug-ins are designed to provide control over each manufacturer's respective immersive sound system via the S6L and VENUE software. Avid presented demos at InfoComm showing the S6L integrated with each developer's plug-in driving its immersive sound system.

Last year DiGiCo announced native object-based mixing via L-Acoustics L-ISA Source Control using its SD consoles; earlier this year d&b audiotechnik announced that DiGiCo, Lawo and Avid will integrate d&b Soundscape’s DS100 Signal Engine using plug-ins, enabling object-based mixing and acoustic room emulation from their control surfaces.

So what does this mean for live sound? Well, for one thing, it gives an artist the ability to enhance the listening experience for the audience. Some of you might remember seeing Pink Floyd back in the day when they used surround arrays at their concerts… it was quite impressive, and it was nowhere near as ambitious as today's immersive systems. The Lorde Melodrama Tour is already on the road this year, marking the first international tour employing L-Acoustics L-ISA technology.

As with Atmos, an immersive P.A. system could easily include left-right, front-rear and vertically oriented channelization. Given the right program material, it should be able to make your hair stand up. Imagine being able to route audio at an EDM show that's as high as the audience(!). Integrating three-dimensional routing and object-based mixing (as in the L-Acoustics L-ISA system) into a familiar work surface such as an S6L or SD10 is a major selling point, because the learning curve for engineers will be far gentler than if we had to learn a completely new interface.
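To make the object-based idea a little more concrete, here's a purely illustrative sketch in Python. It is not L-ISA's or Soundscape's actual algorithm, and the Speaker/AudioObject names and speaker positions are hypothetical: the point is simply that each source becomes an "object" with a 3D position, and a processor turns that position into per-speaker levels across the arrays.

```python
import math
from dataclasses import dataclass

@dataclass
class Speaker:
    """A loudspeaker hang at a fixed 3D position (meters)."""
    name: str
    x: float
    y: float
    z: float

@dataclass
class AudioObject:
    """A source (e.g., a vocal) with the stage position we want it to appear at."""
    name: str
    x: float
    y: float
    z: float

def object_gains(obj: AudioObject, speakers: list[Speaker], rolloff: float = 1.0) -> dict[str, float]:
    """Toy distance-based panning: speakers nearer the object's position get more level.
    Real immersive processors use far more sophisticated spatialization than this."""
    weights = []
    for spk in speakers:
        d = math.dist((obj.x, obj.y, obj.z), (spk.x, spk.y, spk.z))
        weights.append(1.0 / (d + 1e-3) ** rolloff)  # inverse-distance weight
    total = sum(weights)
    return {spk.name: w / total for spk, w in zip(speakers, weights)}  # normalized gains

if __name__ == "__main__":
    # Hypothetical rig: five frontal hangs plus one overhead "height" hang.
    rig = [
        Speaker("L", -6, 0, 4), Speaker("LC", -3, 0, 4), Speaker("C", 0, 0, 4),
        Speaker("RC", 3, 0, 4), Speaker("R", 6, 0, 4), Speaker("Height", 0, 2, 10),
    ]
    vocal = AudioObject("lead vocal", x=2.0, y=1.0, z=1.7)  # singer slightly stage right
    for name, gain in object_gains(vocal, rig).items():
        print(f"{name}: {gain:.2f}")
```

Move the object's x/y/z and the gains follow it across the hangs; in the real systems, that move is what the console plug-in is sending to the processor instead of a static L-R pan.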

The other side of the fence will have accountants worrying about trucking 40 arrays around the country instead of four, and they may have a valid point. But if the music industry wants to remain competitive with the video game and motion picture industries, we have to up our game. We need to give a 20-year-old an enticing reason to go see a concert instead of staying home and playing a game.

Certainly immersive audio bodes well for installed sound systems, but what about touring? There are problems to be addressed: many theaters simply cannot handle the hang weight of large numbers of arrays. Larger venues such as sports arenas don't provide a friendly acoustic environment for stereo arrays, so what will happen when we put 30 arrays into the building? Maybe with careful deployment of multiple arrays we can reduce the level of each array and tame the myriad room reflections we usually combat.

And what about setup and teardown time? Would a major tour employing an immersive audio environment need to have multiple rigs on the road, with one in use while another is being built in the next city?

These and other important questions will be addressed in the next few years. Live immersive audio is still in its infancy.
