

Capturing Concerts for VR

By Alvin Fernald

Supersphere, an entertainment production company that specializes in immersive and interactive live broadcast events, recently teamed up with mobile production specialist TNDV to create the Audiophile Series, a set of live concerts presented in VR with a real-time, ambisonic audio mix.

New Orleans, LA—The annual New Orleans Jazz & Heritage Festival, affectionately called Jazz Fest by fans, lures thousands of music lovers to the city each year. This being New Orleans, the music scene is alive with shows both on and off the jam-packed official eight-day schedule.

Among the festival-adjacent events was the Audiophile Series, a set of shows that took place during the second weekend of Jazz Fest (May 2-4). The three-show series was conceived and produced by Supersphere, an entertainment production company that specializes in immersive and interactive live broadcast events. Supersphere has taken part in nearly 80 events in the past two years, including live streamed concerts featuring Vince Staples, Lupe Fiasco and Thievery Corporation, among others.

“We focus on virtual reality, augmented reality and mixed reality, but over the past two years we have focused almost exclusively on immersive live broadcast work,” said Lucas Wilson, founder and executive producer at Supersphere. “We deal with 180° and 360° interactive and non-interactive streaming with the philosophy that fans want to be closer to the events they care about. We offer the next best thing to being there at the venue.”

The seeds of the Audiophile Series were planted three years ago when the Supersphere team worked with the late, legendary producer Geoff Emerick on an ambisonic audio mix of seven short Paul McCartney pieces.

“Working with Geoff was a very special experience, and it made us realize that audio has been somewhat overlooked in VR,” said Wilson. “This generated an idea to create a real-time, ambisonic audio mix that would serve as the primary delivery, versus trying to convert a stereo or 5.1 mix. The idea became reality once Oculus Go came to market and made consumer VR practical.”

Fast-forward to May 2 of this year, the official launch of the Audiophile Series. Concert audio from the three shows—The Revivalists at The Fillmore (May 2), Preservation Hall’s 14th Annual Midnight Preserves show (May 3), and Galactic at Tipitina’s (May 4)—was streamed to the Oculus Venues platform. While the Supersphere team focused on the overall event production and the video elements, it brought on mobile production specialist TNDV and Grammy Award-winning audio engineer Mills Logan to handle the audio requirements.

“A real ambisonic mix captures the room and space, but it also brings out what is truly special about music,” said Wilson. “Doing ambisonics as a primary delivery with an experienced audio engineer and such a technically and creatively fluent audio production team was pretty much a dream come true.”

For Wilson, a well-done native ambisonic mix makes stereo sound “flat and narrow” by comparison. A stereo mix is traditionally a left-to-right experience, with reverb and other effects creating an adequate approximation of a space; ambisonic audio goes further, building a true 360° model of the environment. Mixing natively in ambisonics means mixing in 360°, with the ability to place each element at a precise position in the sound field.
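In practice, that 360° placement boils down to encoding each source into a small set of spatial channels and summing them. The sketch below (Python with NumPy) shows the idea for first-order ambisonics; it assumes the AmbiX convention (ACN channel order, SN3D normalization), and the function and signal names are purely illustrative, not part of the Supersphere or TNDV toolchain.

```python
import numpy as np

def encode_first_order(mono, azimuth_deg, elevation_deg=0.0):
    """Pan a mono signal into a first-order ambisonic (B-format) bed.

    Assumes the AmbiX convention: ACN channel order (W, Y, Z, X) with SN3D
    normalization. Other conventions (e.g. FuMa) reorder the channels and
    scale W differently.
    """
    az = np.radians(azimuth_deg)    # 0 = straight ahead, positive = to the listener's left
    el = np.radians(elevation_deg)  # 0 = horizon, positive = up

    gains = np.array([
        1.0,                      # W: omnidirectional pressure
        np.sin(az) * np.cos(el),  # Y: left-right figure-of-eight
        np.sin(el),               # Z: up-down figure-of-eight
        np.cos(az) * np.cos(el),  # X: front-back figure-of-eight
    ])
    return gains[:, None] * mono[None, :]   # shape: (4 channels, n_samples)

# Example: place a guitar 30 degrees to the listener's left, slightly above the horizon.
guitar = np.random.randn(48000)             # stand-in for one second of audio at 48 kHz
bformat = encode_first_order(guitar, azimuth_deg=30, elevation_deg=10)
```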

“You can place instruments anywhere in that space, which means the engineer really creates the environment around the listener,” said Wilson. “Then there is the ability to add ‘head tracking’ in VR headsets. This means that if a guitar player is in front, the listener can turn his or her head and the guitar player remains in the correct space. It’s almost as if there are loudspeakers encircling the audience. The TNDV team was ultimately responsible for bringing that to life for the Audiophile Series.”
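Head tracking, in this context, amounts to counter-rotating the sound field against the listener's head movement before the binaural render. Below is a minimal sketch of the yaw case, under the same first-order AmbiX assumption as above; a real renderer also handles pitch and roll.

```python
import numpy as np

def counter_rotate_yaw(bformat, head_yaw_deg):
    """Rotate a first-order AmbiX sound field (W, Y, Z, X) opposite to the
    listener's head yaw so that sources stay fixed in the room.

    head_yaw_deg is positive when the listener turns to the left; pitch and
    roll are ignored in this sketch.
    """
    phi = np.radians(-head_yaw_deg)        # rotate the field against the head turn
    w, y, z, x = bformat
    y_rot = np.cos(phi) * y + np.sin(phi) * x
    x_rot = np.cos(phi) * x - np.sin(phi) * y
    return np.stack([w, y_rot, z, x_rot])

# A guitar panned dead ahead ends up on the listener's right
# after they turn their head 90 degrees to the left.
```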

The first step in making that happen was recording the audio; the TNDV audio team placed ambisonic microphones around the room to record the channels that would make up the ambisonic sound field. TNDV deployed Sennheiser AMBEO, SoundField SF200 and HEAR360 8ball microphones, as well as a collection of traditional microphones from Neumann, Audio-Technica, AKG and Royer to capture the room exactly as Supersphere intended.

“We started by using all three ambisonic mics in an array at front of house, and then made decisions in each venue about how many of each microphone to use,” said Adam Ellis, audio engineer, TNDV. “In each venue, we ultimately had 16 channels of individual elements coming off the ambisonic array, combined with up to 12 point sources or spot mics.”

Ellis said that the SoundField SF200 produced the most consistent, high-quality sound from venue to venue, while the HEAR360 microphone brought something unique to the mix. “The HEAR360 mic has eight individual elements arranged in four pairs around a sphere, which makes it incredibly flexible,” he said. “We chose to use it as a binaural mic because ultimately the ambisonics are rendered binaural when you listen back over the headset. That mic was important because we could produce a true binaural element within our mix.”

The audio setup was more straightforward on the front end: the TNDV team received splits from the band, taking in every input off the stage. “We mix the music very traditionally using our Studer Vista 9 console,” said Ellis, “but then we take in two layers from the ‘A/R’ microphones—the ambience layer and the response layer. We try to capture that audience response separately using very tight techniques, including shotguns and small condenser mics.”

For the ambience layer, Ellis and his team go high and wide in an attempt to communicate the unique character of the room. “In each venue, we try to capture two or three unique elements and get them into the spheres,” said Ellis. “It’s not so much about communicating the experience 100 percent accurately; it’s about identifying the cues that are unique to the space and using them effectively.”
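Conceptually, the delivered sound field is then just the sum of these layers: the band's spot mics encoded as point sources toward the stage, the response layer placed around and behind the listener, and the ambience bed captured directly in B-format. The snippet below illustrates that layering under the same AmbiX assumption; the stems, levels and placements are hypothetical stand-ins rather than the actual decisions made at the Vista 9.

```python
import numpy as np

def encode_first_order(mono, azimuth_deg, elevation_deg=0.0):
    """Same AmbiX (W, Y, Z, X) panning gains as in the earlier sketch."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    g = np.array([1.0, np.sin(az) * np.cos(el), np.sin(el), np.cos(az) * np.cos(el)])
    return g[:, None] * mono[None, :]

fs = 48000
n = 2 * fs                                     # two seconds of placeholder audio
vocals = np.random.randn(n) * 0.1
guitar = np.random.randn(n) * 0.1
crowd_left = np.random.randn(n) * 0.05
crowd_right = np.random.randn(n) * 0.05
ambience_bed = np.random.randn(4, n) * 0.02    # stand-in for the B-format room capture

mix  = encode_first_order(vocals, 0)           # band elements placed on the "stage" in front
mix += encode_first_order(guitar, 25)
mix += encode_first_order(crowd_left, 120)     # audience response behind the listener
mix += encode_first_order(crowd_right, -120)
mix += ambience_bed                            # room character layered under everything
```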

Stationed on TNDV’s Vibration audio truck, Logan and Ellis managed the live audio production for all three events. “Mills has been making records in Nashville for around 30 years,” said Ellis. “We brought Mills in specifically to make the music sound amazing within each space, so Mills was very focused on the band, how they sounded and making sure that their message was properly communicated.”

Ellis felt the job would take three people to do properly: Logan was responsible for creating the live multichannel mix; Rob Horne, Ellis’ colleague at TNDV, set up the microphone arrays, with a special focus on the ambience layer; and Ben Adams was in charge of monitoring and delivery of the ambisonics.

The audio monitoring workflow used Waves NX Virtual Mix Room and NX Head Trackers with Sennheiser HD650 and Ultrasone Pro550 headphones, allowing Adams to hear an approximation of what audiences at home would experience. This was especially important because the ambisonic mix was created natively, with no conversion.

“We mixed and monitored in that format, instead of creating a 5.1 mix and then translating it to an immersive format,” he said. “We wanted this to be as completely linear as possible. When you add conversion, the audio becomes nonlinear and is subject to distortion. We eliminate the conversion stage by keeping it native, adding to the overall precision. Ultimately our goal is for the workflow to have as few steps as possible.”

One of the last steps in that workflow was embedding the audio with the video content in the final stages prior to delivery. “We’re sending up a 4K stereoscopic video image with four channels of embedded audio, which is the key to first-order ambisonic sound,” said Ellis. “It’s four channels, but a phased matrix, similar to Pro Logic II audio. The first channel holds all of the amplitude information, and then you have a height channel, a side-to-side channel, and a front-to-back channel. The headset then uses its tracking data to create a binaural render based on the position of an audience member’s head.”
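On the playback side, the headset has to turn those four channels back into audio for two ears. One common approach, sketched here under the same first-order AmbiX assumption, is to decode the sound field to a few virtual loudspeakers and then binauralize each feed with HRTFs; this naive sampling decoder stops at the virtual speaker feeds and makes no claim about how the Oculus renderer actually works.

```python
import numpy as np

def decode_to_virtual_square(bformat):
    """Naive sampling decode of a first-order AmbiX stream (W, Y, Z, X) to
    four virtual loudspeakers on the horizontal plane at +/-45 and +/-135
    degrees. A headset renderer would additionally convolve each virtual
    speaker feed with head-related transfer functions (steered by the
    head-tracking data) to produce the final binaural output.
    """
    w, y, z, x = bformat
    feeds = []
    for az in np.radians([45, 135, -135, -45]):
        # Project the sound field onto each virtual speaker direction;
        # the height channel (Z) is ignored by this horizontal-only layout.
        feeds.append(0.25 * (w + np.cos(az) * x + np.sin(az) * y))
    return np.stack(feeds)   # shape: (4 virtual speakers, n_samples)
```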

The Supersphere video workflow includes an array of strategically positioned VR cameras in the venue—this generally includes a mix of Insta360, ZCam and custom models—and a mobile rack of gear that includes encoding systems and software-based production tools. Keycode Media is the systems integration company responsible for building out the racks, which vary in size based on the scale of the gig. For the Audiophile Series, two 400-pound racks of gear were forklifted onto the Vibration truck, according to Ellis.

While VR may still be an emerging technological format, the Audiophile Series’ delivery point proved to be surprisingly traditional. “We take TNDV’s audio feed over fiber and integrate it into our video stream, and deliver H.264 and H.265 video with embedded AAC audio to our satellite truck,” said Wilson. “From there, it goes to a network operations center that encodes and distributes the stream to our endpoints. For the Audiophile Series, we broadcast into an app called Oculus Venues, and the Oculus team manages delivery to headsets.”

Wilson likens the delivered product to sitting in a giant IMAX theater. “When you put on the headset and look around, the seats are filled with digital avatars,” said Wilson. “These are people you can actually interact with. There’s a microphone and speakers in the headset, and a giant screen in front that wraps around what we are broadcasting. There’s a geometry to sitting in a theater, and we make sure the cameras match that geometry so that there is a natural look and feel. But for the Audiophile Series, we needed that real-time, ambisonic sound to make this a truly immersive audio event. We trusted TNDV implicitly based on their audio experience. All we really needed to say was, ‘Make it sound pretty.’”

TNDV • www.tndv.com

Supersphere • superspherevr.com
