Nine Inch Nails’ MainStage Tech Mat Mitchell Interview

In the November 2008 issue, we profile the current Nine Inch Nails tour from the front-of-house and monitor engineers’ perspectives. I also got a chance to talk to the band’s MainStage tech, Mat Mitchell, about using the new Apple application, part of Logic Studio, which allows engineers and performing musicians to use software instruments and effects through a full-screen interface. Mitchell first started working with the band in frontman Trent Reznor’s New Orleans studio, and was then brought onboard for the last tour, With Teeth, where he primarily did programming for Twiggy (Jeordie White) and took care of his guitars and bass. He also helped Reznor create the first iteration of his vocal effects rig: ribbon controllers built into the mic stands so that the performer could do X/Y control of two different parameters of whatever effects were called for on a song-by-song basis.

On this tour, you’re using MainStage. Is this the first time you’ve used this program?
Yes, this is the first time that I’ve had a chance to use MainStage. When we first started production, we were looking at a few different products that all claimed to have the same or similar features, and the first thing that drew me to MainStage would have to be the familiarity—it instantly felt comfortable. Short of any training or even looking through the manuals, it was easy to just open the package and start working. If you’re familiar with any workstation, it’s a smooth transition.

Were you doing programming during production rehearsals?
We did three months of production rehearsals where the band was learning songs and we were sorting through all the technology out there, figuring out what we wanted to use and how it would work in our scenario. We were fortunate enough to have the time to try everything out and see what worked for us, and that’s how we ended up with MainStage.

Walk me through your setup.
Basically, our monitor console [a Digidesign Profile] is the center section of everything as far as audio, so everything hits the monitor desk. And then from the monitor desk, [monitor engineer Michael Prowda] can route on a song-by-song basis—he has his own matrix, so if on one song I need to get a track from playback, he can get that to me. Say on another song I need Trent’s vocal, he can send that to me. So I can do external processing on different sources, depending on what song it is. That’s one of the aspects of what we’re doing, and it’s happening in real time. There may be one song where the keyboard player needs to effect Trent’s vocals, so I’ve got a channel designated as an auxiliary input; any time that song is called up, the routing matrix sends Trent’s vocal to me. As an instrument, MainStage is doing all the amplification for all guitar sounds, so guitars are being plugged straight into it and it’s generating 100 percent of the guitar sounds and all the guitar effects: wah pedals, delays, et cetera. It’s also doing all the keyboard sounds—anything from piano to samples to things like Reaktor we’re doing in MainStage. It’s the sound generator for both Trent’s system and for Alessandro [Cortini’s] keyboard system.
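
To make that song-by-song routing concrete, here is a minimal Python sketch of the idea; the routing table, function name and song/source labels are all invented for illustration and aren’t how the Profile’s matrix or MainStage actually represent it.

```python
# Hypothetical sketch of song-by-song routing: when a song is called up,
# a lookup decides which source (if any) the monitor desk's matrix should
# patch to the MainStage rig's auxiliary input.

SONG_ROUTES = {
    "song_a": "playback_track",  # pull a stem from the playback rig
    "song_b": "trent_vocal",     # send the vocal out for external processing
}

def route_for_song(song):
    """Return the source to patch to the aux input, or None if the song
    needs no external processing."""
    return SONG_ROUTES.get(song)

if __name__ == "__main__":
    for song in ("song_a", "song_b", "song_c"):
        source = route_for_song(song)
        print(f"{song}: aux input <- {source or 'nothing'}")
```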

What plug-ins are you using?
We use all the SoundToys stuff and all the Native Instruments: Guitar Rig, Kontakt, Battery, Massive—their whole bundle. Those are the big ones, and then of course all the Logic plug-ins. Apple has quite a good selection of high-quality stuff there.

MainStage is also used for mixing and processing. There’s a section of the set that we call “Ghosts,” and it’s more acoustic instruments. So there are marimbas and different types of percussion, and all of that is being processed through MainStage because we’ve got real-time control of the plug-ins. If we want to do delays or filtering, it’s a little more flexible for us, and the players can automate what’s happening with those effects as opposed to…the front-of-house guy could do it, obviously, but it’s kind of fun and interesting to put it into the hands of the artist.

One of the powerful things about MainStage is that it’s really easy to assign MIDI controllers to any parameter of any plug-in on a song-by-song basis. If you’ve got a little controller up there with eight knobs, those knobs can be anything from manipulating delays on a Logic delay to—just anything. We’re doing a lot of plug-in-parameter manipulation in real time.
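
As a rough illustration of the knob-to-parameter mapping he describes, here is a short Python sketch built on the mido MIDI library; the port name, CC numbers and parameter targets are assumptions for the example, not MainStage’s actual assignment mechanism.

```python
# Sketch of assigning hardware knobs (MIDI CC messages) to plug-in
# parameters on a per-song basis.

import mido

# Per-song assignment table: CC number -> (parameter, min, max).
ASSIGNMENTS = {
    21: ("delay_feedback", 0.0, 1.0),
    22: ("delay_time_ms", 10.0, 2000.0),
}

def cc_to_param(cc, value):
    """Scale a 7-bit CC value (0-127) onto the assigned parameter's range."""
    if cc not in ASSIGNMENTS:
        return None
    name, lo, hi = ASSIGNMENTS[cc]
    return name, lo + (value / 127.0) * (hi - lo)

with mido.open_input("Knob Controller") as port:  # hypothetical port name
    for msg in port:
        if msg.type == "control_change":
            mapped = cc_to_param(msg.control, msg.value)
            if mapped:
                print(f"set {mapped[0]} = {mapped[1]:.3f}")
```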

So why isn’t this being taken care of at FOH if that engineer is on a digital board with access to those plug-ins?
I think it allows the artist to have more control over that kind of stuff. It adds another level of manipulation. It’s just like a guitar player having pedals where he can do wah or different types of effects. It’s kind of the same thing but taking it into the studio realm—the types of things you would automate in a mix session can now be controlled in real time during the show.

Does it change night to night based on the artistic space the performers are in?
Yeah, I’d say so. I mean, obviously there are limitations that we’ve set so that things don’t get out of hand, but definitely depending on moods and how they’re feeling that night, the types of manipulation they’re doing and the way it’s presented can certainly be different from night to night. Most notable would be Trent’s vocal effects. We went from the ribbon controller to now, where he has a switch embedded in his mic stand and also floor pedals, and he’s doing parameter control with those. Things like delays and pitch-shifting can vary quite a bit from night to night. All of Trent’s vocal effects change quite a bit: He does a lot of looping where he’ll hit his pedal and it will do an effects send, and at the same time it will kick the feedback up to 100 [percent] and lock it so you can freeze the loop. He does that quite a bit.
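
One way to picture that “feedback up to 100 and lock it” move is a feedback delay line whose feedback coefficient is pushed to unity while the input is muted, so the buffer contents recirculate unchanged. Here is a minimal NumPy sketch of that general technique, not a model of the actual rig.

```python
import numpy as np

def delay_freeze(dry, delay_samples, feedback, freeze_at):
    """Feedback delay line that 'locks' at sample index freeze_at: feedback
    jumps to unity and the input is muted, so whatever is sitting in the
    delay buffer just keeps recirculating as a frozen loop."""
    buf = np.zeros(delay_samples)
    out = np.zeros(len(dry))
    idx = 0
    for n in range(len(dry)):
        frozen = n >= freeze_at
        delayed = buf[idx]
        out[n] = delayed
        fb = 1.0 if frozen else feedback   # unity feedback = no decay
        inp = 0.0 if frozen else dry[n]    # stop feeding new signal in
        buf[idx] = inp + fb * delayed      # write back into the circular buffer
        idx = (idx + 1) % delay_samples
    return out
```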

Are you in constant communication with the bandmembers during the show?
Not so much. We definitely have visual feedback, but most of that type of stuff we did in production rehearsals so we already know what they need and when they need it. It’s cool for them because they don’t have to have any interaction with computers at all; it’s just another instrument.

What are you running MainStage on?
We’ve got two different systems. Trent’s system is two Xserves. The reason he went with the Xserves is that they’re really fast; he’s using PCI-based audio interfaces and he’s able to get latency down to 32 samples, so there’s really no delay between what’s going in and what’s coming out, and that was really important for being able to pull off the vocal effects.

Alessandro’s rig is a MacBook Pro running an Apogee Mobile and Rosetta 800, which sounds incredible.
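
For a sense of scale on that 32-sample figure, here is a quick back-of-the-envelope calculation; the 44.1kHz sample rate is an assumption, since the interview doesn’t specify one.

```python
# Buffer latency in milliseconds for a few common buffer sizes.
SAMPLE_RATE = 44_100  # assumed; not stated in the interview

for buffer_samples in (32, 64, 128, 256):
    ms = buffer_samples / SAMPLE_RATE * 1000
    print(f"{buffer_samples:>4} samples -> {ms:.2f} ms per buffer")

# 32 samples works out to about 0.73 ms per buffer: effectively
# imperceptible, even before counting the converter and driver stages
# that add to total round-trip latency.
```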

Do you think MainStage will become part of their studio arsenal?
I would imagine so. For me, I’m ready to get working on the next project because I want to use MainStage from the ground up in a recording environment, if for nothing else than the ability to get in and attach controllers to plug-ins. MainStage does this really elegantly and really fast, so that’s something that would be nice to have in a recording environment.
