Movie Sound Effects | Cars! Weapons! Machines!

Among the sounds created by Harry Cohen are the guns in Inglourious Basterds (pictured), gadgets in Green Hornet and various sound design in the upcoming Apollo 18.

From the clang of a sword to the roar of a monster to the rev of a car engine, Hollywood directors depend on sound designers and sound effects editors to craft the sonic elements that help add impact and interest, set the mood or ratchet up the terror of a scene. Working with Foley artists, re-recording mixers, composers and others, the creators of film sound effects have challenging jobs that require imagination, creativity and technical abilities, not to mention a great ear.

There are two primary job titles for those who create and edit effects—sound designer and sound effects editor—though the differences between the two job descriptions have become blurred over time, and both are essentially involved in effects creation.

To learn more about the techniques used to create effects for films, Mix spoke with three pros at Soundelux (Hollywood), all with sound designer and sound effects editor credits to their name. Harry Cohen has worked on such titles as Inglourious Basterds, Star Trek, Robin Hood, The Green Lantern and The Perfect Storm. Chris Assells has credits on films like Fright Night, The Green Hornet, Wall Street: Money Never Sleeps and Gladiator. Jon Title’s filmography includes Final Destination 5, Red, Blood Diamond and The Bourne Ultimatum.

Before delving into specifics, it’s instructive to mention a few general points that all three of the interviewees agree on. The most important is that every film is different, and a sound designer must base his/her approach on the needs of the particular film and the director’s vision for it. “With almost every problem, everything that we tackle,” says Cohen, “it always goes back to the story.”

The three interviewees also concur that the best source material for effects based on real sounds is custom field recordings, made with the needs of the film in mind. That said, a great deal of layering, processing and pitching up or down often gets done to these recordings before the final mix.

“We do a lot of recording for every film,” Cohen says. “There are a couple of reasons. One is that no matter how much you have in the library, it seems you never have exactly what you’re looking for. And the other is that we look at a scene and we talk about what it is we want to accomplish with sound. When we go out to record, we’ve got that in mind, and we’re recording things in a particular way, specifically for that instance.”

If there is no time or budget for custom recordings, quality libraries can provide pretty good substitutes. “I worked in low-budget places early in my career where it was just library effects,” Assells says, “and you do what you can. To make it more dynamic, you’ll pitch it way up or you’ll time-compress it way up—something to make the sound really pop.”
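The pitching and time manipulation Assells describes is, at its core, resampling. A minimal Python sketch of the idea, treating a mono sound as a plain list of samples (the function name is illustrative, not a real plug-in API):

```python
def repitch(samples, factor):
    """Tape-style repitch by resampling: factor > 1 raises the pitch and
    shortens the sound, factor < 1 lowers and stretches it.
    Linear interpolation between neighboring samples."""
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += factor
    return out

# A library effect pitched up an octave plays back in half the time.
effect = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5] * 100
brighter = repitch(effect, 2.0)
```

Pitching "way up" this way also speeds the sound up, which is often exactly the snap an editor is after.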

When building and cutting their sounds, sound designers and sound effects editors are always mindful that the other important sound elements will also be occupying the soundtrack. Sometimes the key is what not to include. “If we’re in the middle of a car chase and a gun fight,” says Cohen, “you’ve got three car engines, the gun, the tire skids, the impacts, the ricochets, the dialog and the music. If we present all that to the audience, then they hear nothing; they hear the mishmash. Then we can start saying: ‘What can we take out?’ ‘What do we want the audience’s experience to be here?’ So one of the things we would realize is if we take out the engine in a couple of these shots, now we can hear the guns.”

The balance between the various sound elements is handled on the mix stage by the re-recording mixer, so effects and levels often change quite a bit during a final or one of the temp dubs. “Nine times out of 10, the [sound effects] mix I prefer is way hotter than the one the mixer prefers,” says Assells. “Dialog is king, then music and then effects. That’s usually the way it goes.”

Sound effects are synched to picture and edited, and the effects sequences frequently contain large track counts. Sound designers or sound effects editors will often make 5.1 submixes of their work, using the internal busing available in Pro Tools, to present to the mix stage. “They’ve got tons to do and hundreds of tracks to deal with,” says Cohen, “and if I bring in something that is 200 tracks for this one idea of making the character of the machine change, then I’m not helping the process. We’ll bring it in as a complete rendered thought—a series of thoughts, maybe, a couple of different ways to go—but much more well developed and comped down than all the individual elements are.”
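Folding hundreds of element tracks down to one 5.1 stem is, conceptually, just bused, weighted summing. A toy Python sketch, with mono tracks as sample lists and hand-set pan weights standing in for Pro Tools bus assignments:

```python
CHANNELS = ["L", "R", "C", "LFE", "Ls", "Rs"]

def render_submix(tracks, length):
    """Fold any number of element tracks down to one 5.1 stem.
    Each track is (samples, gain, {channel: weight}), a crude stand-in
    for a bus assignment with panning weights."""
    stem = {ch: [0.0] * length for ch in CHANNELS}
    for samples, gain, pans in tracks:
        for ch, weight in pans.items():
            buf = stem[ch]
            for i, s in enumerate(samples[:length]):
                buf[i] += s * gain * weight
    return stem

# Two of the "200 tracks": a centered mechanism layer and a wide boom layer.
mech = [1.0] * 4
boom = [0.5] * 4
stem = render_submix(
    [(mech, 0.8, {"C": 1.0}),
     (boom, 1.0, {"L": 0.7, "R": 0.7})],
    length=4)
```

The mixer then rides six channels instead of two hundred tracks.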

These submixes are especially important during temp mixes, where time is even shorter than at the final. “There just isn’t time to go through 60 tracks,” says Title. “When there’s more time and you’re predubbing for the final, those mixes can be unwound, and the mixer can start from scratch if he wants to. But a lot of times, the 5.1 submixes sound fine.”

Chris Assells created the sounds for the sword fight scenes in <I>Gladiator</i> and worked on the <I>Green Hornet</I> (pictured).

Chris Assells created the sounds for the sword fight scenes in Gladiator and worked on the Green Hornet (pictured).

Gunshots are a big part of many action films and require a lot more than just the placement of a gunshot recording in the appropriate spot on the Pro Tools timeline to sync with the action on screen. “In real life, most guns just go, ‘pow,’” says Title. “But in the movies, for the most part you don’t want it to just go ‘pow,’ so you add these other elements to give it texture and make it exciting.”

Recordings for gun sounds can be quite involved. As an example, Title mentions his work on Black Hawk Down, where the guns were recorded in a setting that was acoustically similar to the Mogadishu, Somalia, streets where the film was set. “It all takes place in an urban environment between buildings,” Title says. “We went out and recorded all the main weapons for that film: the AK-47s, the M-16s, and the mini-guns and the 50-caliber. There’s a bit of a slap on those weapons that is natural, that came from where we recorded it.”

“There’s often eight to 10 [recorders] going from different positions,” adds Cohen. “You’ve got one that’s kind of on the side of the gun so you try and record the mechanism. There are a couple that are further away to get the boom. And some that are downfield. And some that are close to the muzzle but in back of it so the pressure wave doesn’t hit the capsule. Then we’ll take all these recordings back, and the librarian will line them up so all the recordings are in sync to each other. And that gives you a wonderful tool kit to create an interesting shot.”
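Lining the positions up so "all the recordings are in sync to each other" can be automated by finding the lag with the strongest cross-correlation between takes. A brute-force Python sketch on toy-length signals (a real library tool would work on full-rate files):

```python
def best_lag(ref, take, max_lag):
    """Find how many samples `take` lags behind `ref` by brute-force
    cross-correlation -- a toy version of the librarian lining up
    multi-mic gun recordings so they all play in sync."""
    def score(lag):
        return sum(ref[i] * take[i + lag]
                   for i in range(len(ref))
                   if 0 <= i + lag < len(take))
    return max(range(-max_lag, max_lag + 1), key=score)

# The distant mic "hears" the same shot three samples later.
close = [0, 0, 1.0, 0.6, 0.2, 0, 0, 0, 0, 0]
distant = [0, 0, 0, 0, 0, 0.8, 0.5, 0.1, 0, 0]
lag = best_lag(close, distant, max_lag=5)  # shift `distant` back by `lag`
```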

The editor can then choose a specific combination of these synched recordings to use for that single gunshot sound. “Maybe I’ll have one very tight, high-frequency, snappy gun in the center,” says Title, “and then maybe [layer in] a boomier stereo pair for the left and right. And then depending on how close it is, if it’s a big, in-your-face gun, maybe I’ll use a sweetener with a subwoofer element.”

Especially in a complex battle scene, the 5.1 panning of a gunshot can be quite important for sound placement and impact. “On close-up guns, I use a center-channel element so it’s right there, connected to the middle of the screen,” Title says. “For the most part, my general rule is to do an L/C/R and a sub. Sometimes not a sub, but definitely an L/C/R. If it’s a gunshot right in your face—a close-up of someone firing a gun—it really attaches to the image. For medium and distant sounds, I’ll use a stereo pair or just a mono. I can use that same gun, but choose a distant recording of it.”

Cohen says compression plays a big role in making gun sounds work in the movies. “We can’t really re-create the exact experience of being in the presence of the sound of a real gun,” he says, “because there’s this huge pressure wave that’s damaging to your ears. So we have to kind of imply that by making the large part of the sound last longer instead of a single transient at maximum volume. We’ll work with the sound to keep the front part of it at maximum volume for longer than a single transient. We can’t really make it louder, but we can make the loudest part of it last a little longer.”

For this application, Cohen typically sets high ratios and low thresholds on his compressor plug-ins; McDSP’s Compressor Bank and ML4000 are among his favorites. “All the big Hollywood-sounding guns and big punches and stuff, we’re listening to a lot of compression artifacts,” he says. “You just want to find compressors and limiters and things that behave the way tubes and tape did, and give us that large warm sound, which the audience has been taught represents size and power. The sound of a Hollywood punch, a chin sock, that doesn’t exist in the real world. It’s a compression artifact.”
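The high-ratio, low-threshold setting Cohen describes can be sketched as a simple hard-knee compressor. This is a generic textbook compressor in Python, not McDSP's algorithm:

```python
import math

def compress(samples, threshold_db=-30.0, ratio=20.0):
    """Hard-knee compression with a high ratio and low threshold:
    everything above the (very low) threshold is squashed toward it,
    so the loud front of a gunshot stays near maximum level for longer."""
    out = []
    for s in samples:
        mag = abs(s)
        if mag < 1e-9:
            out.append(0.0)
            continue
        level_db = 20 * math.log10(mag)
        if level_db > threshold_db:
            level_db = threshold_db + (level_db - threshold_db) / ratio
        out.append(math.copysign(10 ** (level_db / 20), s))
    return out

# A fast-decaying transient: compressed, the decay is far more gradual,
# so (after makeup gain) the loud part of the shot lasts longer.
shot = [1.0, 0.5, 0.25, 0.125, 0.01]
squashed = compress(shot)
```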

There are other elements to gunshots, such as the sound of the bullets flying. “‘Bullet whizzes’ or ‘whiz-bys,’” says Title, “are those little ‘chu, chu chu’ sounds. If you watch a movie like Glory, they really played up the sound of the bullets going overhead. And it really added to the tension. And then there’s the bullet impacts, which also represent danger. And you get the sound of bullets not only impacting and going into things, but also ‘ricos,’ ricocheting off of things. And that whole package of the gunshot, the whizzes and the bullets impacting or ricocheting is what really makes the gun battles exciting.”

Other types of weapons are also created as larger-than-life sounds and often have multiple layers. Assells talks about working on the sword sounds for Gladiator, where he started with sword recordings from Soundelux’s extensive library. “Sometimes the sword sounds aren’t as big or as heavy as you want, so you can pitch them down,” he says. “You could add sounds such as bells hitting.” He also added whooshes, which were synchronized with the arc of the sword movements. “If you watch the picture again and listen,” Assells says, “you’ll hear that whoosh on almost every impact.

“These gladiator battles were so full of stuff that you try to find a place in the soundscape so it would kind of sit by itself,” Assells continues. “An old sound editor trick is to put a gap before the impact sound, an actual gap of silence, because it makes the sound hit bigger. We use this a lot in gunshots and gun battles, where a gun will be fired rapidly and you’ll actually make a little empty frame or two of no sound at all so that when the sound hits—boom, it smacks you, and the previous sound is not bleeding into the next sound.”
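The gap-before-the-impact trick is easy to express in code: zero out a film frame or two right ahead of the hit. A Python sketch, assuming 48kHz audio against 24fps picture:

```python
SAMPLE_RATE = 48000
FRAME = SAMPLE_RATE // 24          # one film frame of audio at 24 fps

def punch_hole(track, impact_start, frames=2):
    """Zero out a couple of film frames just before an impact so the
    previous sound isn't bleeding into it and the hit lands harder."""
    gap = frames * FRAME
    start = max(0, impact_start - gap)
    return track[:start] + [0.0] * (impact_start - start) + track[impact_start:]

rapid_fire = [0.3] * 48000         # one second of continuous gun rattle
with_gap = punch_hole(rapid_fire, impact_start=24000)
```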

Jon Title recently finished work as sound designer for Final Destination 5, in which he had to create the myriad sounds of a massive suspension bridge collapsing. He also worked on Black Hawk Down (pictured).

Another major effects category is car sounds. Assells describes how for many movies, budget permitting, there will be a custom “car series” recorded. “A car series means they’ll take a particular car and record everything it does,” he explains. “So you’ll have a library of a 5-mph-by, a 10-mph-by, 15—all the different speed-bys. It’s backing up, slowing down, chase driving. They’ll do a parking-lot-maneuver series where it just kind of creeps around a parking lot, slowing down and getting quicker.”

Recordists will place mics in the tailpipe, the engine and the interior. The different recordings can be synched and used in any proportion that works for the given moment in the film. “Let’s say you’re driving along,” says Assells. “It’s a dialog scene, two characters are talking. You can use the interior recording [for background], it’s kind of washy, but it’s got a nice spread to it, you’re not crowding that middle speaker [where the dialog is]. And then he says, ‘Oh no, the bad guy’s after us,’ and we’ve got a chase. He stomps on it, then you can bring in the tailpipe recordings, which are nice and beefy and throaty.”
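Moving between those synced mic perspectives is essentially a crossfade. A minimal Python sketch of a linear fade from the washy interior recording to the beefy tailpipe recording when the chase kicks off (toy mono sample lists):

```python
def crossfade(a, b, start, length):
    """Linear crossfade from recording `a` (say, the interior mic) to
    recording `b` (the tailpipe mic), beginning at sample `start` and
    lasting `length` samples. Both recordings are assumed in sync."""
    out = []
    for i in range(min(len(a), len(b))):
        if i < start:
            t = 0.0
        elif i >= start + length:
            t = 1.0
        else:
            t = (i - start) / length
        out.append(a[i] * (1 - t) + b[i] * t)
    return out

interior = [0.2] * 1000
tailpipe = [0.9] * 1000
blend = crossfade(interior, tailpipe, start=400, length=200)
```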

Although realism is important as a baseline, more often than not the recordings (or library recordings) get beefed up considerably. “The way films are now,” Assells says, “cars are over the top, and there’s CGI, and they’re doing things no real car would do. In those cases, I’ll pitch stuff down to give it more heft. Or I’ll EQ it or run it through plug-ins just to give it more bottom and more boom.”

Car sounds have many elements, and Assells likes to do separate submixes of the various categories to give the mixer ultimate control. “There will be the engine category that’s going to the engine predub,” he says. “And this is so when we final, if they want to just bring up the engine, they can. The tires will be separate, they’ll go under another predub. Rattles/impacts will go under another predub. Wind/buffeting will go into another predub. And then we have another one we call Design/Special Effects. That will go into another predub.”

Assells says that tire recordings are key to creating a convincing car sound. “A big item that we use a lot is called ‘gravel roll,’” he says. “It’s basically somebody with a mic, pointing down at a tire rolling on gravel. And we’ll use it on parking lots, on asphalt, and even though it’s not gravel on the asphalt, you play it at the right volume and it gives a sense of movement and it glues the car to the ground. Even if I’m cutting a car driving along at 35 miles an hour, I will always cut a tire steady, even if you’re barely playing it at all.”
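The "glue" layer is simply a second track mixed in at a very low level. A Python sketch of the idea (illustrative names, toy data):

```python
def glue_tires(car_steady, gravel_roll, level=0.15):
    """Layer a barely audible 'gravel roll' under a car steady; even on
    asphalt, the low-level tire texture ties the car to the road."""
    n = min(len(car_steady), len(gravel_roll))
    return [car_steady[i] + gravel_roll[i] * level for i in range(n)]

car = [0.5] * 4      # toy car-steady samples
gravel = [1.0] * 4   # toy gravel-roll samples
glued = glue_tires(car, gravel)
```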

A classic technique for making car chase sounds more exciting is to subtly layer in a lion (or other large animal) roaring to add to the tension of the sound. Although that’s become somewhat of a cliché, it can be very effective. “If you mix it right, it’s kind of subliminal,” Assells says, “so you just kind of sense something’s there.”

Title agrees. “I don’t want anyone to go, ‘Oh, I heard that lion in there,’” he says. “But for some reason, those sounds, because they’re organic, are very powerful; they feel dangerous.”

If a machine is more of a prop than the “star of the show,” the sound team has more latitude regarding how much sound to include for it. “I have this theory,” says Cohen, “that in normal life, when you’re listening to something, your brain and your ears tune out anything that they start to think is unimportant. If there’s a Coke machine outside your office door, after a while you won’t notice it. But in film sound, we have to be that filter for the audience.”

He put that theory into practice in his work for the upcoming movie Apollo 18. “The movie spends a lot of time in the Lunar Excursion Module, the lander. So all of the background machinery becomes like a character,” he says. “We have to constantly change the level of these elements in the background because your brain won’t. So we’ll come into the scene and we’ve made all these elements that sound like heartbeats, or fetal heartbeats, and they’re like the oxygen scrubbers and other little pumps and mechanisms that turn off. We have to decide moment by moment what we want the audience to hear. And we have to do it in a way that’s a little unnatural. It’s not just deciding that the background is at this level and that’s how we run it through the scene. We want to keep changing the depth of the background. Or we want to maybe introduce it loud and then bring it lower, and then bring it up later on for dramatic reasons.”

When a machine is front and center, the audience is visually focused on it, so sounds must be designed and cut to match its movements. Cohen describes such a situation in his work on the movie Wanted. “There are these scenes that revolve around this big power loom. There was a basic rhythm that was built into the production track and we would use that as our guide,” he says. “I’d come up with a bunch of [looping] rhythmic elements.”

Cohen faced the challenge of how to get all the various loops he created to stay in sync, not only with each other but with the visual motion of the loom, which wasn’t always consistent due to picture edits. His solution was to use REX loops, which can change speed without changing pitch and can follow a tempo map created in Pro Tools. “I literally had 100 different loops that would all be in rhythm,” Cohen says. “We would have heartbeats in sync with the loom, we would have train sounds in sync with the loom, we’d have all these different elements in sync with the loom. I gave some of those pieces to the composer, Danny Elfman, early on so that when the music showed up, it was based around that rhythm, too.”
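The reason REX-style slicing keeps pitch while following tempo is that each slice plays back untouched; only the trigger times move with the tempo map. A Python sketch of that retiming, using toy one-beat slices:

```python
def retime_slices(slices, onset_times, sample_rate=48000):
    """REX-style retiming: each slice plays back unchanged (so pitch is
    preserved) but is triggered at a new onset time from the tempo map,
    keeping every loop locked to the picture's rhythm."""
    end = int(onset_times[-1] * sample_rate) + len(slices[-1])
    out = [0.0] * end
    for sl, t in zip(slices, onset_times):
        start = int(t * sample_rate)
        for i, s in enumerate(sl):
            out[start + i] += s
    return out

# Four one-beat slices re-triggered at a slower tempo (100 -> 80 bpm).
slices = [[1.0] * 100 for _ in range(4)]
beat = 60.0 / 80                   # seconds per beat at 80 bpm
retimed = retime_slices(slices, [i * beat for i in range(4)])
```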

For sound designers and sound effects editors, a lot comes down to the balance between realism and entertainment value. “For the most part, you’re making a movie, it’s not real,” says Title. “Punches in fight scenes would be a good example of reality versus movies. And it’s something that we struggle with because most of the time, filmmakers don’t want to hear big, over-the-top, Rocky, Hollywood-style chin socks. They want to hear something real. Actually, they don’t want to hear what’s real, they want to hear something that’s more realistic, that isn’t over the top, but still they want it to be exciting and cinematic. So you’re kind of constantly trying to find the middle ground between reality and it being a good movie-sound experience.”

Where the sound effects will fall in that continuum depends a lot on the specifics of the movie. “Whether you want a car that sits like it was recorded on the set, or if it’s a crazy car chase where it’s a smash cut and the radiator’s in your face, zooming by, and you want to sweeten it with a big lion roar or a jet-by,” says Assells, “the film totally dictates it.”

Sound FX editing has come a long way since the days of Moviolas and razor blades. Today, the editor of choice is Avid Pro Tools, the de facto standard in Hollywood post-production houses and mixing stages. In addition to all of the sonic manipulation that can be done in Pro Tools, sound designers and sound effects editors also use a lot of plug-ins, both of the instrument and processing variety. Because manipulating pitch is so important to sound effects creation, sampler plug-ins such as Native Instruments Kontakt are very popular.

“With the samplers that they have today,” says Harry Cohen, “there are several different modes, where you put the sound on the keyboard and you play it lower and it can be lower pitched and slower, but you can also have it so that you can play it at a lower pitch and it’s not slower, it’s the same speed. You can dynamically change the pitch, you can put it through rough filters and just do a whole bunch of different stuff with it.”
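The two sampler modes Cohen mentions (pitch tied to speed versus pitch independent of speed) can be caricatured in a few lines of Python. The time-preserving version here is deliberately crude, with no grain overlap or windowing, just enough to show why the duration stays put:

```python
def tape_mode(samples, semitones):
    """Classic sampler behavior: play lower -> lower pitch AND slower."""
    step = 2 ** (semitones / 12)
    out, pos = [], 0.0
    while pos < len(samples) - 1:
        out.append(samples[int(pos)])   # nearest-neighbor, good enough here
        pos += step
    return out

def granular_mode(samples, semitones, grain=256):
    """Time-preserving repitch: repitch each grain with tape_mode but
    keep laying grains down at their original spacing, so the overall
    duration stays (roughly) the same while the pitch moves."""
    out = []
    for start in range(0, len(samples), grain):
        g = tape_mode(samples[start:start + grain], semitones)
        g = (g + [0.0] * grain)[:grain]  # pad/trim back to grain length
        out.extend(g)
    return out

src = [0.0] * 4096
octave_down_tape = tape_mode(src, -12)      # roughly twice as long
octave_down_gran = granular_mode(src, -12)  # same length as the source
```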

Convolution reverbs are also very useful for sound effects. Cohen says he uses Audio Ease Altiverb and Ircam Spat. Jon Title talked about using the convolution reverb built into Kontakt. These reverbs are more often used as sound-design tools, or on internal layers of a multilayered effect, rather than to provide overall ambiences. That task is typically left for the sound mixers. Cohen, Title and Assells all talked about being pretty ginger with their use of reverbs on the predubs they submit to the mixer. “Generally,” says Assells, “I’ll present it two ways since tracks are free. I will pre-reverb something and paste it in there, and I’ll also give them the raw sound so that if a reverb that I’ve created isn’t matching what they think it should be, they could either turn it off or go to my untreated sound and reverb it the way they think it should sound.”

“Ninety-nine percent of the time they like it and they use it,” Cohen says, “but by rendering it separately, I haven’t locked us into anything.”

Chris Assells has cut a lot of automotive sounds in his career and has quite a few tricks up his sleeve for creating more impactful sounds. Here, he explains a couple.

From a sonic standpoint, how do you make a car chase more exciting?
There are several things you can do. For instance, a close-up car-by by itself can sound great and very exciting. But if you sweeten it with a jet-by…Here’s something that works great on motorcycles: You have a recording of a small helicopter “whip pan,” which means the helicopter is coming one way and your microphone is pointing in the opposite direction. As the helicopter comes by, you sweep the microphone in the opposite direction that the helicopter is coming so there’s a real big peak. So you don’t hear it coming that much, and when it’s in your face, it goes “pow” and pushes the air into your face. For some reason, I found on motorcycle chases, you take a smaller helicopter like a Little Bird and you use a whip pan on one of those to make it go by, and it raises the hair on the back of your neck. It works just wonderfully. Jets work very well for close-up car-bys. Sometimes you want to throw some stuff in the sub when it’s right in your face so your pant leg kind of wiggles a little in the wind as it comes by.

Is the whip pan recorded in stereo?
Generally yes, but even a mono recording of it will work because what you want is the dynamic of that thing coming into your face; coming from almost nothing. And again, it’s because the microphone is [initially] pointed away. Let’s say the helicopter is coming from the right, your microphone is pointed to the left. As the helicopter crosses in front of you, you bring your microphone from left to right. And again, so you’re not really hearing it coming that much, but when it peaks, it really pops on the peak.

Do you have any other tips for cutting car sounds?
This took me a while to learn. This is for a car coming in and stopping. Generally, cars today when they come in and stop [with an automatic transmission], the engine disengages and the car kind of floats in and stops. In that situation, I will frequently use a “reverse in.” And the reason is, you’ve got the tailpipes coming at you, which gives you a sound of the engine instead of just air coming in. Also, if you’re in reverse, the transmission is tied to the engine so that the engine is slowing the car down instead of just this air coming in. When you go to park your car, listen and you’ll see what I mean. When you take your foot off the gas and put it on the brake, the engine disappears. It becomes featureless and boring. I try to get these guys, when they record the car series, to do “in-and-stops” in low gear. That way, the engine is tied to the transmission. The transmission is slowing the car down, and you can actually feel and hear the engine slowing down.

Jon Title was tasked with designing the sound of a massive bridge collapse in the recently released Final Destination 5.

Obviously, a bridge collapse isn’t something you can go out and field record. How did you create those sounds?
The way I did it was to just take it one element at a time. In that case, we have a huge library of sounds, and I’m very lucky to work in a company that has this. You start there. In the case of that scene, you start with concrete crumbling and breaking. And metal ripping and groaning. And then you go on, using explosions strategically to help with the power of it. You try to use all the speakers, to make it even more massive, using the LFE or the subwoofer. And you just keep building on it. In the case of that bridge, and with a lot of movies that we’re working on, it’s so CGI-intensive. The visuals are made in a computer and a lot of the time they’re not done until days before we’re finished mixing. So it’s an evolving process of adding and taking away as we continue to work on it.

When the bridge actually falls, I assume you put in a really big roar?
It’s a roar, it’s earthquake-type sounds, it’s cables snapping. The constant sounds would be like earth rumbles and concrete crumbles, punctuated by these explosions and metal snaps and things like that. And then interesting things on top of that: metal groans and moans and higher-frequency screeches that can cut through music.

And a lot of stuff rumbling in the subwoofer, I assume?
Oh yeah.

And what do you use for that?
Sometimes I like to leave that until we’re on the mixing stage. It can be hard to know how a low-frequency sound, especially something huge and sustaining, is going to translate to a big theater. That being said, I think in the case of the bridge, I did cut a low-frequency sweetener for that, which I think was an earthquake rumble that I then wrote the volume on, just to kind of help the arc of the action.
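Writing the volume on a sweetener, as Title describes, amounts to applying a breakpoint envelope to the sound. A Python sketch with linear ramps between (time, gain) breakpoints (illustrative, not Pro Tools' automation engine):

```python
def write_volume(samples, breakpoints, sample_rate=48000):
    """Apply volume automation: linear ramps between (time_sec, gain)
    breakpoints, like riding a fader on an LFE sweetener to follow
    the arc of the action."""
    out = []
    for i, s in enumerate(samples):
        t = i / sample_rate
        gain = breakpoints[-1][1]          # hold last value past the end
        for (t0, g0), (t1, g1) in zip(breakpoints, breakpoints[1:]):
            if t0 <= t <= t1:
                gain = g0 + (g1 - g0) * (t - t0) / (t1 - t0)
                break
            if t < t0:
                gain = g0
                break
        out.append(s * gain)
    return out

rumble = [1.0] * 96000                 # two seconds of earthquake rumble
shaped = write_volume(rumble, [(0.0, 0.0), (1.0, 1.0), (2.0, 0.3)])
```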

So you automated some volume changes in Pro Tools?
We do a lot of that.

Mike Levine is a New York–area recording musician and music journalist. He’s the former editor of Electronic Musician.