Sound for Picture - Ender’s Game: Earth vs. Alien Creatures in Future Space Wars

Ender’s Game, director/screenwriter Gavin Hood’s ambitious and arresting film version of Orson Scott Card’s best-selling 1985 science fiction novel of the same name, has deservedly received rave notices for its imaginative depiction of life on enormous space stations, mock battle exercises in a zero-gravity geodesic sphere and intergalactic wars involving thousands of spaceships. Those are just three of the environments and situations that challenged the sound team. Then there’s also the fiery destruction of a planet, a wild videogame within the story, a prominent insectoid figure, futuristic weapons and vehicles, implications of mental telepathy, and scads of assorted rooms, chambers and atmospheres in space and, briefly, good ol’ planet Earth. This was most assuredly a CGI designer’s playground, but it was an amazing sound canvas, too.

Spearheading the sonics on Ender’s Game was supervising sound editor and sound designer Dane Davis, who is perhaps best known for his Oscar-winning (and highly influential) work on The Matrix and its two sequels, but whose long resume includes dozens of feature films in many genres—from Boogie Nights to the two-part Twilight: Breaking Dawn—as well as documentaries, shorts and even videogames. Working in consultation with director Hood and noted film editors Lee Smith and Zach Staenberg, Davis and his sound-editing crew at Danetracks, now on the Warner Bros. lot in Burbank, began their design work long before anything approaching completed visuals was available. Taking their cues from paintings on Hood’s iPad, animatics, the script, and the original book, they spent countless hours conceiving of ways to make every ambience interesting, every action piece somehow believable.

Central to the story is an imminent war between humans and a super-species of large bug-like aliens known as Formics, who attacked Earth years ago and were beaten back, but are believed to now be gearing up for a full-scale invasion. Davis approached the task by constructing the sounds for the Formic spaceships and the human-piloted ones from different fundamental materials: for Earth’s vessels, he used metal sounds exclusively, while the Formic craft grew from living sources. “I think of their spaceships as being secreted by these insects, not manufactured from materials,” Davis says from Chicago, where he’s been working on Jupiter Ascending, his sixth film for Matrix directors Andy and Lana Wachowski. “They’re exoskeletal shells, and the way they move, and the spaceships and the weapons—it all feels like it’s evolved biologically. It’s also the physics of the materials that helps define how I approach the sound. It’s part of the storytelling.

“So I thought: ‘How about if the Formic ships are made entirely out of insect sounds?’ By and large, I used single individual insects: a single locust, a couple of single crickets, some grasshoppers. It had to scale—one, 200, 50,000, 200,000—but always still sound organic, like they were living things. So obviously there was a lot of manipulation and a lot of layering. I wanted [the Formic ships] to sound sort of anti-metal; humans tend to use metal for all this stuff, even if it’s a composite material. I made everything for the human ships out of metal sounds, including the thrusters—metal slowed down a lot and modulated and overloaded.”

Where did the initial metal sounds come from? “Just the sound of metal being stimulated or struck,” Davis says. “Like, I recorded a bunch of candlestick holders, pots and pans and wind chimes and then manipulated those sounds in a lot of ways. I would bang a metal candlestick holder that was on the Foley stage really hard with a mallet and then layer that up around 30 times on itself in Pro Tools. I’d create a longer resonance out of it that’s varying, and not so mathematically simple as a reverberation algorithm, and overload them a little bit to give them more real-space acoustics, and also modulate them. I love using MondoMod, a great old Waves plug-in I use all the time, and I also use SoundToys’ Tremolator, and the GRM Doppler, in particular, to create this virtual acoustic space, where sounds change when they shift in the environment.”

Among the other plug-ins he employed for the sound design were SoundToys’ Decapitator, PSP Audioware Vintage Warmer2, Waves’ Kramer Tape, the Massey TapeHead and Serato Pitch ‘n Time (which he called “indispensable”). He adds, “Outside of Pro Tools, I use several custom programs from Mike Schapiro of Schapiro Audio, who also makes the beautiful Skillet controller I employed extensively for panning and automating plug-ins. For Ender’s Game I used his prototype, but he will be selling Skillets at the end of the year.”
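Davis’s stack-and-modulate recipe (layer a struck-metal hit dozens of times at varying offsets, then amplitude-modulate the composite) can be sketched in a few lines of DSP. This is a minimal numpy illustration, not his actual Pro Tools workflow; the synthetic “strike,” the layer count and the modulation settings are all invented stand-ins:

```python
import numpy as np

SR = 48_000  # sample rate (Hz); Davis records at 192 kHz, reduced here for brevity

def strike(dur=1.0, f0=620.0):
    """Stand-in for a recorded metal hit: a decaying cluster of inharmonic
    partials (in practice this would be a candlestick/pot/wind-chime sample)."""
    t = np.arange(int(SR * dur)) / SR
    ratios = [1.0, 2.76, 5.40, 8.93]          # roughly bell-like partial ratios
    sig = sum(np.sin(2 * np.pi * f0 * r * t) * np.exp(-t * (3 + 2 * r)) for r in ratios)
    return sig / np.max(np.abs(sig))

def layer(sample, copies=30, seed=7):
    """Stack the hit ~30 times at random offsets and gains, building the
    irregular, non-algorithmic 'resonance' tail Davis describes."""
    rng = np.random.default_rng(seed)
    out = np.zeros(len(sample) + SR)          # room for the offset tails
    for _ in range(copies):
        off = rng.integers(0, SR // 2)        # up to 0.5 s of spread
        out[off:off + len(sample)] += sample * rng.uniform(0.3, 1.0)
    return out / np.max(np.abs(out))

def tremolo(sig, rate=5.5, depth=0.6):
    """Crude amplitude modulation, standing in for MondoMod/Tremolator."""
    t = np.arange(len(sig)) / SR
    return sig * (1 - depth / 2 + (depth / 2) * np.sin(2 * np.pi * rate * t))

bed = tremolo(layer(strike()))
```

The randomized offsets are the point: they keep the composite tail from decaying with the mathematical regularity of a reverb algorithm.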

The base metal recordings were made using a variety of microphones into Sound Devices 722 recorders. To best capture what Davis calls the “perturbed upper harmonics” of the struck metal, “We needed mics that are extremely sensitive to high frequencies, including the Schoeps XTs, but there are also some others that I’m really proprietary about that go much higher. I always record with several different mics to see what’s up there [in the higher frequencies], because as you’re pitching stuff way down, you want a lot of that. I usually record at 192 [kHz], and we also had a True [Systems] preamp that’s really clean when we recorded directly into Pro Tools HD.”
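The payoff of capturing at 192 kHz is easiest to see with a varispeed-style pitch drop: content recorded above the audible band slides down into it. A hedged sketch follows; the linear-interpolation resampler, the 40 kHz test partial and the two-octave drop are illustrative choices, not Davis’s actual process:

```python
import numpy as np

def pitch_down(sig, semitones, sr=192_000):
    """Varispeed-style pitch drop: resample so playback is slower and lower.
    Content recorded near sr/2 (up to 96 kHz at a 192 kHz rate) falls into
    the audible band, which is why those upper harmonics matter."""
    factor = 2 ** (semitones / 12)            # drop of N semitones lengthens output
    n_out = int(len(sig) * factor)
    src = np.arange(n_out) / factor           # fractional read positions
    return np.interp(src, np.arange(len(sig)), sig)

# A 40 kHz partial (ultrasonic at capture) dropped two octaves lands at 10 kHz.
sr = 192_000
t = np.arange(sr // 10) / sr                  # 0.1 s of signal
tone = np.sin(2 * np.pi * 40_000 * t)
lowered = pitch_down(tone, semitones=24)      # two octaves down, 4x longer
```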

Interestingly, for the sound of a planet being incinerated, he relied entirely on human sounds—namely, his own voice, again seriously manipulated. “I use voices a lot,” Davis says. “I wanted it to be a sympathetic sound that wasn’t cold, because it was important for there to be an emotional connection to this. Fire itself is a difficult material to work with [sound-wise]. In movies that use fire, you usually have to use other things to make the fire sound interesting and emotionally evocative.

“Also, when you’re on the Eros planet [once a Formic colony], I used human screams to make all the exterior and interior ambiences, because I wanted them to have an emotional, empathetic quality. Before the humans put a command base on this planet, they basically had to kill off all the Formics, and I wanted it to always feel like it’s haunted; like the ghosts of these insects—their enemies—are haunting the people there.”

On the other hand, the deep-space battle school, where the young hero of the film, Ender Wiggin, joins an elite group of kids and teens learning to become space commanders, called for a much different treatment. “The school looks shiny and new; it’s not like something out of Alien,” Davis says, “but Gavin wanted to suggest that there’s also this sense that it’s maybe starting to wear out, because time is running out. Maybe the axle at the core of this rotating school is wearing out, so I created ambiences in every room where you sort of hear this deep thud, and even the air-handling life-support systems are all looping in the same intervals all the time. We’re trying to subtly suggest that things are not that stable,” which adds to the overall tension of the build-up to war.
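That “everything loops at the same interval” idea can be mocked up as a room-tone bed with a low pulse recurring on a fixed period. This is purely illustrative; the hiss level, the 45 Hz thud and the one-second interval are invented values, not taken from the film:

```python
import numpy as np

SR = 48_000

def thudding_room_tone(seconds=4.0, interval=1.0, seed=3):
    """Room tone with a deep 'thud' repeating at a fixed interval: a toy
    version of the worn-axle ambience idea (all values are invented)."""
    rng = np.random.default_rng(seed)
    n = int(SR * seconds)
    bed = rng.normal(0, 0.02, n)              # faint broadband air-handling hiss
    t = np.arange(int(SR * 0.4)) / SR
    thud = np.sin(2 * np.pi * 45 * t) * np.exp(-t * 8)   # 45 Hz decaying pulse
    for start in range(0, n - len(thud), int(SR * interval)):
        bed[start:start + len(thud)] += thud * 0.8       # same interval, every time
    return bed

amb = thudding_room_tone()
```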

Ron Bartlett (dialog/music mixer), Chris Alba (Foley supervisor), Alex Gibson (music editor), Lee Smith (picture editor), Eric Lindemann (sound designer), Dane Davis (sound designer/supervising sound editor), Stephanie Flack (co-supervising sound editor), Matt Kielkopf (first assistant sound editor) and Doug Hemphill (FX mixer).

Eric Lindemann, who has worked with Davis since the first Matrix film, was tasked with designing the weapons and explosions, and sound designer/mixer Tom Ozanich handled the FX predubs. Typical of these sorts of CGI-heavy films, the design work didn’t stop when the final mix began, as late-arriving visuals required constant sonic updating. “Our lives are always enslaved by the visual effects schedule,” Davis says, with absolutely no hint of bitterness. “The schedule was even extended six months, but everything still ended up being last-minute. By the time we finally had something onscreen, we had very little time to build these battle scenes from what I had through all the temps to what was now on the screen. It was a huge job. At one point I didn’t leave my studio for six days.”

Ron Bartlett came onboard as re-recording mixer for dialog and music, while Doug Hemphill handled FX. The duo is one of the most respected teams in L.A., having worked together on such recent films as Life of Pi (for which they were Oscar-nominated), Prometheus, Rise of the Planet of the Apes, X-Men: First Class and Sherlock Holmes, along with dozens of others both together and with other mixers. (Next up for the team: The Maze Runner and X-Men: Days of Future Past.) Ender’s Game was mixed on the John Ford post stage at Fox in L.A. using the enormous Neve DFC console and also an Avid Icon.

“This mix was very challenging, and a lot of it had to do with the way films are mixed now,” Hemphill comments. “The way Ron and I mix, our first order of business is that you have to hear what the actors are saying. Lee Smith was running the mix, and like us he lets the performance dictate. That is, you’re not sitting there looking at the back of the mixer’s head while he stops constantly and analyzes every detail; you’re rolling forward at 90 feet a minute. So we were constantly moving ahead in real-time mixing.

“We wanted a fast tempo to see what ideas came out of working quickly. I know a lot of people like to work that way now—from Michael Mann to Ridley Scott to Peter Weir; it’s a long list. What that means as mixer is you’re almost watching as an audience member and you want it to sound a certain way. Dane is very good at that and so is Lee—they’re both very instinctual and very quick.”

Bartlett certainly had his hands full keeping the all-important dialog clear in a multitude of unusual settings. “I used every trick in the book,” he says with a chuckle. “I used a lot of different reverbs and IRs [impulse response recordings that match particular ambiences]. There are obviously a lot of different spatial environments to deal with. There’s the giant geodesic dome they battle in, and we had several different things going on there: a delay that would bounce off the walls, and a couple of different reverbs. I would use certain reverbs [on the dialog] when they were closer up, so it gave more of an intimate feel—like when Ender and Bean are talking among themselves, you hear the other kids battling off in the distance with a much longer reverb and more delays to give a lot of perspective. I wanted to make that place seem interesting and huge because it’s so amazing visually. Then, later, when they’re in that huge [battle] simulator room, you’ve got all the admirals and everybody behind the glass watching, and their voices are coming from a P.A., so I wanted them to have a real commanding presence. When Harrison Ford [playing stern Colonel Graff] speaks, it really fills the room. I wanted Ender to feel alone and isolated, and Harrison to have a big, authoritative voice.” What tools would accomplish that? “Mostly plug-ins: [Audio Ease] Speakerphone and Altiverb. I used a P.A. amp and a slapback delay with another reverb on top of that to create that effect.

“I have a whole Pro Tools rig just for processing, and then I send it through the board,” Bartlett continues. “I’ve been doing that for quite a while now, because it takes so much horsepower. I don’t play other tracks off it or do anything else with it. I use a bunch of plug-ins, like the ones I mentioned, plus Waves, Eventide and even some guitar stomp boxes at times.”
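Bartlett’s P.A.-voice treatment pairs speaker coloration (Speakerphone) with a slapback delay and reverb. The delay half of that chain reduces to a short single-tap echo with a few decaying repeats; the sketch below is a generic approximation, and none of the parameter values are his actual settings:

```python
import numpy as np

SR = 48_000

def slapback(sig, delay_ms=110, mix=0.5, feedback=0.25):
    """Single-tap slapback echo: a generic stand-in for the delay part of the
    P.A. chain Bartlett describes (delay time, mix and feedback are invented)."""
    d = int(SR * delay_ms / 1000)
    out = np.copy(sig).astype(float)
    # three decaying repeats approximate a short feedback delay line
    for k in (1, 2, 3):
        tap = np.zeros_like(out)
        tap[k * d:] = sig[:len(sig) - k * d] * mix * feedback ** (k - 1)
        out += tap
    return out

# Usage: a unit click renders as the dry hit plus three fading repeats.
click = np.zeros(SR)
click[0] = 1.0
wet = slapback(click)
```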

One unexpected challenge for Bartlett was dealing with some of the ADR lines of the 15-year-old actor playing Ender, Asa Butterfield: “His voice changed a little over the course of making the film, so when it was time to record his ADR, the looped lines were a little deeper, and we did some pitching and EQing to help match him up.” Butterfield also grew two inches during the film’s production.

Bartlett also notes, “Stephanie Flack was the dialog editor [and co-supervising sound editor, with Davis] on this film, and her attention to detail was amazing. I’ve had the pleasure of working on many films with Stephanie, and she always brings a high level of compassion and craft to her work.” Flack also worked very closely with Foley supervisor Christopher Alba.

Steve Jablonsky’s evocative music score is another key part of the soundtrack, eerily and effectively combining FX-like tones and textures with orchestral passages. “Steve is a fantastic composer,” Bartlett says. “It was a real pleasure mixing his score. He’s one of the few composers out there who really blends the electronic and synth-type sounds with the orchestra. A lot of times you’ll find a score that has a real strong orchestral sense, and then they’ll add a few synth tracks on top, and it doesn’t always blend that well. But Steve really integrates his into the score. Steve is also a master of writing around dialog lines and certain action moments, so he really knows how to sculpt a score. I must give Alex Gibson, the music editor, a lot of credit for that, as well. He’s one of the best at that—shaping a score and helping his composer out. Alex really helped elevate the film.”

As on all films, the final mix for Ender’s Game was a massive, intense and time-consuming balancing act among dialog, music and FX. Sacrifices are made for the sake of the storytelling; egos are checked at the door for the greater good of the film. That slaved-over effect might be lost under a newly emphasized music cue—and vice-versa, depending on what the director wants to emphasize at a given moment. It’s all about choices.

Take the climactic battle scene, for instance. “In a film, a battle is an impression,” Hemphill says. “There’s so much information going by, particularly in a film like this set in space, that there’s no way an audience member can process all those visuals and sounds to the nth degree, so you create an impression of what’s going on. That’s what an audience wants. No one wants to feel as though they have all the TV channels on at once and they’re being bombarded. So we were very selective with the FX, interleaved the music—which is gorgeous—and in that scene there’s also a lot of dialog, and it’s very important to hear it. We did a pass where we listened to it and Lee said, ‘I want to pump the FX up a little bit, because at this point the audience wants that.’ But the music is still very important all through that, too. Peter Weir once told me, ‘I want the music to fight for its right to be here.’ But Jablonsky always let the actors shine through; he never overwhelmed them.”

Because so many of the settings in Ender’s Game have a hyper-dimensional aspect to them—limitless space, weightless environments, a humongous underground cavern, etc.—it’s a film that lent itself beautifully to a 9.1 Dolby Atmos mix. The galactic battles come alive when the spaceships can soar above the audience, though Bartlett notes, “Sometimes the more subtle sounds actually work better in that format. We found that the denser the sound field gets, the less you understand where it’s coming from. But I think we did a good job of sorting things out to where it wasn’t overwhelming, because it easily could have been. You have to really take a clear stance on what tells the story best in whatever format you’re mixing for.”
