The first time composer/engineer Steve Wood wrote for orchestra, he had to purchase a one-month pass to his chiropractor. No joke. His back went out before the final sessions for the IMAX film Eureka. The afternoon the recording was complete, his back magically got better.
This orchestra stuff was way different for Wood, who had come from a pop music background and had, most notably, worked as keyboardist and music director for Kenny Loggins for nearly ten years. Prior to that, in 1972 while a member of the Southern California band Honk, Wood met up with Greg MacGillivray when the band was asked to do the soundtrack for the film Five Summer Stories.
“A surf movie is very much like a porno movie in that it doesn’t have a particular plot,” Wood says with a laugh. “But Greg was really into the musical aspect of the film and actually cut the film to the music. We did that film in stereo, too, which, for the time, was quite progressive. He was very interested in the high quality of presentation, even then.”
The innovation of IMAX in the early ’70s appealed to MacGillivray, who, with Jim Freeman, began MacGillivray Freeman Films, specializing in the large-screen format. When working on Speed in 1984, MacGillivray called on Wood, who jumped at the chance for his first large-screen project. It opened new doors and challenges, the orchestra just being one of them.
“If you come from the MIDI world that so many of us come from now, it’s difficult to deal with the huge, unwieldy, mammoth-tusked beast that is an orchestra,” Wood says. Knowing that he needed help, Wood teamed up with Daniel May, whose doctoral degree in composition from Cornell University complemented Wood’s “street” degree perfectly.
“Just the sound of a piano is different,” Wood says. “Raised in the pop music field, you’re used to a bright, super-close-miked instrument. If you play acoustic piano samples on a synthesizer, it’s as bright as all get-out. To this day, studio pianos often have their hammers hardened with lacquer to give them a brightness that cuts through a track. Then you get into orchestral music where the piano has fluffy hammers and is miked at a distance, and it’s a real wide, not direct, sound. You’re so not used to that sound that you can hardly hear the music. It’s the same thing with an orchestra itself. If you’ve only worked with string samples and synthesizers, and you get into the real room, first of all, you don’t know much about the potential for what the instruments can really do. Then they start playing, and they’re not totally in tune. If you’re used to having a tuning machine and a sample that’s in tune, when you start hearing a real orchestra, it’s strange. It’s like the difference between a real person and a picture in a magazine: it moves in unpredictable ways, it talks back, and it smells, and it takes you a while to get over that.”
For The Living Sea, Sting, an avid environmentalist, contributed music in the form of original masters, which were sweetened (often by replacing sampled instruments with real ones) and remixed for the surround format by mixmeister Terry Nelson. “Sting also sent some home demos of two ideas that had never reached fruition that I arranged for use in the film,” Wood says. “It is very difficult working with the music of someone you respect, utilizing the basic aspects of their music to do something with their music that is pleasing to the director while still maintaining perspective on the sensibility of the original art.
“For this current project, Dolphins, Sting has actually written a theme song specifically for the film. He did a demo of it that had a little bit of a country flavor, and then I did a demo of it with a little more of a calypso feel since the film takes place in the Caribbean. Now they’re doing an animated sequence in the beginning that is basically locked to that tempo. Then Sting got that demo, and he’s going to do another version of it.”
George Harrison’s music was used in Everest, the biggest IMAX film to date, with domestic box office exceeding $70 million. “We didn’t find that the original recordings were appropriate, but some of George’s haunting melodies were perfect thematic material,” Wood explains. “In fairly typical fashion, Dan and I spotted the film with Greg MacGillivray and co-writer/editor Steve Judson. Of course, they had their ideas of what they wanted the music to convey, and then Daniel and I decided who wanted to tackle a specific cue and whether it should incorporate a Harrison theme or be an original piece.”
Wood writes primarily at a Laguna Beach acoustic recording space designed for him by Chris Pelonis in 1994. “Daniel and I have almost identical equipment in our studios, including the ‘I’m cool’ tube and analog stuff. But because of the matching Mackie digital 8-buses, Mac-based sequencers, and Roland JV-1080 synth modules and Akai S5000 samplers, we were able to make rough sketches alone at one studio, get together and hone them further, and then play them for Greg and make any necessary changes. Daniel did many of the final orchestrations, with the remainder covered by Bill Boston. If a cue was heavily ethnic or pop in style, we laid down scratch synth tracks. Otherwise, Daniel conducted the orchestra, while I covered click in the booth. Then I orchestrated the overdubs and did the final mix.”
The differences between IMAX theaters and regular movie houses dictate some of the music-making and recording process, Wood has found. First, the room is much larger, which leads to the second big difference: there is a greater distance between speakers. “A regular theater is rectangular, with the narrow side toward the screen,” Wood says. “An IMAX theater is sometimes wider than it is deep, so some of the basic things you would tend to do that would sound nice and rhythmic in your studio will sound crippled in IMAX. If you take a rhythmic ostinato and pan it around your room, every eighth or 16th note in a different speaker in a regular near-field setting, it’s really cool. You hear it distinctly coming from every speaker, and it doesn’t mess with the rhythm. But if you get into an IMAX theater, you could be in the back corner, and you may be 100 feet from one speaker and ten feet from another, which changes everything.
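The rotating-ostinato trick Wood describes, each successive note assigned to the next speaker in the room, can be sketched in a few lines. This is purely an illustrative sketch of the idea, not his actual tooling; the note names, channel count and `pan_ostinato` helper are hypothetical.

```python
# Hypothetical sketch: rotate each note of an ostinato to the next speaker
# channel, the near-field panning trick Wood describes. In a small studio the
# rhythm stays intact; in an IMAX house the wildly different speaker distances
# (100 feet vs. ten) smear the pattern with differing arrival times.

def pan_ostinato(notes, num_channels):
    """Assign each successive note to the next channel, wrapping around."""
    return [(note, i % num_channels) for i, note in enumerate(notes)]

# Eight 16th notes rotated across a 6-channel layout; each tuple is
# (note, channel index), wrapping back to channel 0 after the sixth speaker.
pattern = pan_ostinato(["C", "E", "G", "E", "C", "E", "G", "E"], 6)
```

The same mapping with `num_channels=2` would model an ordinary stereo ping-pong, which is why the effect only gets fragile as the speaker count and room size grow.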
“I found you have to be careful about sustained low end,” he continues. “If you really want to give people a feeling of low end, it’s better to do it with more percussive elements, like a bass drum, that come and go, so if you do have an overabundance of low end in the theater, it’s short-lived and becomes more of an effect. If you do it with sustained bass, it can ruin the whole piece.
“In Everest, for example, the only way for the music to not get turned down when the avalanche came down was to not have any low end,” he explains, echoing a similar conflict between music and effects that takes place regularly on dub stages in traditional Hollywood. “If you have a lot of low end in the cue, it conflicts with what’s really going to be important in the scene, which is the sound of the snow. If you want your music to stay up, you have to think about what aspect of the sound effect is really the crucial element, giving the feeling of what is going on. In that case, it’s going to be the tremendous rumbling sound, so I try to do it with cymbals and things that are going to give high-end energy to the scene but won’t conflict with the effect.”
The final mix for Everest (for which Wood, May and Harrison received the Maxi Award for best score) was done in an actual IMAX theater in Irvine, Calif. “It was a large orchestra, which we did in a huge hall-60 feet high, 60 feet wide and 120 feet deep,” he says. “Scoring engineer Steve Smith put mics way back in the room, which I panned to the rear in the final mix, and it really made for beautiful surround sound that you could never achieve with any electronic reverb. They wanted this giant sweep that you could only get from an orchestra to match those beautiful shots of Everest. I had ethnic musicians, too-Tibetan monks, who were on tour in Long Beach. Originally, we went into EFX with Ken Teaney, who has done the final mixes on all our films. It’s a nice, big room, but it’s not an IMAX theater. We did what we call stems, where you take your music and make a music stem, which is all the music combined. I’ll have a 6-channel mix, and sometimes the music overlaps, so I really need 12 channels for the final mixdown. Then they take that and mix it down to six. The sound effects guys may have 24 channels, and that’s mixed down to six. We took the surround stems into the theater and made the final mix of the 6-channel stems-the final blending and EQ.”
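The stem fold-down Wood describes, 12 music channels (two overlapping 6-channel mixes) reduced to a single 6-channel stem, amounts to channel summing. Here is a minimal sketch under the assumption of straight channel-for-channel summation; the actual console routing and gain staging aren’t specified in the article, and `mix_to_stem` is a hypothetical helper.

```python
# Hedged sketch of a stem mixdown: two overlapping 6-channel music mixes
# (12 channels total) summed channel-for-channel into one 6-channel stem.
# Real mixdowns would also involve gain, EQ and limiting; this only shows
# the channel arithmetic.

NUM_CHANNELS = 6  # L, C, R plus three surround/sub feeds, layout assumed

def mix_to_stem(mix_a, mix_b):
    """Sum two 6-channel mixes sample-by-sample into one 6-channel stem."""
    assert len(mix_a) == len(mix_b) == NUM_CHANNELS
    return [[a + b for a, b in zip(ch_a, ch_b)]
            for ch_a, ch_b in zip(mix_a, mix_b)]

# Two toy 6-channel "mixes", one sample per channel, summed into a stem:
stem = mix_to_stem([[0.25]] * NUM_CHANNELS, [[0.25]] * NUM_CHANNELS)
```

A 24-channel effects premix folding down to six would work the same way, just with four source channels feeding each stem channel instead of two.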
Again, an IMAX mix is treated differently from a feature film mix. “When you’re doing a feature film, everything is front-oriented, toward the screen, and you use the surrounds for an effect or maybe an ambient quality, but you’d never think of featuring things in the back. One of the things I do quite often is a technique called antiphony, which was the rage back in the Middle Ages with monks sitting in the front and rear of a cathedral, conducting question-and-answer type stuff. There have been a few situations where I’ve recorded the brass and the strings at different times, and I put the strings in front and the brass in the back, and they would do question-and-answer. When you get in a real large theater like that, the location of the speakers is not completely distinct because it’s such a big room. You automatically have somewhat of a general reverb on everything because it really does have its own reverb in the room. If you’re really aiming for it, though, you can get a pretty distinct front-back image. If the scene is really focused on the front, I won’t necessarily use the rears for anything real attention-grabbing.
“In The Living Sea, there are jellyfish that have lived in this particular lake in Palau for a million years. They’ve developed their own genetic form, and they don’t sting, because there are no predators in the lake. Every morning they swim across the lake, following the sun, and then at night, they dive back to the bottom and return to the other side. In one scene, we’re underwater and we see them approaching in the distance, but we can’t quite figure out what they are, so I focused the music on the front. Then as they start to come into focus, I start throwing a kind of strange, effected delay to the rear that gives us the feeling that they’re coming toward us and that there’s something a little eerie about something behind us. As the diver starts swimming through them, I make the music an entire surround piece, where we don’t know whether it’s coming from the front or the back. As soon as they start surrounding us, I have the string players play a col legno pattern, where they take their bows, turn them upside down and beat on the strings, randomly hitting different notes. It created a dense, eerie, bristling effect that you can’t do with synthesizers.”
Project studio technology has changed a great deal since Wood’s first IMAX project in 1984. “I didn’t have the capability on the first one to actually lock to picture,” he recalls. “Some people obviously did at the time, but I was working with a 2-inch 24-track, and I believe I had an old sequencer that ran on a sync tone. I don’t know how the hell it ended up all synching together. I think the biggest change is SMPTE and project studio equipment that all works together, speaking MIDI Time Code and SMPTE. Now having the digital 8-bus is ridiculous as far as flexibility, memory and the potential for what you can do. The ironic thing is you get all these things to save time, and none of them does, because they just give you more options. The time you used to spend just trying to make it work is now spent making it work in different ways. Nothing saves time; it just adds possibilities,” he says with a laugh.