AS LITTLE AS 10 YEARS AGO, sound and music for games were practically afterthoughts, usually handled by the same person, generated on MIDI synths and only occasionally using samplers and libraries for effects, and wedged into the game at the end, usually at 8-bit resolution. How things have changed! The video game industry has become a multi-billion-dollar worldwide phenomenon that now rivals (and often exceeds) the film and record industries when it comes to product volume, profitable companies and user loyalty. With millions to be made, the stakes have gotten higher and the pressure more intense to have the most amazing visuals and the best sound possible on every new game. Today, audio designers are working at 96 kHz, in studios rivaling today’s top music facilities, to deliver 44.1k playback. We’ve gone from those 8-bit mono footsteps feebly rendered at 11 kHz to sound effects and music delivered in 5.1 CD-quality audio. With that in mind, Mix brings you this special section, a close look at the audio side of video games.
If you are a serious gamer or, like me, have children or teens in the house who fit that description, chances are you own a few games made by Electronic Arts (EA), the Redwood City, Calif.-based software company responsible (along with a number of top development partners) for some of the best-selling sports games (Madden Football, NCAA College Football, FIFA Soccer, NBA Live, Tiger Woods PGA Tour, etc.), simulation games (SimCity and the incredibly popular series, The Sims), war games (the Battlefield and Medal of Honor series) and games based on films or film characters, from James Bond to Looney Tunes to Harry Potter to the company’s skyrocketing The Lord of the Rings franchise. My household spent a lot more on games this past year than on CDs, and EA took a big chunk of that money. With revenues of $2.5 billion in 2003, a whopping 22 titles that sold more than a million copies each, and offices in Northern California, London, Tokyo, Austin, Orlando, Vancouver and Montreal that employ some 4,400 people, EA is definitely the industry’s top dog. On a rainy afternoon in early December, I visited EA’s headquarters to tour its facilities and talk to some members of the audio staff, including the team responsible for the sound on the recent hit game, The Lord of the Rings: The Return of the King.
EA made it clear that it was committed to high-quality audio a decade ago when it hired former recording industry giant Murray Allen to be the company’s director of audio production. (He has since moved up the ladder to become VP of post-production.) Allen helped equip the audio rooms in the company’s former facility in nearby Foster City, as well as the lovely new multibuilding “campus” (as they call it) just south of San Francisco. Now, another studio veteran, Fred Jones, is EA’s audio facilities director. Like his predecessor, Jones has his work cut out for him.
THE QUEST FOR HIGH-QUALITY SOUND
Though set in a somewhat anonymous business park, EA’s sprawling headquarters is everything one would hope it might be. Aside from such amenities as a soccer field, full-court indoor basketball gym and a lovely restaurant/food court that serves a wide variety of nicely prepared foods, the campus boasts a number of specially equipped rooms where games can be played and/or tested, and cubicle after cubicle of state-of-the-art computers used for everything from animation to sound design and realization. In The Lord of the Rings wing of one of the nondescript high-rise buildings, artwork, models, fantasy game miniatures and props from the film can be found on the walls and on desks, and the rows of cubicles — quiet the day I visited because the game was finished and work hadn’t quite begun on the next EA LOTR game — have “street” names pulled from Middle Earth locales. These people clearly dig their work.
Down in one of the two adjoining ground-floor studios — a THX-certified room equipped with a Pro Tools|HD 3 system, a Pro Control (replacing the studio’s former console, an SSL Avant) and two different models of Genelec speakers (1038s in front, 1032s as rear surrounds) — I sit down with Fred Jones, LOTR audio director Don Veca and lead sound designer Charlie Stockley to talk about how video game sound design is both similar to and quite different from its feature-film counterpart. (The other lead sound designer, Paul Gorman, was not available that day.) And The Lord of the Rings: The Return of the King game is particularly apt, as it incorporates extensive footage and sound effects from Peter Jackson’s films amidst the multiple levels of sophisticated third-person animated game play. In its first month of release alone, the game sold half-a-million copies, and it has been widely hailed as one of the best of 2003. With Rings mania at an all-time high, the game almost couldn’t lose, but its creators — who are also now co-perpetuators of an ongoing LOTR game franchise at EA — embrace the task of creating a mind-blowing game with the same determination and seriousness as Frodo’s quest to reach Mount Doom. Most of the sound work was done on the designers’ individual Pro Tools MIXPlus systems in small offices that are scattered throughout the complex, with the big high-end studio being used later in the process for intensive mixing and surround work.
IT’S LIKE THE MOVIES, BUT…
Whereas a video game can often have a development time of several years from conception to release, The Return of the King’s gestation period was considerably shorter because of the annual release of each LOTR film. For marketing reasons, it was paramount to have the game come out shortly before the film itself, or roughly a year after The Two Towers, which is a completely different game. “A lot of people might think The Return of the King is a sequel to the last game the way the movie is, but to us it wasn’t, because everything was in-house for this game, and we built it completely from scratch,” Veca notes.
Veca has been with EA for 12 years. Originally a multi-instrumentalist who went to San Jose State for the university’s music program, he found himself spending much of his free time in the college’s recording studios programming electronic music, and eventually got his degree in computer science. He got a job at nearby Apple Computer, where he worked on early versions of Sound Manager, MIDI Manager and QuickTime, and then was hired by EA to do audio tools and ad libs programming and authoring.
Everyone agrees that it was the eventual widespread adoption of the CD-ROM format for video games that allowed for the rapid improvement of game audio. “It’s still a video game, not an audio game,” Veca says. “But audio has been slowly catching up, and now we’re getting to the point where we’re expected to sound like a movie. So if we’re expected to sound like a movie, then we have to have the resources and the horsepower. Fortunately, we’ve had lots of support. Neil Young [executive producer of the game; no relation to the rock star] is one of the first executive producers I’ve encountered who really understands the value of video game audio and its importance in the overall game experience. We were the first THX-approved game. Why did he do that? Because he thought sound was a real priority.”
On the simplest level, creating sound for video games these days is similar to working on a Hollywood film: It usually involves original effects creation, Foley, the integration of score and, in many cases, voice-over. For The Return of the King, the game’s sound crew had the added advantage of having actual 5.1 audio from the film and music stems of Howard Shore’s score delivered to them. But because the footage they used from the film in different sections of the game had to be completely re-edited to fit the game, so did the audio. And, typical of Hollywood films, the finished audio for the movie was not available at the time EA was working on the game. As a result, the soundtrack is a combination of film elements and effects created specifically for the game.
“We have this cool starting place,” Veca says. “We get to use the music, we get to use some of the sound effects, but we have to chop it up, fill in the holes and create brand-new stuff. But then the hardest part, the most important part, is making it play back correctly, implementing it correctly.
“The analogy one of the guys who works here has used is, you can give me a Stradivarius and it will sound like crap, but you can give Isaac Stern a student model and he’s gonna rip it — it’s gonna sound great! The point being, you can create the coolest samples and have the most pristine audio, but it’s not going to be worth a damn unless it’s in the game and implemented correctly.” Veca says that for this game, he had the authority and tools that he needed to place the sounds in the game as intended. “It’s a double-edged sword because, sure, we get the power and we get to make it sound way better than normal, but it’s that much more work — implementing is a whole other job.”
Games are, by definition, interactive, so just as the player controls the movements of the game’s characters, the sound must reflect the action, too. A game like The Return of the King might have a dozen different levels for a player to navigate through, with each level presenting tasks that the player must complete to move on. There are, needless to say, many sound variables that need to be programmed into the game. This requires another level of technical expertise far removed from the traditional sound designer’s role.
“Usually, there are game programmers who write the code that lets you see everything and lets everything happen when it happens,” Veca says. “Then there are programmers who write the authoring tools that let an implementer take these assets — our sounds and [film sound designer] Dave Farmer’s sounds — put them into this game format, massage them, play them back and integrate them into the game. Those are the tools people. In this case, we [the sound team] were implementers for our own audio. We took the raw assets, edited them and formatted them for the game so they’re at the right sample rate or whatever. Then we take those assets and we plug them into the tools. We add triggers that will reference these assets and play them back correctly.
“Charlie [Stockley] was in charge of all RAM-based interactive sounds, including all the characters,” Veca continues. “Every character that you see running around, he put the sound where it’s supposed to go at exactly the right volume, using tools that our audio programmer gave us to give us full control. And the cool thing about our audio programmer — Laurent Betbeder — is he’s a sound designer and he understands what we need, so we can define how we want things. A lot of it is sequencing: We’ll write a specification for the program that says, ‘Okay, they want a footstep here. It’s supposed to pick from these three samples, randomize it this much, et cetera.’ So we give it a spec and tell it when and where it goes in the game, and then, ultimately, we’ll hit the Enter key and it’ll grind away and create a big hunk of data that includes the new information in the whole game.” Veca explains that this trial-and-error process is getting faster with new technology. “In the old days, when I was a programmer, you could change some code, hit the Compile button and literally go to lunch and come back before it was finished. If you made one little mistake, you had to go through the whole thing again and wait another couple of hours. But now, like on the effects end, with Pro Tools, you do a crossfade, it isn’t quite right, you do it again and it’s simple. That improves the process dramatically.”
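The kind of trigger spec Veca describes — pick one of three footstep samples, randomize it "this much" — can be sketched in a few lines. This is a minimal illustration in Python, not EA's actual tooling; every name here (the spec fields, the `trigger_footstep` helper) is hypothetical.

```python
import random

# Hypothetical trigger spec: three candidate samples plus small
# pitch and volume jitter so repeated footsteps don't sound identical.
FOOTSTEP_SPEC = {
    "samples": ["foot_dirt_01.wav", "foot_dirt_02.wav", "foot_dirt_03.wav"],
    "pitch_variance": 0.05,   # +/- 5% playback-rate jitter
    "volume_variance": 0.10,  # +/- 10% gain jitter
}

def trigger_footstep(spec, rng=random):
    """Resolve a trigger spec into one concrete playback event."""
    sample = rng.choice(spec["samples"])
    pitch = 1.0 + rng.uniform(-spec["pitch_variance"], spec["pitch_variance"])
    volume = 1.0 + rng.uniform(-spec["volume_variance"], spec["volume_variance"])
    return {"sample": sample, "pitch": pitch, "volume": volume}
```

Each call yields a slightly different event, which is the whole point: the variation is authored once in the spec and the runtime does the randomizing.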
“The sound design is probably 20 or 30 percent of the process, and the implementation is probably a good 70 percent of what we’re doing,” adds Stockley. “It’s kind of like a pre-mix in a way. We’re adjusting all these minute elements, like how far are we going to hear it if the character walks by that fire, and how does it change as you get closer to it or farther away? Ultimately, the player is the mixer, in a way, because we don’t know how close the guy is going to walk to that fire. But we do know that the timbre is going to change along with the volume as the player walks closer or farther away.”
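The "player as mixer" idea boils down to parameters computed from distance at runtime. Here is one minimal sketch of a distance rule for Stockley's fire example — a linear rolloff for volume plus a falling low-pass cutoff so the timbre dulls with distance. The function name, the linear curve and the numbers are all illustrative assumptions; real engines use curves tuned by ear.

```python
def fire_loop_params(distance, max_distance=30.0):
    """Gain and low-pass cutoff for a looping fire sound as the player
    moves; beyond max_distance the sound is inaudible."""
    if distance >= max_distance:
        return {"gain": 0.0, "cutoff_hz": 0.0}
    gain = 1.0 - distance / max_distance       # louder when closer
    cutoff_hz = 20000.0 * gain                 # duller when farther
    return {"gain": gain, "cutoff_hz": cutoff_hz}
```

Evaluated every frame against the player's position, a rule like this is what makes the same fire asset read as near, distant or gone.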
The music, too, is spurred by specific events in the game. “There were six hours of music from the movie, and for this three-minute scene, we might want a high-energy fight thing, or for a sneaky level something completely different. We’d have to scour this score, find this mood, maybe match a key, put it together and make a new piece out of it that you then could say, ‘Now trigger this when you go into this area.’ More often than not, it is not just one piece of Howard Shore’s music. It might be a montage that stays within a particular vibe for a certain amount of time. Then you go into this other area and you find this other piece of music that’s been edited.
“In the old days,” he continues, “audio couldn’t be streamed from disk. Anything you heard was from RAM and it was very limited. So you had these tiny little loops, and we used MIDI because it used even less RAM because you’re triggering instruments. We did a lot of cool interactive things with that, but one day, somebody decided to take some of that bandwidth that the rest of the game was using for video streaming. Now, no one uses MIDI; it’s almost exclusively streaming. Now, not only do we get a stereo music track at 44.1 — still compressed, but pretty high-quality — we’re also streaming ambience, so when you go into a battle, you don’t just have the characters making sounds, you have an ambience track that somebody has created in a 5.1 studio. We even have one more stream that doubles as streamed sound effects and for VO.” (For The Return of the King game, Veca went to New Zealand for three weeks to record the principal actors for game voice-overs — a rare privilege.)
Even with sound gaining importance in video game design, there are still major decisions — and concessions — to be made at every step of the audio process. “There are trade-offs all the time,” Veca says. “There’s a limited budget for RAM; Charlie’s in charge of that. You’ve got all these characters to work with — where are we going to use a high-quality sound, where are you going to use the low-quality sound.”
Stockley adds, “I’ll look at footsteps or some low thunder rumble and I can either sample rate-convert them lower or higher or massage sounds to fit in my RAM budget. The timbre of the sound and the length are what I focus on most. If I have breaking ice or glass or something, that’s going to be a higher sample rate, and, obviously, I want to keep the highs and keep it sounding good. The big huge catapult launching with all the wood creaking and everything — I’ll want that to be special. But then a footstep or a clothing Foley can be down-sampled because you’re not going to hear it clearly anyway in a massive battle of 10,000 Orcs [the evil manufactured creatures in the LOTR trilogy].
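The arithmetic behind that RAM budgeting is simple: an uncompressed sample costs duration × sample rate × bytes per sample × channels. A small sketch (the helper name and the example figures are illustrative, not EA's actual budgets) shows why halving a footstep's sample rate halves its footprint:

```python
def sample_ram_bytes(seconds, rate_hz, bit_depth=16, channels=1):
    """Uncompressed size of a sample held in console RAM."""
    return int(seconds * rate_hz * (bit_depth // 8) * channels)

# A 0.3-second mono 16-bit footstep at two sample rates:
hi = sample_ram_bytes(0.3, 22050)   # 13,230 bytes
lo = sample_ram_bytes(0.3, 11025)   #  6,615 bytes — half the cost
```

Multiplied across hundreds of character sounds, that factor of two is exactly the room Stockley trades away on footsteps to keep the highs on breaking ice.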
“Then there were the siege towers. These things are huge pieces of wood and metal and stone and everything else, and there’s no way I could fit that kind of sound in my RAM budget — if we implemented it as a streamed sound effect, it would step over everything else and streams would be cut off everywhere — so I had to think of a way to make this thing sound huge. So I took little snippets of metal and things like that, lowered the pitch and pitched down some squeaks to become this groany, metally sound, and then I sequenced maybe 20 or 30 of those sounds so it randomly creaked between metals and little pieces of wood. Then underneath all that, I had a one- or two-second loop of a continuous rumble. I mixed all that together inside of a sequencer and came up with a big sound that didn’t take up too much [space].”
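Stockley's siege-tower trick — a short rumble loop underneath, with pitched-down snippets randomly sequenced on top — can be sketched as an event schedule. This is a speculative illustration of the approach he describes, not his actual sequencer; all names and ranges here are assumptions.

```python
import random

def siege_tower_schedule(snippets, duration_s, mean_gap_s=1.5, rng=random):
    """Randomly sequence short creak/groan snippets across a duration,
    to be layered over a continuously looping low rumble."""
    events, t = [], 0.0
    while True:
        # irregular spacing so the creaking never settles into a pattern
        t += rng.uniform(0.5 * mean_gap_s, 1.5 * mean_gap_s)
        if t >= duration_s:
            break
        events.append({
            "time": round(t, 2),
            "snippet": rng.choice(snippets),
            "pitch": rng.uniform(0.4, 0.7),  # pitched well down for size
        })
    return events
```

A handful of tiny snippets plus one short loop thus stands in for a sound that would never fit in RAM as a single long recording.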
Another factor is the format of the game: Each has its own demands, peculiarities and limitations. These days, everything is geared toward surround, and because it is the most popular format, “We do our main development on the [Sony] PlayStation 2,” Veca says. “But PlayStation doesn’t play 5.1, it plays surround. So does [Nintendo] GameCube. You can get 5.1 from the PlayStation via a DTS software solution, and some of our EA games use a software solution for DTS, which is routed out of the digital/optical output. But it takes too much CPU horsepower for most games — another trade-off. [Microsoft] Xbox does 5.1, so we use 5.1 for our interactive sound effects. But to stream six discrete channels from disk on that box is just not feasible for the LOTR games.
“We have to do a lot of planning and say, ‘If this works on this box, what about that box?’ People are expecting this on this box, and this other box doesn’t do it — what do you do? So there’s a lot of pre-production work we have to go through to figure out how we’re going to deal with this. Actually, we have fairly limited horsepower on the PlayStation. The audio hardware has been frankly lagging behind and, hopefully, the next generation will improve on that. But our biggest job is trying to fit all this cool stuff into this little tiny box and make it all sound good.”
Because of the compressed schedule required to make The Return of the King, the audio department’s production timetable was, Veca admits, “brutal. I think in the summertime, there were quite a few 90-hour weeks.” But no one seems to have any regrets. “We learn something new during every game and we get better at what we do each time out,” he says. Adds Stockley, a veteran of such game titles as Wing Commander IV and Nuclear Strike, “I’ve been here six years and this game in particular is the best game I’ve been on. The team is amazing.”
And they’ll likely all be working together again this year on the next The Lord of the Rings game. Next up is a game centered on the battle for Middle Earth, and who knows what other adventures await eager game players? Whatever emerges from this EA team — just one of dozens working all over the world — you can count on the audio being more sophisticated and more prominent with each future release.
Blair Jackson is Mix’s senior editor.
The Precious Vocals
The Return of the King audio director Don Veca on recording character voices for the game:
“I spent three weeks in New Zealand basically ‘camped out,’ waiting for the actors to be available between their movie work. I actually recorded them myself on a Mac laptop system, which worked out great. For the first Lord of the Rings game I worked on, The Two Towers, I wound up traveling to about five different studios around the world to record the actors, each with different recording studios and gear. Some of the studios and engineers were great, some were not so great. This caused consistency issues, which were very difficult to deal with when the dialog was implemented into the game. I overcame that this time by doing it all with my own portable rig. I used a hand-picked DPA 4035 headset mic for most of the recording, because most of what we recorded was loud yelling in a battle context and the 4035 is great for this [144 dB SPL]. Also, many of the actors tend to ‘act’ — move around a lot — during their delivery. In fact, Andy Serkis [Gollum] actually did the entire four-hour recording session on his hands and knees, constantly jumping around all over the booth! The headset mic totally saved us there, but the headphones were soaked after the session!
“The actors themselves were pretty fun to work with, too, especially the Hobbits: Elijah Wood [Frodo], Billy Boyd [Pippin] and Dominic Monaghan [Merry], who are all avid gamers. As it turned out, I was expecting to have to miss my daughter’s tenth birthday due to the three-week recording schedule, so the Hobbits and Andy each allowed me to record a special birthday ‘audiograph’ for her. Needless to say, she was thrilled. On top of the three weeks in New Zealand, I also had to fly to the UK for Gandalf; the rest were recorded in the U.S. Overall, it was a great experience.”