You Gotta Keep 'Em Separated

At some point, the comparisons between films and games are inevitable. I remember my first exposure. It was a Variety ad for the James Bond film Tomorrow Never Dies in 1997, featuring a huge headline that read: “Worldwide Box Office: $175 million, Worldwide Game Sales: $350 million.” Of course, since then every newspaper or magazine article about games mentions that the industry exceeds Hollywood in total revenue, as if evidence that new media has eclipsed old. And all the stories in the trades on game audio production seem to chart its rising sophistication, as if it will someday “grow up” and catch up with Hollywood techniques.

To be sure, film sound and game sound have their similarities. Both disciplines seek to capture the highest-quality sound at the root, often recording multichannel 24-bit in the field or on the stage. Both deal with dialog, music and effects in support of storytelling. And both focus an enormous amount of time and creativity on the edit process, the early decision-making stage of post-production. But once the sounds are ready for the final mix, the worlds diverge dramatically, mainly because in game sound, there never is a “final” mix. As almost everyone associated with games will tell you, “The player is the mixer.”

Imagine a film that contained more than 200,000 lines of dialog, as a recent “shooter” game was rumored to have. Or one in which each bodyfall in a fight scene had eight different thumps, depending on the “health” of the character. Or in which the score had to change seamlessly depending on which door the main character opened in an octagonal room. The sheer amount of file management in a game project is astounding; the number of options to be considered, mind-boggling. This feeling of infinite possibility within a final mix demands a whole new mindset from the nonlinear sound designer/editor/mixer: the interactive audio engineer.

Today we have noted film composers like John Debney scoring the PlayStation game God of War II, and creative film sound designers like Charles Maynes lending their effects expertise to game titles. Soundelux's Scott Gershin, subject of this month's Q&A, has bridged the two worlds for nigh on two decades now, coming into game sound in the 8-bit, 11kHz MIDI age and now spearheading a group within the Game Audio Network Guild (www.audiogang.org) to develop technology standards and advocate for quality sound.

Don't forget, film has an 80-year history of technologies and techniques. Film has standards for playback and common formats. Videogame production is still in its adolescence, working out the means to put tools (specifically, middleware) in the hands of audio people and ever-more power into the game engines themselves. Large facilities are being built to service game sound, and orchestras are being recorded at Abbey Road for major titles.

Perhaps most important, there are upward of 160 million game consoles throughout the world, and research has suggested that 40 percent of those are hooked up to surround playback systems. Is there something there for the record industry to take a look at?