The familiar wooden roller coaster still has its enthusiasts, but for many of today’s thrill junkies, wind on the face and the clatter of a car on the tracks just isn’t enough. Today’s theme park attractions and “location-based entertainment” are increasingly high-tech affairs, marshaling sophisticated motion, visual and audio systems into a coordinated assault on the riders’ senses. Because sound is a crucial component of the overall experience, we decided to take a look at the art of “special venue audio” through the lens of two high-profile projects.
The audio tracks for Sahara SpeedWorld in the Sahara Hotel & Casino in Las Vegas were created by Dimension Audio of North Hollywood, Calif., to convince players that they are speeding along a racetrack at 220 miles per hour. Meanwhile, “The Amazing Adventures of Spider-Man” is set to open this summer on Marvel Super Hero Island in the new theme park, Universal Studios Islands of Adventure, at Universal Studios Escape in Orlando, Fla. Audio for the attraction was handled by Soundelux Showorks of Orlando. Each setting posed its own challenges in sound design and delivery; read on to see how each production team approached delivering maximum thrill-power to the “guests” of their respective attractions.
THE ILLUSION OF SPEED: VIRTUAL RACING AT SAHARA SPEEDWORLD

How do you make people feel as if they are hurtling down a racetrack at 220 miles per hour when they are actually sitting in a darkened room? And how do you create the illusion that they are not simply alone on the track, but racing against 23 other drivers? At Sahara SpeedWorld in Las Vegas, the answer lies in replacing engine horsepower with computing power, surrounding the driver with an exhaustive collection of real sounds from the track, and continuously updating the sound field based on the positions of other drivers in the virtual race. When he tackled the job for Dimension Audio, sound designer Alan Howarth found out that this is much easier said than done. “It’s one thing to create realistic playback for picture,” Howarth says, “but when you add the fact that you need real-time audio response to the interaction of 24 cars, the complexity goes up enormously.”
The Sahara SpeedWorld attraction, created by Illusion Inc., is designed to replicate as closely as possible the feeling of being in the cockpit of a real race car on the track. To participate in a virtual race, a driver climbs into a 3/4-size replica designed to emulate Indianapolis 500 cars.
Each of the 24 cars lives in its own stall with its own video projection, lighting and 16-channel audio playback array, not to mention vents in the dash blowing air in the driver’s face. “Each vehicle uses five PCs, off-the-shelf 200MHz Pentium II machines, that create the illusion of racing,” Howarth says. “Three are dedicated to animated 3-D graphics, driving three video projectors aimed at a 133-degree screen that extends into your peripheral vision. One PC is dedicated to operating the six-piston hydraulic motion platform, which translates racing data into realistic G forces. And one PC is dedicated to interpreting the racing data to control audio playback.”
Dimension, which specializes in audio for location-based entertainment, was hired by Illusion Inc., the vendor/installer contracted by Sahara to create the attraction. “Illusion approached Dimension because of the Taylor Array Processor System patented by Dimension’s owner, Steve Taylor,” Howarth says. In the same way that each pixel in a video display sums to make a complete image, and the number of pixels translates into resolution, the Taylor Array Processor System (TAPS) uses multiple speakers in a deformable planar array to create a 3-D sound image. “The dispersal of sound energy across multiple speaker channels gives phenomenal 3-D audio playback,” explains Howarth.
“The TAPS array is all around you when you are in the vehicle,” Howarth continues. “For each vehicle, you have eight audio channels dedicated to your own car: Two channels are front left/right, two channels are mid-car, two are rear left/right, one is dedicated specifically to the engine, and the eighth is dedicated to two ‘bass shakers,’ one in the seat and one attached to the steering wheel, giving you audio interpreted as vibration.”
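The pixel analogy suggests one way a sound might be spread across such an array. The patented TAPS algorithm itself is not public, so the following Python sketch is only a generic illustration: each speaker’s gain is weighted by its inverse distance to the virtual source, then the weights are normalized for constant power. All positions and numbers here are hypothetical.

```python
import math

def array_gains(source, speakers):
    """Distribute one sound across an array of speakers.

    Each speaker is weighted by inverse distance to the virtual
    source position, then the weights are normalized so the summed
    power stays constant (sum of squared gains == 1) as the source
    moves around the array.
    """
    weights = []
    for sx, sy in speakers:
        d = math.hypot(source[0] - sx, source[1] - sy)
        weights.append(1.0 / max(d, 0.1))  # clamp so a source sitting on a speaker doesn't blow up
    norm = math.sqrt(sum(w * w for w in weights))
    return [w / norm for w in weights]

# Hypothetical 2-D positions (in meters) for eight cockpit speakers
speakers = [(-1, 1), (1, 1), (-1, 0), (1, 0), (-1, -1), (1, -1), (0, 1.2), (0, -1.2)]
gains = array_gains((0.5, 0.5), speakers)  # a source ahead and to the right
```

As the source position slides around the cockpit, the energy shifts smoothly between neighboring speakers while overall loudness stays stable.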
The remaining eight channels, fed to speakers mounted on the wall of the bay, are dedicated to the virtual world of the driver’s surroundings, including the four nearest cars in the race. A host network constantly polls the status of each car. The speed, RPM, gear shifts, tire squeals and even collisions of the other cars are interpreted relative to the driver’s position and used to determine the sound in each player’s “immersive” (off-car) audio channels. The software for translating the racing data into audio playback control information was written for DOS by Dimension’s Carlton Blake.
“Let’s say your car is going 100 miles per hour, and another car is approaching you from the left rear at 120,” Howarth explains. “There will be a Doppler shift between the cars, as well as a real-time pan from the back left channel, across the mid-channel of the wall and out through the front channel, all with the appropriate volume envelope fading in from the distance and back out.”
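The passing-car behavior Howarth describes can be approximated with textbook physics. The Python sketch below is not Dimension’s actual code, just an illustration: it computes the Doppler pitch ratio for a car closing at 20 mph and an equal-power pan between rear and front wall channels.

```python
import math

MPH_TO_MS = 0.44704
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def doppler_ratio(closing_speed_ms):
    """Pitch ratio for a source closing on the listener at
    closing_speed_ms (pass a negative value once it has gone by
    and is receding)."""
    return SPEED_OF_SOUND / (SPEED_OF_SOUND - closing_speed_ms)

def pan_gains(position):
    """Equal-power pan between rear and front wall channels.
    position runs from 0.0 (fully rear) to 1.0 (fully front)."""
    theta = position * math.pi / 2
    return math.cos(theta), math.sin(theta)  # (rear_gain, front_gain)

closing = (120 - 100) * MPH_TO_MS        # overtaking car closes at 20 mph
approach_pitch = doppler_ratio(closing)  # above 1.0: pitch rises on approach
recede_pitch = doppler_ratio(-closing)   # below 1.0: pitch drops after it passes
```

Sweeping the pan position from 0 to 1 while ramping a distance-based volume envelope reproduces the back-to-front fly-by Howarth describes.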
To gather the raw material needed to create the composite sound field, Howarth spent a good deal of time at the track, capturing both the onboard sounds of individual cars and the sounds of one car passing another. “We strapped DAT machines on these Indy cars, and had real Indy racers drive around at 200 miles an hour, three DAT machines on each car, because the TAPS theory is to record using microphone positions that correspond with the speaker placement. So we placed the mics where there would be speakers in the vehicle: two in the dash, two on the driver, two in the engine compartment, one near the transmission and one near the exhaust manifold. We would just start all the machines, then seal them up and have the guy drive around for 45 minutes, and then we would see what we got.”
The SPL was so high in the higher gears that substantial experimentation with padding and microphone selection was needed to avoid severely overloading the DATs. “We used condensers in the cab,” Howarth says, “but ultimately, for the successful recordings in the engine compartment, we used dynamic microphones: Sennheiser 421s. There was so much voltage coming off those mics that we plugged them, without pads, directly into the line inputs of the DAT machines.”
Working at the time trials of the Disney World 200, Howarth even managed, inadvertently, to collect authentic crash sounds. “My machines were strapped onto a car that was in a wreck,” he says. “The tapes did survive, but I lost a DAT machine and a pair of Sennheisers. I was recording from the sidelines, so I did get the collisions on a pair of shotguns. But the collision sounds are actually surprisingly undramatic, because the engines shut down at impact to prevent fires.”
After the field work, Howarth massaged the raw recordings into shape for delivery, choosing takes and creating loops for continuous sounds such as engine noise. The delivery medium for the sounds on-site is RAM-based sample playback. “Given the number of sound layers that need to fire simultaneously,” he says, “playing from a hard drive was not effective.” After a survey of available sample playback units, Howarth chose the Emulator 6400 from E-mu Systems. “It has 128 megabytes of RAM for sample storage, and support for 128 voices going to 16 discrete outputs,” he explains.
Using a sample-based approach also allowed Howarth to take MIDI controller data such as amplitude and pitch and put it to work in modeling realistic simulations of dynamic events such as acceleration and deceleration. “Initially we thought of using MIDI as the interface to the samplers, because that is the standard interface,” Howarth recalls. “But we realized that MIDI was way too slow for this degree of complexity. So E-mu brought to our attention an emerging protocol called SMIDI, which is MIDI over SCSI. The data exchange is 100 times faster than MIDI. And there was at least the kernel of technology in the Emulators themselves to allow them to be controlled by this SMIDI data coming in through their SCSI ports.”
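As a rough illustration of how controller data can model acceleration, the sketch below maps engine RPM to the playback-rate ratio and semitone transposition a sampler’s pitch control typically expects. This is a simplification of what a real simulation does (engine sounds are usually also crossfaded between several loops recorded at different RPMs), and the base RPM figure is hypothetical.

```python
import math

def rpm_to_pitch_ratio(rpm, base_rpm=6000.0):
    """Playback-rate ratio for an engine loop recorded at base_rpm.
    Doubling the RPM doubles the playback rate (one octave up)."""
    return rpm / base_rpm

def ratio_to_semitones(ratio):
    """Convert a playback-rate ratio into the semitones of
    transposition a sampler's pitch control typically expects."""
    return 12.0 * math.log2(ratio)
```

Feeding the semitone value to the sampler as continuous pitch data, alongside an amplitude curve tied to throttle, is enough to make a single recorded loop track acceleration and deceleration.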
Initial mockups of the complete system revealed delays caused by the system generating too much MIDI controller data for even SMIDI to handle. But by thinning out the MIDI data, and thoroughly tweaking all aspects of the attraction to wring out bottlenecks, the team was able to get the ride up to speed in time for a $1 million race involving celebrities and real Indy drivers that was scheduled for opening day. “We were on 24/7 for the last two weeks,” Howarth says, “camping out at the Sahara.”
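Thinning controller data generally means discarding messages that arrive too soon after the last one and barely change the value. The following generic Python sketch (not Dimension’s implementation) shows the idea on a stream of control-change events:

```python
def thin_controllers(events, min_interval_ms=10, min_delta=2):
    """Thin a stream of (time_ms, controller, value) MIDI CC events.

    Drops a message when the same controller was sent very recently
    AND its value barely moved; keeps everything else, so sweeps stay
    smooth while redundant traffic is discarded.
    """
    last = {}  # controller -> (time_ms, value) of last kept message
    kept = []
    for t, cc, val in events:
        prev = last.get(cc)
        if prev is not None:
            pt, pv = prev
            if t - pt < min_interval_ms and abs(val - pv) < min_delta:
                continue  # redundant: too soon and too small a change
        last[cc] = (t, val)
        kept.append((t, cc, val))
    return kept
```

Large jumps are always kept regardless of timing, so a sudden throttle stab still gets through; only the dense, near-duplicate updates are trimmed.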
The payoff for Howarth and the rest of the team came at the end of the opening day, when they turned to the real Indy drivers for feedback on the day’s experience. “They talked to us just like I had heard them talk to their pit crews when I had been at the track,” Howarth recalls. “They talked about the way the thing handled, the acceleration, the power. They bought into the simulation so much that as far as they were concerned these were real cars. For us, that was a huge win.”
MIXING SPIDER-MAN: “ON-THE-FLY,” LITERALLY

“The Amazing Adventures of Spider-Man” is designed to draw riders into the middle of a multisensory experience as they move through the ride. In this case the experience, set in New York City, centers on a battle between the arachnid hero and evil foes who attempt to make off with the Statue of Liberty. Describing the ride, director Scott Trowbridge says, “It is one continuous four-and-a-half-minute experience. We start at the Daily Bugle building, and we get sent off into the night to help solve this crime. In the course of our journey, we are outside on the streets of New York, inside warehouses, down in the sewers, in Times Square, and flying above the city through the skyscrapers.
“We knew right off the bat that this attraction would involve an audio/acoustic experience that had never been done before,” Trowbridge continues. “It is very complex in terms of how the different pieces of the audio system all talk to each other and interact. And the structure itself is not just a single box; there is actually a series of different acoustic environments, from sewer tunnels to the open air, all of which need to be dealt with as part of an integrated system to deliver a completely seamless and believable experience to the ears of each of the guests.”
To handle the complexities of sound design, scoring and playback system design, Universal turned to Soundelux Showorks. “In most production for film and video,” Trowbridge explains, “the sound design and the hardware design are handled completely separately, because you are mixing to play in 50,000 theaters, all of which are going to have different sound systems and acoustic environments. But since we are building the environment specifically for the sound design, we can make sure that they complement each other. We went to Soundelux because we were confident that they had a very strong understanding of the hardware design working hand-in-hand with the sound design and music composition.”
The Spider-Man ride has been in planning, construction and production for four-and-a-half years. Soundelux was hired about two years ago. “We brought them in early on,” Trowbridge says, “to help us during the planning stages in thinking about the timing and overall pacing of the experience. We are dealing with some very complex timing issues involving movement of a ride vehicle and synchronization of lots of film elements, special effects and other large, moving objects. We had Soundelux develop a scratch track, or ‘animatic’ as we call it, which is a very rough blocking out of what the attraction would be. It’s almost like a radio play that we could use as a timing reference. So from the time we started getting serious about production, audio was the master to which everything else was slaved.”
Primary responsibility at Soundelux fell to sound designer, composer and mixer Pete Lehman, who worked with senior project manager Scott Mosteller to meet Universal’s requirements. “At first,” says Lehman, “Universal just asked us to help mock up a couple of scenes to see if their ideas were going to work the way they thought. They delivered us picture for the scenes, and I designed the sound, which included dialog and effects, not much music. Then we went into a warehouse they had set up with a couple of screens and a prototype of the ride vehicle. We brought in Tascam DA-88s and a Yamaha 02R and a bunch of prepared material, and we sat in the vehicle to mix sound for these scenes. We tested a lot of different things both creatively and in terms of hardware, such as the actual speaker components and the placement of the playback system. We were also testing 3-D imaging of audio: how to get things to image as if they are onscreen instead of in the car. It’s hard to do much with processing because the vehicle holds 12 people, and that kind of processing is so sweet-spot dependent.”
This proof-of-concept stage, Trowbridge says, influenced the final design of the vehicle, the sound delivery configuration and the building interior in terms of the control of crosstalk from one sound environment to another. “There are sound barriers between some of these spaces,” he says, “and even some doors that open and close specifically to isolate one space from another. It’s your adventure, so we don’t want you hearing what the cars behind or ahead of you are doing.”
The final configuration chosen for playback involves both on-vehicle and off-vehicle speaker systems. These are used to open the sound space out in the direction of action taking place around the riders or onscreen, as well as to support additional effects. “You have a playback system on the vehicle,” says Soundelux chief engineer Travis Meck, “and a playback system for each location in the venue that the vehicle moves through. So you are hearing a synchronized combination of audio in the car and audio in the venue.” The configuration of the systems varies from scene to scene, with up to 14 discrete channels of audio used in some settings.
The components and locations of the offboard system were specified and installed by Soundelux’s systems division, based on results from the proof-of-concept. “We gave them an idea of what we thought would work as far as different types of drivers and locations,” Meck says, “and then they continued the design in conjunction with Universal.” There are about 200 speaker locations throughout the building. Routing, delays, and playback channel compression and EQ are handled by a Media Matrix MainShow system, a computerized DSP-based live-sound processor.
As for the onboard speakers, Trowbridge says they are “in a known location relative to the guests’ ears, so we can create some pretty fantastic effects. We took a system from Richmond Systems and developed it further for our own onboard purposes. It does ten channels of playback with routing, mixing and delay on-the-fly.”
While the playback system was under development, Lehman was working from the script, designing sounds for additional scenes and for events that have no accompanying picture. “I would nail down the sound design,” he says, “do reviews with Scott to be sure that Universal was happy with the creative direction, and then update the scratch track, making it into more of a final show track. We worked with temp dialog, which we just recently replaced with final. And the final effects have been created on and off over the course of the last year. I am still finalizing some of that now in preparation for our final mix in January.”
One consideration Lehman faced in sound design and composition was that a real-time physical event like a ride cannot be timed as reliably as a soundtrack locked to film. “It’s not a TV show, where you press play and it runs for half an hour,” Trowbridge says. “There are times when things may go awry. The sound system and sound design have to take into consideration the possibility that the ride vehicle might pause someplace because there is a problem elsewhere in the ride, and we don’t want your experience to go dead when that happens. So we have built in some extra sound for those times, as well.”
Because the audio is being created for a specific environment, the final mix will take place on the ride itself, where the sound system has already been installed. Lehman estimates that the three-week mix will involve about 20 locations on the ride. To prepare for the final, a one-week technical mockup was held on site last fall. Universal’s safety concerns about extensive cable harnesses trailing the vehicle led Meck to investigate the possibility of a wireless mixing solution. As it happened, the Digidesign ProControl control surface was just making its debut and turned out to be the solution Meck was looking for. “It’s a mixing console,” he says, “but it still gives us the flexibility of being able to deal with elements on a track-by-track basis without worrying about predubs to external decks. If there is something Pete needs to change, we are not stuck with going back to premixes; we just change it right there as we go along.”
The ProControl is used with the Pro Tools 24 MIX Plus system. “Our mix rig is a 64-track Pro Tools system with 24 channels of I/O and major DSP power,” Lehman says, “with all the plug-ins that we can get our hands on. We use wireless Ethernet for the ProControl, so we can plop it down into the mix vehicle on a little wooden platform that we built. The Ethernet communicates commands to the Pro Tools system in a rack room on the other side of the venue. Then the actual mixed audio for the onboard sound comes back wireless for monitoring, while the offboard audio comes back to the speakers through the regular wiring patched in the machine room.”
The attraction entails a continuous series of audio events, but because the channel configuration and offboard speaker layout varies tremendously from place to place, the final product is not one long mix but rather more than a dozen shorter pieces. “We end up with literally hundreds of .WAV files,” Lehman says, “which are each loaded into several Anatek hard disk playback devices.” Timecode is not used to synchronize playback of the files because Universal needs to allow some leeway for variations in the movements of the vehicles and the attraction’s physical objects, many of which are huge and cumbersome. Instead, a sensor-based ride control system relays the location of these vehicles and objects to an overall control system, which in turn uses trigger devices to play back audio, lights and effects.
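This trigger-driven approach can be sketched as a simple dispatch table mapping sensor IDs to cues, so playback follows the vehicle’s actual position rather than a fixed clock. The sensor names and cue strings below are hypothetical, for illustration only:

```python
class TriggerController:
    """Map ride sensor IDs to cue callbacks, so audio, lights and
    effects fire when the vehicle actually arrives at a location,
    not at a predetermined timecode."""

    def __init__(self):
        self.cues = {}

    def bind(self, sensor_id, action):
        """Register a zero-argument callable to run when the sensor trips."""
        self.cues.setdefault(sensor_id, []).append(action)

    def sensor_tripped(self, sensor_id):
        """Fire every cue bound to this sensor; return their results."""
        return [action() for action in self.cues.get(sensor_id, [])]

ride = TriggerController()
ride.bind("sewer_entry", lambda: "play sewer_scene.wav")
ride.bind("sewer_entry", lambda: "close isolation door")
```

Because cues fire on arrival, a vehicle that pauses or drifts a few seconds off pace simply trips its sensors later, and everything downstream still lines up.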
“As with any attraction like this,” Trowbridge says, “the audio and music are really the glue that binds all the different elements together and smooths over the seams to create a complete experience. Once our work is done, we’ll open the doors, bring in some test audiences and make sure the attraction is delivering the kind of experience we want to deliver. Then we all walk away and go to Tahiti.”