
Star Wars Episode II: Attack of the Clones

Other films may have bigger budgets, bigger stars. They may even have bigger opening weekends. But no film series has gripped the imagination of moviegoers worldwide, decade after decade, like George Lucas’s Star Wars, which accounts for four of the 13 most popular films of all time.

While the latest installment would have the attention of moviegoers were it shot on Super 16mm film, this summer’s Star Wars: Episode II Attack of the Clones will forever be known as the film, er, movie, that brought stem-to-stern digital cinema to the public’s mind. It’s not the first film to be shot on high-definition tape, not the first to feature totally digital characters, and not the first to be exhibited widely in theaters digitally. Then again, neither was Star Wars (1977) the first film mixed in Dolby Stereo or the first to use motion-control photography. It only feels that way in our collective memory.

Work started on Episode II while its predecessor (Episode I: The Phantom Menace) was still in theaters back in 1999, and it followed a similar schedule: long pre-production, followed by a three-month shoot, followed by a 20-month post-production schedule, which included planned reshoots. Back again with Lucas behind the camera were producer Rick McCallum, composer John Williams, re-recording mixer Gary Rydstrom and sound designer/picture editor Ben Burtt.

PRODUCTION

Although the previous four Star Wars films have used soundstages in England as their home base, this time McCallum and Lucas assembled their forces at the new Fox Studios Australia in Sydney.

Comments from Lucasfilm that Episode II would be shot on high-definition video notwithstanding, there were Panavision 35mm film cameras in Australia as backup. However, after much testing, it was decided before the start of principal photography in June 2000 to shoot on 24p high-definition tape utilizing Sony/Panavision CineAlta HDW-F900 cameras. And, indeed, not a frame of film was shot.

CineAlta cameras record in the 16×9 HDCAM format (1,920×1,080), although the image was cropped to a 2.40:1 widescreen ratio. The area above and below the 2.40 extraction area was available for Lucas to reframe the picture as necessary in post-production.
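
A quick back-of-the-envelope check (our arithmetic, not Lucasfilm's published numbers) shows how much reframing room that leaves: a 2.40:1 extraction from the full 1,920-pixel width is 800 lines tall, with 280 lines of vertical headroom left over.

    # Reframing headroom when a 2.40:1 area is extracted from a 1,920x1,080 raster.
    FULL_W, FULL_H = 1920, 1080
    ASPECT = 2.40

    crop_h = round(FULL_W / ASPECT)   # height of the 2.40:1 extraction area: 800 lines
    headroom = FULL_H - crop_h        # lines available for vertical reframing: 280

    print(f"extraction area: {FULL_W} x {crop_h}")
    print(f"reframing headroom: {headroom} lines")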

During shooting, they would record two tapes for each hi-def camera (and most of the time there were two), one in the camera and one in a deck. Clones were made during lunch and at the end of the day, resulting in triple redundancy. A Digital Betacam downconversion, for the picture department to load into their Avid, was made at the same time as the clone.

The disadvantage of filming high-definition, or at least the way that it was implemented on Episode II, was the amount of noise-producing equipment added to the set: plasma screen monitors, HDCAM decks, hard drives, associated video testing gear, etc. Supervising sound editor Matthew Wood says, “There was a constant drone. I brought [dialog re-recording mixer] Michael Semanick in early to see how much of the noise we could get rid of. Most of it we could, but sometimes we had to loop lines because the noise was dynamic and broadband. As the 24p technology evolves, I am sure it will become more like a broadcast HD setup, with the equipment in trucks parked outside the set.”

PRODUCTION SOUND FOR HI-DEF

Wood decided from the start that he wanted to do the whole show in 24-bit, including the dialog and sound effects units. (Episode I was mixed from 16-bit sources, except for the music.) This raised many issues, based mainly on the fact that Avids do not support 24-bit audio and that they would be shooting at the 23.98 frames per second rate. (The 23.98 rate allows for all of production and post-production to occur on the same “timebase”: A minute during shooting would be the same length as a minute during sound editing or mixing to NTSC picture.) In addition, Wood investigated file formats (Broadcast .WAV, .AIFF or Sound Designer II), the handling of file metadata and how to import the data into Pro Tools.
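
The “23.98” rate is really 24 × 1000/1001, the same pull-down that turns 30 fps into NTSC’s 29.97; because both rates are slowed by exactly the same ratio, they share one timebase. A short sketch of the arithmetic:

    from fractions import Fraction

    # "23.98" and 29.97 are both exact 1000/1001 pull-downs of integer rates.
    PULLDOWN = Fraction(1000, 1001)
    shoot_fps = Fraction(24) * PULLDOWN   # 24p HD acquisition rate
    ntsc_fps = Fraction(30) * PULLDOWN    # NTSC video rate used in post

    print(float(shoot_fps))               # 23.976023976...
    print(shoot_fps / ntsc_fps)           # exactly 4/5: 4 picture frames per 5 video frames

    # Because both rates are pulled down by the same ratio, a minute of real time
    # is a minute at both; no 0.1% speed change (and no audio sample rate
    # pull-up/pull-down) is needed between the set and the mix stage.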

After testing various 24-bit nonlinear field recorders, Wood decided on the Zaxcom Deva II, which was used in conjunction with a Zaxcom Cameo digital mixer.

Production sound was recorded by Paul “Salty” Brincat, best known to Mix readers for his work on The Thin Red Line. In order to provide a level of comfort and backup, he actually used two complete production systems: his old faithful Audio Developments 409 mixer feeding a Fostex PD-2 timecode DAT deck, plus the Zaxcom combination. The rates on both decks were 44.1 kHz, while the Deva was recording 24-bit .BWF files. The master clock for the film was provided by ILM, whose 29.97 fps nondrop timecode was used to jam both recorders and the Denecke timecode slates.

The 409 provided microphone powering and preamplification, with post-fader direct outputs on all faders feeding the eight inputs on the Cameo. This enabled Brincat to separately bus and combine microphones across the Deva’s four tracks. The 2-track mix from his 409 fed the high-def video cameras and decks and the Fostex. The Deva would download to the backup DVD-RAM during breaks, a process that Brincat says went smoothly.

Brincat notes that the noises on the set, including wind machines and mechanical devices, made it “a difficult film to record sound on. George shooting wide and tight [simultaneously] makes it almost impossible to use boom microphones. You can only do so much before getting in the way of the picture.”

His primary stage microphones were Sennheiser MKH-50s and 60s, while on exteriors he used trusty Sennheiser 416s or 816s. Working with Brincat were his boom operator, Rod Conder, and Ben Lindell, cableman.

The decision to record 44.1 kHz during production allowed Wood to import the 24-bit Broadcast .WAV files from the Deva’s DVD-RAM backup without any sample rate conversion. Brincat’s Cameo mixer would encode disk name and scene/take numbers into the metadata, and Gallery Software’s Sample Search enabled Wood to export the Broadcast .WAV files from DVD-RAM to Sound Designer II files in Pro Tools, using the metadata as file names.
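
Broadcast .WAV carries its metadata in a standardized “bext” chunk inside the file; the Description field is where recorders typically stash scene/take information. The sketch below is not Gallery’s actual tool, and exactly which fields the Cameo wrote is an assumption, but it shows how such embedded metadata can be read and turned into file names:

    import struct

    def bwf_description(path):
        """Return the 256-byte Description field of a Broadcast .WAV's bext chunk."""
        with open(path, "rb") as f:
            riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
            assert riff == b"RIFF" and wave == b"WAVE", "not a WAV file"
            while True:
                header = f.read(8)
                if len(header) < 8:
                    raise ValueError("no bext chunk found")
                chunk_id, chunk_size = struct.unpack("<4sI", header)
                if chunk_id == b"bext":
                    desc = f.read(256)  # Description field per EBU Tech 3285
                    return desc.split(b"\x00", 1)[0].decode("ascii", "replace")
                f.seek(chunk_size + (chunk_size & 1), 1)  # chunks are word-aligned

    # e.g., name an exported file after its embedded scene/take metadata:
    # print(bwf_description("deva_backup/T042.wav"))   # hypothetical path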

Because picture assistants used the same nomenclature for takes in the Avid, when Wood received EDLs from the picture department, it was easy to link the 16-bit file digitized into the Avid from the Digital Betacam downconversion and the 24-bit file in Pro Tools. Wood and first assistant sound editor Coya Elliott created a FileMaker Pro database to link the two. First they gearboxed the Avid files into Pro Tools (from the 16/47.952 standard to 24/44.1), phased them, and then trimmed and AudioSuite-duplicated the Deva files in Pro Tools so that they were the same length. When the sound department would receive OMF compositions from the picture department, Wood and Elliott used Gallery’s Session Browser to re-link, using the FileMaker database. ADR recordings underwent the same treatment, so that any editing by Burtt in the Avid could be traced back to the 24-bit Pro Tools files.
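
The matching logic is simple in principle. Here is a minimal sketch of the idea (the actual FileMaker/Session Browser workflow was considerably more involved), pairing Avid clip names with their 24-bit Deva masters by shared take nomenclature:

    import os

    def build_link_table(avid_clip_names, deva_dir):
        """Pair Avid clips with 24-bit Deva files that share take nomenclature."""
        deva_files = {os.path.splitext(name)[0].upper(): os.path.join(deva_dir, name)
                      for name in os.listdir(deva_dir)}
        table = {}
        for clip in avid_clip_names:
            # e.g. "27A-3" links the 16-bit Avid media to its 24-bit master;
            # None flags a take that must be relinked by hand.
            table[clip] = deva_files.get(clip.strip().upper())
        return table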

PICTURE EDITING

Ben Burtt’s assistant picture editor Todd Busch went back to Skywalker Ranch from Sydney in September 2000, while production continued in Italy, Spain, England and Tunisia. Joining Busch on the crew were Jett Sally, who had worked on Episode I, and Cheryl Nardi, who had met Burtt on the Young Indiana Jones Chronicles. The Episode II Avid Film Composers were in the Main House at Skywalker Ranch. They shared cuts with the visual effects division, Industrial Light & Magic, by consolidating media to removable drives, or via shared drives over the Ranch T3 network.

Outside of Burtt’s obvious contribution to the Star Wars films via his sound effects, Busch notes that “it’s so hard to imagine anyone else cutting picture. George and Ben have a unique relationship. They’ve been at this for over 25 years. When they start working on a scene, Ben references his huge library of movies to assemble the various scenarios George comes up with. When I go back and look at Ben’s first pass on the speeder chase or the platform fight or the arena battle, it’s like getting a little lesson in film history.” (Lucas first used this technique 26 years ago when making what we now refer to as Episode IV, cutting World War II dogfight footage for the big aerial battle over the Death Star.)

The first big screening of the film, with the production track plus temp music and a few rough sound effects, was in March 2001 and marked the turnover of picture to ILM in anticipation of first reshoots in April. As cut reels were received, the ILM editors would decompose the reels and then make an EDL for each track, which was then imported into a FileMaker Pro database that produced a master list of all necessary plates. ILM would then digitize the shots, with eight-frame handles, from the master HDCAM tapes. (In the process, according to Lucasfilm technical director Mike Blanchard, they were going from the 8-bit YUV HDCAM color space to the RGB 10-bit color space preferred by ILM.) These resulting files took up approximately 7.9 megabytes/frame on the ILM server. Every sequence in the movie was given a three-character letter code (which has been standard on all Star Wars films), and each shot in a sequence was assigned a three-digit number. ILM added a five-digit “PTS” (Production Tracking System) number that was unique to every shot in the movie.
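
That 7.9-megabyte figure checks out if the 10-bit RGB frames were packed DPX-style, three 10-bit samples to one 32-bit word per pixel, which is an assumption on our part:

    # Storage per 1,920x1,080 frame with 10-bit RGB padded to 32 bits per pixel.
    W, H = 1920, 1080
    bytes_per_pixel = 4                    # 3 x 10-bit samples in a 32-bit word
    frame_bytes = W * H * bytes_per_pixel  # 8,294,400 bytes
    print(frame_bytes / 2**20)             # ~7.9 MiB, matching the quoted figure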

There was one area where the new technology had little effect: picture changes. Burtt says that there were just as many changes as in the past: “George has never limited himself to saying ‘the picture’s locked.’ We had to lock the picture weeks earlier in order to have the worldwide release in every language. The time it takes to finally put it on film is a tedious process. [See “Getting the Image to Theaters,” below.] The fact that it was shot digitally affected primarily the production and exhibition process, but not editing.”

The first four reels were locked by Christmas 2001; Burtt says that recutting of them was only done for technical considerations: “a shot might not have looked good, or a special effect wasn’t working well.” Indeed, every shot in Episode II is a special effects shot, in that some amount of compositing or repositioning was done to the original production footage.

SOUND EDITING

Burtt started work in early 2000, and when he finished in mid-April 2002, he had been on the film for 26 months. In addition to picture editing and sound design, he also had his hands full directing second-unit photography. As early as the previsualization in early 2000, shooting and putting sequences together on videotape, Burtt was “always thinking of sound. There was a period of a few weeks prior to going to Sydney when I put together a library that I wanted to use. I went there with a few CDs of sounds, and even back then there were a few scenes that I cut back in Sydney to which I added music and sound effects. Being a Star Wars film, it was best to evaluate it as a movie. Sound was never out of the picture.”

Eventually, Burtt turned his attention full-time to creating the sound, though he admits that he did less sound editing personally than on previous movies, giving more latitude to his editors on the spotting of scenes. “In the past, I might have really specified to the editors each laser hit and each explosion; here, I tended to work more on giving them menus to choose what they liked from this set of materials. They would then go through the library and make choices that they would audition for me.”

Throughout his 27-year involvement with the Star Wars films, Burtt has been depicted to the public recording sound effects, from striking high-tension wires in the mid-’70s to moving an electric razor in a bowl for Episode I on the TV show 60 Minutes. He says that “those examples are harder to come by on this film because I didn’t record or create as many things that were relatively simple examples of what you can do at home in your kitchen! Much of what I made was complicated composites on the [Symbolic Sound] Kyma and on the [SampleCell] keyboard: techno-based rather than the old tabletop of sound effects devices.” (See “Ben Burtt on Sound Design,” below.)

Having said this, Burtt does note that much of the sound of the Zam speeders, in the reel 1 chase in nighttime Coruscant, was made from musical instruments, including electric guitars, cellos and violas. The infamous electric razor was also brought into play to vibrate viola, harp and bass strings. “I was thinking that it was traveling magnetically, that it was being pulled along the streets with changing magnetic fields rather than by self-propulsion.”

Because Burtt was in the “danger zone” of making tonal sound effects for the speeders, he had to be careful of the interplay with John Williams’ music. “I originally did a temp version of that mix, using nothing but musical sounds for the speeders. My thought was that the music score would be percussion-based, along with tones for the ships. I temped it that way, but John Williams didn’t quite do that, and his heavy orchestral piece necessitated rethinking the tonal aspects of the vehicles. In some cases, the musical tones that I made conflicted with the orchestra, which was a disappointment for me, because I wasn’t able to push it into a new area. My reasoning was that we’ve done an awful lot of high-energy chase scenes, and I wanted this to be offbeat and strange. But it didn’t really happen.”

During this project, Burtt went back to the original Star Wars library and redigitized some of it yet again, this time at 24-bit resolution. Although Skywalker Sound has upgraded the facility to a shared FibreChannel system in which sounds are pulled from a centralized server both in edit rooms and on mix stages, Wood and Burtt organized Episode II editorial around “sneakernet” local drives, primarily for security purposes.

PRE-DUBS

In December, re-recording mixer Semanick took a look at the initial dialog tracks on a mix stage at Skywalker with Matthew Wood and Lucas, in order to help them prepare for ADR recording. Semanick wanted to be sure to get everything that they might possibly need, just to be safe. “My philosophy is, ‘You can always throw it out,’ and George would say, ‘If you’re going to throw it out, I don’t want to do it.’ But you don’t know that until you get into it. The dialog tracks haven’t been built yet. Maybe the backgrounds or the music will mask the noise. In this day and age, it’s better to be safe.” Semanick estimates that about 45% of the dialog in Episode II is from production, perhaps an all-time high for the Star Wars series.

When it came time to start his dialog predubs in early February, Semanick was able to put Rydstrom’s background predub in the monitor as a guide to how far to go with processing.

Once a reel was premixed, Semanick made a separate pass for dialog reverb, and he would pick and choose among Lexicon 480, TC 6000 and even worldizing courtesy of speakers and mics in the basement of the Tech Building and outside on the grounds. Semanick took note of the programs and settings that he used to aid his fellow mixers in the foreign-language mixes (see “Day-Date Worldwide Release,” below). Notes were also kept for pitch changing, although that would be more language-dependent. While acknowledging that he could be “giving away secrets,” Semanick says that in the long run, “there aren’t any. You just do to taste what you think sounds great. People’s tastes are different, and if George likes it, and Ben and I like it, then it’s good. If everyone hates it, it’s bad!”

All told, he spent approximately 15 days on the dialog, recorded as five 8-tracks: one for production dialog, one for ADR, one for creatures, one for loop group and another (split as separate 5-channel and 3-channel [LCR] groups) for the reverb returns to allow for flexibility during finals.

Rydstrom split his sound effects among ten 8-channel premixes, plus two Foley premixes. While the first seven channels followed his normal layout for Dolby Surround EX mixes (L/C/R/LFE/LS/CS/RS), the eighth channel was a recent development that Rydstrom made with Skywalker engineer Jerry Steckling. Steckling designed a matrix to pick off frequencies sent to the three surround channels below a certain point and record them on a separate boom channel during premixing. Rydstrom says: “By keeping the low frequencies from hitting the surrounds, this made for a cleaner surround track. We could regain some of the low end [that would be lost with small surround speakers]. The big benefit is that it keeps surrounds from overloading, from reproducing information that they can’t handle. For a film like Star Wars with a lot of spaceship-bys, it’s good to have something doing that automatically for you.” During final mixing, Rydstrom folded the eighth track into the LFE track as necessary. This surround/sub band-splitting is, of course, similar to the way DTS splits the surround (and was indeed used as far back as 1980 for select engagements of Altered States).
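
The idea is conventional bass management applied at the premix stage: filter the lows out of the three surrounds and record their sum on the spare eighth channel. Here is a sketch of the signal flow, with the crossover frequency and filter order assumed rather than taken from Steckling’s design:

    import numpy as np
    from scipy.signal import butter, sosfilt

    SR = 48_000
    CROSSOVER_HZ = 120  # assumed; a typical bass-management crossover point

    lowpass = butter(4, CROSSOVER_HZ, btype="low", fs=SR, output="sos")
    highpass = butter(4, CROSSOVER_HZ, btype="high", fs=SR, output="sos")

    def split_surrounds(ls, cs, rs):
        """High-pass the three surrounds; divert their summed lows to a boom channel."""
        boom = sosfilt(lowpass, ls + cs + rs)  # low band, recorded as channel 8
        return (sosfilt(highpass, ls), sosfilt(highpass, cs),
                sosfilt(highpass, rs), boom)

    # e.g., three seconds of placeholder surround content:
    ls = cs = rs = np.random.randn(3 * SR)
    ls_hi, cs_hi, rs_hi, boom = split_surrounds(ls, cs, rs)

During finals, the boom channel can then simply be summed into the LFE, as Rydstrom describes.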

ILM composed a high-definition version of the film off of their server to give Mix A at the Ranch the best of both worlds: nonlinear playback, but at highest resolution. The conformed picture was output from the data files to Sony HDCAM tape, which was then transferred at the Technical Building on the Ranch to the Avica MotionStore. The HD image was only available for reels 1 through 5 during the mix, because reels 6 and 7 were undergoing so many changes.

The only downside to the HD projection that Rydstrom could come up with was that it was like mixing to a beautiful answer print the whole time. “You never had the step in going from an ugly picture to a beautiful picture, where you say, ‘This sounds great!’ It was wonderful for lip sync, to see all that detail.”

FINAL MIX

The final mix of Episode II started on March 4, at which point Rydstrom and Semanick were joined at the Neve DFC by veteran L.A.-based re-recording mixer Rick Kline, who would be handling the music. This schedule was in contrast to so many movies these days, where final mixing and premixing overlap, and multiple stages are working simultaneously at the last minute. “They schedule enough time so we don’t have to do that,” says Semanick. “George locked the picture early enough so that we’re not beating our heads against the wall trying to finish it up at the last second, which I think is pretty smart if you can plan it.”

Lucas was not able to be there for the predubs and began each reel at the final mix (after a play-through with faders at zero) by soloing the dialog. He would then make ADR and reverb selects, often asking for more worldizing. “Although there are great programs available for both the 480 and the TC [6000], sometimes worldizing is more noisy and raw. A little grit never hurts,” says Semanick.

Lucas would then have Kline solo the music and would pick through it, commenting on transitions and places to drop or change cues. Effects did not, as a rule, undergo this “solo microscope,” presumably because Lucas had heard effects throughout the picture editorial and temp dub process.

Excepting one playback of the first four reels, the first screening that Lucas and the crew had of the whole movie was only days before the end of the final mix on Saturday, April 13. This and another screening that week for friends produced 12 pages of notes from Lucas that were addressed over the last days of the mix.

Many of these notes revolved around dialog intelligibility issues. Lucas had asked those attending the second screening to let him know if any dialog wasn’t clear. Semanick, being the dialog mixer, remembers with glee Lucas’ mantra during the final mix: “Everything is subservient to dialog.”

Music was recorded by Shawn Murphy at Abbey Road Studio One in London, the site of the Episode I recordings three years earlier. With 14 sessions from January 18-26, Murphy recorded to two 2-inch Dolby SR-encoded 24-tracks. The mix was done simultaneously to Pro Tools via a 2-inch 16-track, including a 5.1 main orchestra and 3-track (LCR) groups of synth, percussion and choir. Kline says that the music was “wall-to-wall-to-wall,” absent for only a few minutes and playing “full tilt” for most of the time.

As to the challenge of weaving all of this music around dialog and effects for 142 minutes, Kline says that his work was made easy not only because of the composing of John Williams, the masterful editing of his longtime associate Ken Wannberg, and the excellent recording by Shawn Murphy, but also because of Rydstrom’s deft handling of sound effects. “Gary is such an incredible mixer. He has a real sense of the music and is very tuned-in. He’s forever creating space to allow textures of the music to come through. It was such a treat to work with his and Michael’s talents. I think it came out to be a very good blend [of dialog, music and sound effects].”

Williams had composed the film to the edit as of last December, and as a result extensive editing was required to conform the tracks for the final mix. The smallest number of fade files in Wannberg’s Pro Tools sessions for a reel was 7,000; most reels had from 12,000 to 14,000. Kline remembers assistant music editor Steve Galloway asking him for a heads-up before reel changes, since sessions sometimes took 20 minutes to open!

AND THEN THERE’S REEL 6

Rydstrom says that Episode II was organized so well in post-production that, by the time they got into final mixing in March, most reels were “almost 100 percent complete and never changed.” The one exception was reel 6, which was 1,812 feet and 6 frames full of some of the most intense and busy action sequences ever put on the screen. Reel 6 is traditionally the big action reel in the Star Wars series, but Rydstrom says that this one was so big “it was as if you had taken the previous four Star Wars movies and projected them on top of each other. I would be crawling through the film frame-by-frame and asking, ‘Which laser did you mean this one for, Terry, the 15th on the left of this frame?’”
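
At the standard 16 frames per 35mm foot and 24 frames per second, that footage count works out to just over 20 minutes of screen time:

    # Reel 6's stated length converted to running time.
    feet, frames = 1812, 6
    total_frames = feet * 16 + frames    # 28,998 frames
    seconds = total_frames / 24          # 1,208.25 seconds
    print(f"{int(seconds // 60)} min {seconds % 60:.2f} s")   # 20 min 8.25 s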

Todd Busch remembers that reel 6 didn’t exist in anything resembling its current form before last July. “The original script was vague about what occurs outside the arena. In May and June [2001], George put the art department to work, and in July the Clone War got fleshed out with animatics.”

Although the monster fight in the arena was originally scored, music was dropped at the final, which meant Burtt had to rethink a three-minute sequence because they had premixed it “in context,” against the music. “Music had been end-to-end in the reel, but we thought that it wore the audience out too quickly,” says Burtt. “So we dropped a couple of cues, which in the end was better dramatically, although I had to come up with a whole different approach to the cutting. Rather than a supporting role, it was the only thing happening. It was quite satisfying because it was a fun challenge to see how quickly I could come up with a couple of new concepts.”

In spite of the obvious opportunities to be loud, Lucas and his sound crew have kept Episode II mercifully free of uncomfortably loud moments. Semanick says that, even though in the old days 70mm 6-track mag prints could be very loud, “When digital sound first came out, when everyone first got the chance to use it, it was like going trick-or-treating on Halloween for the first time. You ate all the candy. It was good the first time, but you got a stomach ache. As you got older, you saved some for later. It isn’t so good to be loud from beginning to end. You have to build in dynamics, like a symphony. It’s been a learning process for us all.”

Kline says, “Michael and Gary were both very sensitive to loudness. I really appreciated that they were not out to kill the audience. They were very aware that something doesn’t necessarily have to be loud to be effective.” During the printmastering, all stems were, as a rule, played straight across, although during the reel 6 battle Semanick did pull the stems down a little bit.

Most of us mere mortals first associated the words “digital” and “filmmaking” when we heard Francis Coppola’s predictions during the 1978 Oscar ceremony. We had only a vague idea of what he was talking about, but it sure sounded good.

When he finished work on Episode VI in 1983, George Lucas said that he wouldn’t be doing another Star Wars movie until digital technology had matured. Visual effects at that point were completely dependent on elaborate film opticals that were extraordinarily expensive and cumbersome. Use of digital sound in film sound editing and re-recording, except for processing and sampling, was not really on the radar.

Ten years later, as he was helping his friend Steven Spielberg finish Jurassic Park, he realized that digital effects were able to blend seamlessly with humans. The time had come.

With Episode II, nine years later, picture and sound exist entirely on hard drives, and the imaginations of the sound and visual artists under Lucas’ command know very few boundaries. Undoubtedly this will expand in the next three years as Lucas closes the door on the world of Star Wars, as he promises he will in 2005. While he’s been true to his word in the past, we can only hope that he’ll renege this time, and that he’ll come back fresh, in 2015, with another decade of innovations behind him.

Larry Blake is a sound editor/re-recording mixer who lives in New Orleans. He is currently working on Solaris, which will be his first opportunity to put sound where there is none: in space.

sidebar: BEN BURTT ON SOUND DESIGN FOR EPISODE II

“I call Matt Wood the ‘digital architect,’ and he only reluctantly takes on that term privately. I rely on him to keep me up with the technological present or future. [When we were starting work on Episode II], we were unable to get the support from the [New England Digital] Synclavier that we wanted, and the files did not interface comfortably with the rest of our system. Matt wanted me to get off of it and ‘update myself’ to SampleCell. I could essentially do the same things I did with the Synclavier, but simpler. This was especially true for taking sounds from Pro Tools into SampleCell and then back into the Pro Tools session. We used to have to digitize them into the NED format, and if I made a sequence or made loops, we had to use S-Link to translate over and batch-digitize.

“The Synclavier was a performance-based instrument: I would put samples on the keys and then play with it. Coming from an older sound design and technical school, I don’t like to think out ahead of time that I want a sound to have this amount of delay, in that kind of an echo chamber. I just want to touch something, hear it, and react. A large part of the sound design job is making the right choice of a sound, and not really your technical knowledge. I like accidents and spontaneity, and I pick takes out of my performances that often lead to new ideas that I wouldn’t have been able to objectively reason out ahead of time. It’s a very subjective process for me.

“I may have a sound I recorded that I need to digitize from an original DAT. I may want to make samples out of pieces of it and play with it. Try it on the keyboard in different pitches, chop it up with the modulator on the keyboard and listen to it. Try it with different plug-in settings that I’ve made in Pro Tools. I don’t want to stop and think about how I’m going to do it; I just want to be able to synthesize with it as spontaneously as I can. To me, that’s the most direct and satisfying creative process. Out of those experiments or performances, I can select what’s good.

“Often, I’ll start out in the morning intending to make the sound of a certain vehicle pass-by in the film. As I experiment, I’ll come up with different sounds that I realize will work for something completely different: a door that I need in reel 11, say.

“From a strict library standpoint, I entered about 600 sounds in the library for this film. At the end of the film I make sure that everything that I’ve made, even outtakes, is given a name and label, so on the next Star Wars film I can access a database of everything that was done. That’s where I’ll pick up and start on the next one.”

sidebar: GETTING THE IMAGE TO THEATERS

Standard film printing procedure starts with the edited original camera negative, from which a handful of interpositives are made. These IPs are then used to produce multiple internegatives to make the thousands of release prints needed to cover worldwide release. Although it would have been much easier for Lucasfilm to film out, from data, one IP that would be the master from which INs and release prints would be made, Lucasfilm decided to go an extra step and film out six “original negatives” from data for the U.S. print order.

Because of the long time it takes to film out a 2,000-foot reel (60 hours on ILM’s ArriLaser Recorders), work started on this as soon as reels were locked in mid-February. A seventh negative was used as a source for interpositives, from which the international internegatives would be made, putting in subtitles as necessary. ILM also made 22 versions of the signature Star Wars title scroll.
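
Those figures imply a sobering film-out speed, nearly seven seconds per frame:

    # Film-out speed implied by the quoted figures (16 frames per 35mm foot).
    reel_frames = 2000 * 16              # 32,000 frames in a 2,000-foot reel
    seconds = 60 * 3600                  # 60 hours
    print(seconds / reel_frames)         # 6.75 seconds per frame on the ArriLaser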

Lucasfilm technical director Mike Blanchard says, “Almost all of the resolution that’s lost is through the printing process. It’s really funny about technology and the film business right now. People get caught up in these numbers games that are flat-out ridiculous. They say, ‘Film is 4k,’ but it’s not 4k. It’s 4k on the camera negative, but no one has ever seen a camera negative projected. Countless studies have shown that what is shown in U.S. theaters [via the interpositive/internegative photochemical printing process] is between 700 and 800 lines of resolution when you get to the release print. We get that easily.

“And with digital picture [on a server], we have random access, we can go slow or frame-by-frame; the picture looks so much better than film. It’s made it hard for us to [color] time the film portion because you never get neutral prints; it’s either a point green or a point dark, and is totally up to the alchemy of the photochemical process.

“For George, it’s always been about what people would see. You can shoot with a film camera or a digital camera (that’s just another choice that is out there for people), but you can’t argue about the digital projection part. People would be getting a better experience at the end of the day. We didn’t have to film out each reel as a negative (it wreaks havoc on our schedule, and we have to work a little bit harder), but since we can’t get digital projection in every theater, this is something that we can do to make our prints just a little bit better.”

Blanchard notes how smoothly the digital timing of the film went at ILM, which took place in Theater C, with Natasha Leonnet at the controls of the Pandora Pogle. “George would say, ‘That face is a little red.’ Natasha would pull up the matte, take a bit of the red out and say, ‘Do you like this better?’ and we’d move on to the next shot. You could time the film in the course of five days, easily. But doing the film side has been a really nasty process for all of us. We see how George wants the movie to look, and it would look that way in every theater if we were projecting digitally.”

Busch says that assistant editor Jett Sally would go through a downconversion (to an Avid 14:1 Meridien file) of every frame after final color timing, as a final check; ILM did the same on the full-resolution DPX files. This check was invaluable, and some small mistakes were caught. “You no longer have a negative cutter to rely on,” Busch says. “The final version of the movie is coming out of the cutting room. I don’t know how these new procedures will shake out in the end.” – Larry Blake

sidebar: DAY-DATE WORLDWIDE RELEASE

Publicity, piracy and cash flow have all factored into the recent trend of releasing films overseas on the same date as the U.S./Canada premiere. Lucasfilm planned to release Episode II simultaneously in 15 dubbing territories on May 16, with another three to follow soon after. In addition to the standard FIGS (French, Italian, German and Castilian Spanish), Episode II was dubbed in French Canadian, Latin Spanish, Catalan, Thai, Cantonese, Hindi, Hungarian, Russian, Czech, Slovak, Polish, Turkish, Japanese and Portuguese (for Brazil; Portugal is subtitled). Worldwide release will entail going into another 30 countries with either English-track-only or subtitled prints.

This “day-date” worldwide release requires a great degree of coordination during post-production to get foreign-language dubbed versions translated, adapted, cast, recorded and mixed. Having locked reels starting in January, Episode II post-production supervisor Jamie Forester was able to have Masterwords in Santa Monica create the dialog list, which is the transcription of the original dialog, with definitions to clarify idiomatic expressions, many unique to the world of Star Wars. These were then passed on to the individual territories for the translation and adaptation process, where words are tweaked to fit mouth movements more closely.

Dubbing (this term is used overseas for voice recording only, and not as a synonym for mixing) began in March. Initial tweaking of sync was done by sound editors in each territory, to take advantage of their “native tongue” ears. Consultants followed their version from voice recording to the mix at Skywalker Ranch, because “you might get into a case where you want to edit for lip sync, but you don’t know where you can cut into a word because it might change the meaning,” Forester explains.

The M&E mix was done during the last days of the final mix, April 11-13, as reels were being finished. Gary Rizzo of Skywalker worked closely with Dennis Ricotta of Twentieth Century Fox in creating the M&E and the crucial optional track that allows for selective material, such as vocal grunts, to be used (or not), depending on the dubbing.

On Monday, April 15, mixing began at Skywalker Ranch on the four principal foreign-language versions of Episode II: Parisian French, Castilian Spanish, German and Italian. In order to finish all 18 dubbed versions by May 2, five stages at the Ranch were running simultaneously. The schedule allowed for three days of a dialog premix, plus one day of printmastering. On the fourth night, the Dolby MO discs were played back and verified before being sent to L.A. for optical transfer. Optical negatives for the foreign-language versions were made at NT Audio in Santa Monica, while all the English-language negatives were shot at Walt Disney Studios in Burbank.

Digital cinema dubbed versions were expected to be shown in FIGS and Japan. There are six subtitled digital versions in addition to FIGS.

In addition to re-recording mixer Michael Semanick’s extensive notes on processing of voices, the foreign mixes were also aided by standardization in the track layout. Character track placement in the Pro Tools sessions was redone at the Ranch to ensure that Anakin would always appear on track 1, Padmé on 2, etc. This would allow the mixers, who would be doing up to three versions each, to re-use console automation such as send levels. The five mixers of the foreign-language versions were Tom Myers, Gary Rizzo, Lora Hirschberg, Jurgen Scharpf and Kent Sparling. – Larry Blake

sidebar: DIGITAL CINEMA RELEASE

About a month after its U.S. premiere in May 1999, Episode I was exhibited digitally in a handful of theaters in Los Angeles and New York. It had been shot on film, scanned in as digital 2k files, and then filmed out on a shot-by-shot basis, with the negative cut in the standard fashion. This procedure required a telecine of a timed interpositive (as is standard for home video release) to high-definition videotape prior to compressing the image onto the hard drives in the theaters. The digital path from camera to theater was more direct, of course, on Episode II, with much of the work handled in-house at ILM.

Mike Blanchard, Lucasfilm technical director, says, “Because we were in this other domain, and there was no HD editing up here, if we wanted to conform the picture, we had to find a way to do it. ILM’s video engineering group, under the direction of Fred Meyers, built a conform system that would use the Avid EDLs as a guide to take the files off the server in editorial order.” This would be necessary to create all masters, be they 35mm anamorphic negatives for release printing, or source elements for digital cinema compression. (The reduction for the 142-minute Episode II is from approximately 1.6 terabytes of digital files on ILM’s server to 60 gigs on the digital cinema server.)
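
Those figures are self-consistent and imply a healthy average bit rate for the compressed version (decimal terabytes and gigabytes assumed):

    # Implied per-frame size and average bit rate of the digital cinema version.
    minutes = 142
    frames = minutes * 60 * 24            # 204,480 frames at 24 fps
    source_bytes = 1.6e12                 # ~1.6 TB of conformed files at ILM
    dcinema_bytes = 60e9                  # ~60 GB on the theater server

    print(source_bytes / frames / 1e6)    # ~7.8 MB/frame, near the 7.9 quoted earlier
    print(source_bytes / dcinema_bytes)   # ~27:1 overall reduction
    print(dcinema_bytes * 8 / (minutes * 60) / 1e6)   # ~56 Mbit/s average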

As of press time, Lucasfilm hopes to get as many as 115 digital theaters worldwide for the release of Episode II, with 70 theaters in North America, 26 in Europe and 12 in Asia (mostly Japan). All of the major digital cinema servers are represented: EVS (used exclusively in Europe), Q-Bit, Qualcomm (via Technicolor Digital Cinema), and Boeing Digital Cinema, which primarily uses Avica servers. This was Avica’s third “appearance” in Episode II, the first two being on the set, as still-store frame reference, and at Skywalker Sound for HD playback.

Many of the Boeing installations are delivered via an encrypted satellite link to theaters, where an Avica server stores and embeds the audio as part of the MPEG-2 standard, as opposed to the two separate sets of files used by other servers. (The MPEG encoding is similar to HDTV, but at a much higher bit rate.) Because of this, many of the Boeing installations will have Dolby AC-3 encoded 5.1 Surround EX mixes (at the DVD rate of 448 kbits/sec), as opposed to the 24-bit/48kHz linear digital audio that has been standard. Blanchard notes that this will undoubtedly change in the future, “as soon as the remaining coding issues are solved.”
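
The bandwidth gap between the two audio formats is considerable; assuming six full-bandwidth coded channels, linear PCM needs roughly 15 times the data rate:

    # Linear 24-bit/48kHz 5.1 versus Dolby Digital at the DVD rate.
    channels, sample_rate, bit_depth = 6, 48_000, 24
    linear_bps = channels * sample_rate * bit_depth   # 6,912,000 bit/s
    ac3_bps = 448_000
    print(linear_bps / 1e6, "Mbit/s linear vs", ac3_bps / 1e3, "kbit/s AC-3")
    print(round(linear_bps / ac3_bps), ": 1 data reduction")   # ~15:1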

In all instances, worldwide, the projectors are the Texas Instruments DLP format, manufactured either by Barco, Christie or Digital Projection. – Larry Blake
