Back when I was in college, “audio production education” was something you did on your own. If you wanted to learn how records were made, you listened to them. If you were pushy and lucky, you hung around a local recording studio and watched the process, which was sometimes instructive and inspiring, and sometimes about as logical and enlightening as watching the California Election Commission design ballots.
It certainly wasn’t anything you could get in school. Heck, the idea of combining music and electronics in one’s class schedule was ludicrous, with the exception of a select few who were given the keys to the dusty, overheated, tube-filled basement closets where the rare schools with electronic music programs (open to graduate students only, of course) had their studios.
If you were a music major and wanted something as simple as, oh, having a recital recorded, you either contacted the school's A/V squad, who showed up with a $50 P.A. mic and a clunky Wollensak reel-to-reel tape machine, or you made friends with a hi-fi nut in your dorm who had a Tandberg or a Revox, and maybe a couple of RE-10s. If you were an electrical engineering major, you studied amplifier and transducer design, but if you wanted to get involved with music, your best hope was to squeeze into a couple of elementary theory courses or play 14th clarinet in the marching band. Or, of course, you could form your own rock group, which was much more fun but didn't do much for your GPA. It was the same if you studied acoustical engineering or architecture: You might learn how to design a concert hall, but you couldn't perform in it, at least not for academic credit.
And did we like it? Hell, no. It sucked. My sophomore year at a liberal arts college, I managed to hustle my way into the electronic-music lab for a couple of semesters, but then I transferred to a conservatory. Although the musical education I got there was terrific, the only things electronic in the building were the light dimmers.
I would have given anything to get into one of the numerous courses available today in recording, music production, film scoring and mixing, sound design, multimedia, music synthesis, live sound and so many other aspects of the audio world, had any of them existed. The rise of programs that relate what's in front of the mic to what's behind it is one of the best things that has happened to music education since the invention of the practice room.
Not surprisingly, probably the single biggest element in how audio production programs have changed is in the use of computers. Simply put, they’re everywhere. At the Berklee College of Music this year, every entering student is required to own an Apple Macintosh PowerBook. And computers are, of course, replacing a host of other stuff. The lab where I was teaching a few years ago had a Mac, but physically, it was built around a 32-input analog console, a couple of racks of ADATs, synth modules and processing gear — and a huge television monitor. In my new “multimedia” space, there are no racks at all, no television, and the mixing console we use is no bigger than the computer keyboard. But there is a honking Mac G4 under the desk and two plasma displays on top. That’s where the students’ attention goes.
In my old studio, only a few students could work in the lab at a time, and on only one project, whereas in my current lab, there are nine identical stations, all of which can run simultaneously and independently. Although we have a lovely pair of Event powered monitors at the front of the room, they're only used when I'm teaching the class; when the students work on their own, they wear headphones.
I asked some other audio educators I know how they are faring with the takeover of computers in their classrooms. As you might expect, they're generally in favor of the idea because of the power the newest systems have and the flexibility that they offer students.
David Moulton, a pioneer in audio education who started the program at SUNY-Fredonia and has since taught at Berklee and the University of Massachusetts Lowell, says, “The DAW has become so ubiquitous that we spend virtually all of our time teaching about it or with it. The plug-in manufacturers have developed a real fixation on modeling and mimicking discrete gear, including mics, tape, outboard pieces. Loudspeaker models are coming, too, I understand. This makes it even more powerful for teaching.”
Scott Metcalfe, chairman of the music production and technology program at the Hartt School of the University of Hartford, concurs: “The DAW has moved in as the centerpiece of the studio. There was a period when we relied on the DAW only for editing and mixing, but a few years ago, we gained enough confidence in the system to begin tracking on it, as well. It is my opinion that running a session to a DAW is a much more musical process than recording to tape.”
Michael Bierylo, associate professor of music synthesis at Berklee, says, “Most of my students spend all of their time working on a computer in one way or another. In our program, the SSL board used to be the totem or icon that symbolized success: Get this down, and you’re on track. That has changed. Everyone now believes that to get a gig, you need to be a crack Pro Tools op.
“Thankfully, this hasn’t dampened the serious student’s appetite for basic skills and in-depth knowledge. In many ways, the new tools have enormous potential for developing these. If every one of our tech majors has an Mbox, they can spend untold hours learning what a compressor does without needing to be in an expensive production studio. They can work toward mastery offline. On the other hand, they could spend eight hours doing stutter edits on a verse.”
Dan Pfeifer, an associate professor in the recording industry program at Middle Tennessee State University in Murfreesboro, agrees: “Every production and technology student wants to learn Pro Tools.” But at the same time, MTSU is careful to keep more traditional configurations available. “We are very fortunate to have three fully equipped studios: Otari, SSL and Studer digital,” he says, “as well as five labs: electronic music, hard disk, post, mastering and critical listening. We use analog, DASH and hard disk multitracks.”
As computers have brought recording and production technology down the economic ladder and made it more available to a wider population, computer-based teaching studios and labs, with their smaller footprints and lower cost, have become appropriate for a wider range of students, including those who are not necessarily concentrating on becoming producers and engineers. Where once music technology was seen as an esoteric specialty, many now consider it a necessity for anyone looking for a career in the music industry.
Jeff Wolpert, a Juno Award-winning engineer/producer who has taught for many years in the Tonmeister program at Montreal’s McGill University, and who recently joined the faculty of the more vocationally oriented Humber College in Toronto to develop a music technology program, explains his plans: “At Humber College, the proposed curriculum is entirely designed to accommodate the change in music education. It addresses the need for all working musicians to become literate and skilled with music technology. Students both young and old are requesting that we provide instruction and equipment for them to practice on.”
At Berklee, says Bierylo, “Music education courses used to be a reason for musicians to justify getting a college degree, since it was a good way to hedge one’s bets on a possible paying job. To many students today, music tech offers the same thing: the sense that one’s creative endeavors will lead to a paying gig. Many students don’t even have a clear career plan; they just feel that tech skills will be their ticket to earning a living in music.”
But what of the negatives? Perhaps the most obvious is listening to the music. When labs are small and have multiple workstations or when students are working in their dorm rooms, where do you find the space to put anything but the nearest-possible-field monitors? “The big issue with us is real estate,” says Bierylo. “Being in a very pricey urban area ties our hands in terms of the amount of space we have to build studios. For the students, that translates into the infrastructure, the type of recording and listening environments they have to work with. What good is a high-end mic pre or converter if you’re going to end up mixing on headphones?”
Another issue is that, with all of the fancy tools available to students and so much of the production process now seeming so easy, the basics could get lost. Many educators feel that those need to be stressed even more. “The front end of recording hasn’t changed much in the past few years,” says Wolpert, “in terms of microphone applications, preamps and signal processing to the recorder. Therefore, there must be a strong emphasis on the use of transducers and routing. At McGill, we have the luxury of requiring that the students spend most of their first year recording ensembles live to 2-track.”
Metcalfe relates, “One of my favorite assignments is to have students do a recording with microphones that cost less than a few hundred dollars. They can’t rely on the U87s, TLM193s, 414TLIIs, etc.; they have to understand the limitations of the microphones they’re using and use them effectively. I like to tell them that if Ansel Adams were alive today, he’d be able to take brilliant photographs with a disposable camera from the local convenience store.”
In audio education, just as in every aspect of the modern world, new technologies bring new opportunities and new problems. But it’s important to keep in mind what is supposed to be served by these technologies: the music. Metcalfe says, “You can have the knowledge to set up all of the equipment for a session, rip apart a studio and rebuild it, and make custom modifications to equipment, but without a thorough understanding of music, you won’t make good recordings. I regularly tell students that the most important class they take for enhancing their skills as producers is their [instrumental or vocal] ensemble.”
Pfeifer sees the same issue, although from a different perspective. “We are in a College of Mass Communication, not a school of music,” he says. “So musicality is often an issue with our students. All production and technology majors are required to take a semester of music theory or our Musicianship for Engineers course. In any production class, we constantly incorporate music and listening skills throughout the course, and the focus is much more on the artist, songs and music than on engineering. Now, I might sound old, but I find the musicianship of the average student has been dumbed down: Many lack any sense of melody and harmony. Bad intonation, rhythm and ensemble playing don’t seem to matter much to many of them. This is more a reflection of what they listen to than anything else.”
For better or worse, students — and how they relate the technology to their own musical ideas — will have a major role in the shape of the music industry in the years to come. Computers and music software have made the means of production available to the musicians themselves, and audio education programs have raised musicians’ level of awareness so that they can use these new tools effectively and professionally. The result is that recording engineers are no longer passive players in service to musicians; now, they are the musicians. As Wolpert puts it, “The term ‘music production’ tends to mean more and more that you are now the artist, engineer, producer and mixer. In other words: The person you are most likely to produce will be yourself.”
Paul D. Lehrman teaches, among other things, at Tufts University in Medford, Mass.