
Bikes, Harps and Yo-Yos

TEACHING ENGINEERS AND ARTISTS TO TALK TO EACH OTHER

One of the hottest fields in the research world is Computer/Human Interfacing (CHI), which has several other names, such as Tangible Interfaces, Human Factors, Design Psychology and probably many more. A mixture of psychology, physiology, industrial engineering and computer science, CHI tackles how humans interact with machines to make those interactions more efficient, more productive, less fatiguing and, from the humans’ standpoint, richer. In other words, make tools that do something useful, are fun and/or rewarding to use and can be used for a long time.

Because there are so many disparate disciplines in this field, schools don’t have an easy time figuring out how to teach it. Getting the students from all these areas to meet on common ground is a problem. For example, in the school where I teach, a relatively small liberal arts-oriented college, there is a strong engineering program, but engineers and arts and humanities students rarely cross paths and almost never collaborate.

But almost every student is into music, regardless of what he or she is studying. So a few years ago, the engineering faculty decided to face the challenge of how to get their students to work with others, and the answer was to create courses around music technology, but not what most schools consider music technology: computer composition and sound manipulation. (I already teach that.) In this program, the students would build musical instruments.

The first course, called Musical Instrument Engineering, is strictly acoustic. Students build flutes, guitars, zithers and bagpipes, and do research that requires them to measure various environmental effects on things such as trumpets and piano actions. But the course still drew students primarily from Mechanical Engineering. To enlarge the pool and make the course attractive to more students, they decided to create a course called Electronic Musical Instrument Design.

Because I’m pretty much the only faculty member who knows a lot about electronic music (I told you it was a small school), they asked me to write the course curriculum and teach it. I’ve created a lot of college courses, but, at first, this one felt quite a bit beyond me. I’m not a mechanical engineer, most of my electronics knowledge is self-taught and although I’ve played a lot of instruments, I’ve never built any. Nevertheless, I took heart in the maxim of a friend of mine, also a part-time professor: “The first time you teach a course, it owns you. The second time, it’s a draw. The third time, you own it.”

I’ve just finished teaching the course for the third time and I think I’ve got a handle on it. I can say it’s become one of the most rewarding enterprises I’ve undertaken in my teaching career despite — or because of — the fact that it’s an insane amount of work. Better yet, the students concur: Many of them tell me they’ve never put so much time into a course and they’ve never had so much fun.

The premise behind Electronic Musical Instrument Design (EMID) is straightforward: Students design and build working models of musical instruments that use electronics to produce and control the sound. The course is listed in the Music and Mechanical Engineering departments’ catalogs. But because the college doesn’t have an Audio Technology program, the number of students who can handle every aspect of the course is somewhere between few and none. Think about it: You need to know how to make objects of wood, metal or plastic that people can push, pull, swing or bang on; how to put sensors on them to measure what you’re doing; how to translate the sensor data into something a computer can understand; how to communicate it to a synthesizer of some kind; and how to set up the synthesizer to make meaningful sounds out of the data.

When students sign up for the class, I poll them on their backgrounds to make sure all of the different skill sets needed to undertake the projects are present. I ask them if they play an instrument, if they’ve ever worked with MIDI and synthesis, how they are at designing and building small electronic circuits, how their wood and metal shop skills are and what they know about acoustics and object-oriented computer programming. If a student doesn’t have some experience in at least two of these areas, he or she doesn’t make the cut.

The result is we end up with a wide cross section of majors. Mechanical engineers form the largest group, although by no means the majority, and just about everyone plays an instrument or at least has some DJ chops. I’ve had several music majors; other students have been electrical engineers, chemical engineers, computer scientists, business or economics majors, art history majors and even a few from the School of the Museum of Fine Arts in Boston, with whom we have a cross-registration agreement.

Midway through the semester, I divide the students into teams, arranged so that each group has all of the skills needed to complete the projects. Interestingly, electronic tinkering is the most poorly represented skill, perhaps because electrical engineering these days is taught largely using computer models and hands-on circuit construction isn’t as prevalent. Fortunately, my teaching assistant this semester — a graduate student in electrical engineering — had a lust for designing clever circuits using standard op amps and other components and threw himself into the task of supporting the other students.

Before we start building anything, the students get some technical and historical background. I go over the history of electronic musical instruments, from the Theremin to the latest experimental devices from places such as the Paris research center IRCAM, MIT and Stanford. I introduce them to the concept of “gesture controllers” and have them do creative exercises in thinking about a wide range of body movements, what devices might be used to measure them and what musical effect they might have.

I also bring in guest lecturers who are developers or virtuoso players of unusual electronic instruments. Among these have been Mark Goldstein, an expert on Don Buchla’s Marimba Lumina and Lightning systems; Mario DeCiutiis, who makes the KAT percussion controllers and is the drummer for Radio City Music Hall’s Christmas shows; Bruce Ronkin, one of the top Yamaha wind-controller players in the country; FutureMan, the very-far-out electronic drummer in Bela Fleck & The Flecktones; and Theresa Marrin Nakra, whose doctoral thesis at MIT involved wiring a jacket with position and muscle-tension sensors and getting Boston Pops conductor Keith Lockhart to wear it during a performance.

I give the students a thorough grounding in MIDI; in Max, the object-oriented MIDI “construction set” developed by IRCAM and now distributed by Cycling ’74; and in the subtractive synthesis and sampler modules in Propellerhead Reason software. For all of the projects, the audio chain is extremely simple: Reason feeds the audio outputs of our two desktop Macs and these go to a pair of Edirol powered monitors. (If the students want to make their own samples, we have an M-Audio USB audio interface and BIAS Peak.) Data comes into the computer using a set of voltage-to-MIDI and switch-to-MIDI converters made by the German analog synth company Doepfer, which we’ve customized to give us 16 controller channels and 64 binary note-on controls. The incoming MIDI commands go into Max, which interprets and processes them in a variety of ways, performing logical operations, filtering and transforming the data, and triggering sequences. The processed data then goes on to one or more Reason modules.
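Max itself is all patch cords and graphical objects, but the logic of a typical student patch translates easily into a few lines of ordinary code. Here is a rough Python sketch, purely illustrative (the note sequence and cutoff range are my own placeholders, not anything from the actual patches), of the kind of scaling and sequence-triggering the patches perform:

    # Illustrative sketch only: the kind of mapping a Max patch might perform.
    # Names and ranges are assumptions, not taken from the students' patches.

    NOTE_SEQUENCE = [60, 63, 65, 67, 70]   # a simple pentatonic-flavored loop
    step = 0

    def cc_to_cutoff(cc_value, low_hz=200.0, high_hz=8000.0):
        """Scale a 7-bit MIDI controller value (0-127) into a filter-cutoff range."""
        return low_hz + (cc_value / 127.0) * (high_hz - low_hz)

    def next_note():
        """Step through the stored sequence each time a trigger (note-on) arrives."""
        global step
        note = NOTE_SEQUENCE[step % len(NOTE_SEQUENCE)]
        step += 1
        return note

    if __name__ == "__main__":
        # Pretend a sensor sent controller values 0, 64 and 127,
        # followed by three switch closures (note-ons).
        for cc in (0, 64, 127):
            print(f"CC {cc:3d} -> cutoff {cc_to_cutoff(cc):7.1f} Hz")
        for _ in range(3):
            print("trigger -> play MIDI note", next_note())

In the real setup, of course, the equivalents of these two functions live inside Max objects, and the results go on to Reason rather than to print statements.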

The front end — what goes into the Doepfer devices — is the real challenge for the students. As long as something closes a contact, produces a DC voltage or changes its electrical resistance, they can use it to input data into the system. We’ve used buttons of all shapes and sizes, rotary and slide potentiometers, force-sensing resistors, strain gauges, tiny bits of piezoelectric film, infrared and ultrasonic transceivers, contact microphones, hacked wah-wah pedals, photocells, accelerometers, flex sensors, reed switches and thimbles rubbing against a washboard.
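Conceptually, what the voltage-to-MIDI boxes are doing with all of those signals is simple: sample a control voltage and quantize it into the 7-bit range a MIDI controller message can carry. A minimal sketch of the idea, assuming a 0-to-5-volt sensor range (the Doepfer hardware handles this internally, and its actual specifications may differ):

    def voltage_to_cc(volts, v_min=0.0, v_max=5.0):
        """Quantize a sensor voltage into a 7-bit MIDI controller value (0-127).
        The 0-5 V range is an assumption for illustration, not the Doepfer spec."""
        volts = max(v_min, min(v_max, volts))   # clamp out-of-range readings
        return round((volts - v_min) / (v_max - v_min) * 127)

    if __name__ == "__main__":
        for v in (0.0, 1.25, 2.5, 5.0):
            print(f"{v:4.2f} V -> CC value {voltage_to_cc(v)}")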

What do they come up with? One group — an ethnomusicologist, a violinist and a mechanical engineer — created a musical yo-yo: A $2 yo-yo was equipped with four tiny rare-earth magnets on each face. A glove was equipped with magnetic reed switches on each finger that, when they closed, triggered different samples. When the yo-yo left the hand, Max would start a sequence and an ultrasonic sensor would measure the yo-yo’s distance, closing down a lowpass filter on the sound as the yo-yo approached the floor and opening it back up again on the return. Meanwhile, flex sensors attached to the wrist and forearm controlled other filter and vibrato parameters as the player moved his arm.
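The heart of that design is the distance-to-filter mapping: the farther the yo-yo gets from the hand, the more the filter closes down. A rough sketch of such a mapping (the distance and frequency ranges here are guesses, not the group's actual numbers):

    def distance_to_cutoff(distance_cm, max_drop_cm=90.0,
                           open_hz=8000.0, closed_hz=300.0):
        """Map the ultrasonic reading (hand-to-yo-yo distance) to a lowpass cutoff.
        Near the floor -> low cutoff; back at the hand -> wide-open filter.
        All ranges are illustrative assumptions."""
        d = max(0.0, min(max_drop_cm, distance_cm))
        fraction_dropped = d / max_drop_cm
        return open_hz - fraction_dropped * (open_hz - closed_hz)

    if __name__ == "__main__":
        # Simulate a throw and return: 0 cm (in hand) -> 90 cm (near floor) -> back.
        for d in (0, 45, 90, 45, 0):
            print(f"{d:3d} cm -> cutoff {distance_to_cutoff(d):7.1f} Hz")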

Another group — a violinist, a different mechanical engineer and an optics engineer — created a laser harp. Starting with a small wooden harp, the group replaced the 12 strings with the same number of lasers and photocells so that putting one’s hands where a string used to be interrupted a beam, which sent a note-on to Reason. Buttons on the front of the harp selected the scale: diatonic, chromatic or blues. More buttons along the frame next to the strings controlled vibrato depth and note durations. Foot switches transposed the instrument up or down by an octave, and a foot pedal added an FM component to the sound, making it more bright and metallic.
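With the scale buttons in the picture, turning an interrupted beam into a pitch becomes a small lookup problem. Here is a hedged sketch; the interval tables and the choice of middle C as the root are my own assumptions, not the group's values:

    # Interval tables in semitones above the root; root = 60 (middle C) is an assumption.
    SCALES = {
        "diatonic":  [0, 2, 4, 5, 7, 9, 11],
        "chromatic": list(range(12)),
        "blues":     [0, 3, 5, 6, 7, 10],
    }

    def beam_to_note(beam_index, scale="diatonic", root=60):
        """Turn the index of an interrupted laser beam (0-11) into a MIDI note number,
        wrapping into the next octave when the beam index runs past the scale length."""
        intervals = SCALES[scale]
        octave, degree = divmod(beam_index, len(intervals))
        return root + 12 * octave + intervals[degree]

    if __name__ == "__main__":
        for scale in SCALES:
            notes = [beam_to_note(i, scale) for i in range(12)]
            print(f"{scale:9s}: {notes}")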

Then there was the über-Theremin built by a guitar-playing math major, an engineering science major and an economics major. Three infrared sensors, each with a 15-degree cone of sensitivity and each using a unique pattern of amplitude modulation so that they wouldn’t interfere with each other, were mounted on a piece of cardboard. The plan was to have the sensors somewhat overlap and read values created by the player’s hand position and translate them into Reason parameters. But the students realized that none of them had the math skills to handle the 3-D conic projections to make this work. So they separated the sensors and assigned each one to a different parameter: one controlled pitch, the second a lowpass filter and the third duration. “The only problem with this instrument,” one of the members explained, “is that you need three hands to play it.”

Two engineering students came up with the Schwinnophone: An ancient 10-speed bike was fitted with flex sensors on the front and rear brakes, the pedal crank and the derailleur, and three of the magnets from the yo-yo project were attached to the rear wheel. A reed switch was mounted on the frame so that each time a magnet came near the switch, a note-on was sent to Max. The Max patch stepped through a 5-channel sequence playing “Born to Be Wild” using Reason sampler and drum modules, with the speed of the incoming notes determining the tempo. When the rider sat down and started to crank and the rear wheel began spinning, the sequence started. As he switched gears and changed speeds, the tempo and note velocities of the sequence changed accordingly. The left-hand brake lever made a convincing whammy bar for the lead guitar. When the bike stopped, the Max patch switched to another sequence, which played a convincing rock ‘n’ roll ending.
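The tempo tracking boils down to timing the pulses from the reed switch: three magnets on the wheel means three note-ons per revolution, and the interval between them sets the sequencer's tempo. A rough sketch of that calculation (the magnet count comes from the project; the mapping of wheel revolutions to musical beats is my own assumption):

    def tempo_from_pulses(pulse_times, magnets_per_wheel=3, beats_per_revolution=1.0):
        """Estimate a tempo in BPM from reed-switch pulse timestamps (in seconds).
        Each magnet passing the switch produces one pulse; one beat per wheel
        revolution is an illustrative assumption, not the students' patch."""
        if len(pulse_times) < 2:
            return None
        intervals = [t2 - t1 for t1, t2 in zip(pulse_times, pulse_times[1:])]
        avg_interval = sum(intervals) / len(intervals)   # seconds per pulse
        seconds_per_rev = avg_interval * magnets_per_wheel
        seconds_per_beat = seconds_per_rev / beats_per_revolution
        return 60.0 / seconds_per_beat

    if __name__ == "__main__":
        # Pulses 0.2 s apart: one revolution every 0.6 s -> 100 BPM at one beat per rev.
        print(round(tempo_from_pulses([0.0, 0.2, 0.4, 0.6]), 1), "BPM")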

One of the most enjoyable parts of teaching the class was watching the teams evaluate and modify their plans while they built the projects. Besides the technical issues — perhaps a flex sensor didn’t respond if it was mounted in a certain way or the sensitivity of the ultrasonic detector wasn’t as linear as the specs said — there were musical considerations at work. They constantly asked themselves, “How playable is this? What can we do to make it more responsive, more dramatic, more fun?” Perhaps the only thing missing from the course was having the students learn how to really play the instruments they had made. As they were solving technical problems right up until final presentations, none of them had the time to practice and master their creations.

So as much as we all got out of this class, we could see that building new interfaces is only half the battle — making them into truly expressive musical instruments is a whole ‘nother beast. Next month, I’ll take you to a remarkable conference in a beautiful city where a couple hundred people were engaged in battling that beast — and many came out victorious.

Paul Lehrman can be found on a beach on Cape Cod with a pennywhistle and a ukulele — at least until the next semester starts. Details on the EMID class are at www.tufts.edu/programs/mma/emid.
