AME 196: The Making of a Showcase
Hemisphere speakers and MIDI controllers abound as students in Asst. Prof. Ming-Lun Lee’s audio and music engineering class, “Interactive Music Programming,” prepare to rehearse their final projects.
The familiar Skype ringtone became a sonic adventure when played on A.J. Brown’s guitar. He used a multi-effects pedal that doubled as an audio interface, while teammate Richard Carl manipulated the sounds with a Kaoss pad.
Tallis Polashenski and classmate Betrand Nyann paid homage to video game icon Mario with footage of Polashenski playing an outer-space-themed game, accompanied by background harmonies Polashenski wrote and by an array of video game sounds the two produced with a joystick and a Wii remote.
Brenden Sullivan put a lot of body movement into his performance, swinging his “Wiimote” like an axe one moment, then “sawing” it through the air, as he and partner Murilo Kramar duplicated the sounds of civilization intruding on the wilderness.
This is a sampling of the technologies and devices that students in AME 196: Interactive Music Programming incorporated into their showcase of final projects at Rettner Hall.
“You can see we had eight very distinctive pieces,” instructor Ming-Lun Lee commented afterwards. “I was amazed by their performances. They were very creative.”
AME 196, offered through the Hajim School’s Audio and Music Engineering program, explores digital audio synthesis and real-time interactive technologies through the study of two audio programming languages, ChucK and Pure Data.
As teaching assistant David Kunstmann told the audience at the start of the showcase, ChucK “lets you manipulate sound in almost any way you would like. You can sequence, you can sample, you can play things backwards.”
“Then eventually we get into more interactive things. Towards the end of the course students learn to send MIDI signals to and from devices or other programs besides ChucK. That’s what you’ll see a lot of today, where people are using keyboards or Leap Motion controllers, where you can use hand motions in the air to control all sorts of sounds.”
About half of the 15 students in the class this fall were from majors other than audio and music engineering, including digital media studies, music (Eastman School), computer science, art and art history, and neuroscience.
The opportunity to learn programming was a big draw for many of them.
“I wanted to get into coding and programming because I had never done that,” said Polashenski, a freshman in digital media studies. She ended up writing 900 lines of code for her project – and was inspired to sign up for a double major in computer science.
“This is very exciting and very cool to be learning a coding language that is easy to use but also relevant to musicians and programmers who are interested in music,” added Benjamin Kronk, a junior at the Eastman School. Kronk wowed the showcase audience with his violin performance of the Méditation from French composer Jules Massenet’s opera Thaïs, while partner Aleem Griffiths embellished the sounds with his joystick.
“I highly value the collaborative aspect of music,” said Kronk, who is taking audio and music engineering classes as part of the Musical Arts major he is pursuing. This is an honors curriculum that enables students to craft an individualized program of study. “I really want to see technology engage with music in a way that’s easy to use.” One of his projects: Using EEG brain mapping at Rochester Institute of Technology to “create a piece of music that hopefully will engage the listener in a new and innovative way.”
Lee, a visiting assistant professor of audio and music engineering, exemplifies the interdisciplinary nature of the audio and music engineering program. He received his PhD in historical musicology and music theory from the University at Buffalo, as well as master’s degrees in electrical engineering and musicology from National Taiwan University.
In addition to his engineering qualifications, he is an active choral conductor and baritone. So alongside teaching his students about programming languages and digital audio synthesis, he incorporates lessons on music theory and gives them performance advice.
“Today it is very important that students learn something that is not in their major, and do interdisciplinary research,” Lee explained. “It was good for me to combine two different fields (electrical engineering and musicology), so, in addition to sharing my love of music with my students, I wanted to let them know you can do that.”
Lee also emphasizes the importance of teamwork and encourages students to learn as much from each other as from their instructor.
For example, for their midterm project, students used ChucK to create their own drum machines and then composed pieces on them. Lee set aside one class for a “Halloween Drum Machine Show,” playing the pieces in front of the class without identifying the composers.
“The students could see how other students had created some very different pieces from their own,” Lee said. “That’s how they started to learn from their colleagues.”
“The showcase was a good experience because I got to see the creativity of other people,” Polashenski said. “People did things that I didn’t even think you could do, incorporating their own instruments and interests, which is really awesome.
“It was also a great performance experience to mix tech and music at the same time.”
Asst. Prof. Ming-Lun Lee talks to students as they set up equipment for their Sonic Showcase in Rettner Hall.
About AME 196: Interactive Music Programming
In this course, students explore digital audio synthesis and real-time interactive technologies by studying two audio programming languages, ChucK and Pure Data. They learn to manipulate sound with MIDI controllers, laptops, mobile devices, joysticks, mice, and Wiimotes. Students give a midterm presentation demonstrating their ChucK programs, and the semester ends with an interactive performance showcase. This interdisciplinary course does not require any programming experience.