It is hard to believe that nearly two weeks have passed since my first day at CCRMA! Quite a lot has happened, and I will try to cover the highlights in this post. In short, I am incredibly happy to be part of this amazing program and to be surrounded by so many like-minded individuals. I've been extremely busy, but in all the right ways.
This quarter, I am enrolled in five classes:
- Music 250a: Physical Interaction Design for Music
- Music 220a: Fundamentals of Computer Generated Sound
- Music 192a: Foundations of Sound Recording Technology
- Music 320: Introduction to Digital Audio Signal Processing
- Music 201: CCRMA Colloquium
Music 250a - Physical Interaction Design for Music is my first class every Monday morning. If you're curious about the meaning of "physical interaction design," the course website has a great overview:
In recent years, technologies for synthesizing, processing and controlling sound, as well as those for embedded computing, sensing and inter-device communication have become independently mature. This course explores how we can physically interact with electronic sounds in real time. A series of exercises introduces sensors, circuits, microcontrollers, communication and sound synthesis. We discuss critically what the merging of these technologies means for music and art. Along with new technologies, what new music practices or art forms may emerge?
In the broader sense, this course deals with interaction design: What happens when human behaviours meet those of machines? How do the devices we use determine the style of interaction? How do we design for the limitations of human performance and the affordances of machines?
The first half of the course consists of labs that introduce electronics and sound programming, along with homework assignments that encourage exploration of higher-level design questions like "what is an expressive action?" and "what modes of feedback are important for effective interaction?" Halfway through the quarter, we will draw on our new insights and technical skills to propose and execute a final design project. Specifically, the goal is to develop and build a new, sensor-enabled musical instrument that opens up interesting and unexplored means of expression.
Our first assignment was to think about the distinction between handles (analog input) and buttons (discrete input) and sketch five examples of each. A few of my favorites from my set are shown below.
[Sketch: Play button on my Akai reel-to-reel. The interesting thing about this button is that it can only be controlled in the "on" direction. Only the stop button can release it.]
[Sketch: Knob on the Line 6 POD guitar amp simulator (handle). Provides both tactile and visual feedback.]
[Sketch: Power button on Mackie HR824 studio monitors. Beautifully designed. I love these buttons.]
[Sketch: Pitch wheel on a synthesizer (handle). Provides some force feedback via the hidden spring mechanism (abstracted as a single spring in this drawing).]
We learned some great sketching techniques in class, which were very helpful in completing the assignment - especially since the last time I did any detailed sketching on paper was probably 5-6 years ago as an undergrad.
[Sketch: Guitar pick as a handle.]
Our first lab exercise involved getting familiar with Pure Data (Pd), a free, open-source relative of Max/MSP. Both tools are "graphical" programming languages, so called because programs are constructed by linking, or "patching," together objects on the screen. Interesting side note: the namesake of the Max language is Max Mathews, "grandfather of computer music" and Professor Emeritus at CCRMA. I have spent a decent amount of time with Max/MSP over the past year and a half, so it was both fun and frustrating to make the switch to Pd. Once I figured out some of the equivalent objects and tools, I had a lot of fun building my first Pd instrument (pictured below).
[Image: My Pd "patch" for 250a Lab 1. It is a simple sound sequencer/glitcher with reverb size and width control using mouse x/y coordinates.]
My second class, Music 220a - Fundamentals of Computer Generated Sound, is taught by the director of CCRMA, Dr. Chris Chafe. This class is an overview of the history and practice of computer music, with a technical component built around the ChucK programming language. ChucK was developed by one of our faculty members (Ge Wang) when he was a PhD student at Princeton and is a very neat tool for "on-the-fly audio programming." Some musicians use it for "live coding," in which the performer literally codes up a piece on stage, projecting his or her laptop screen so the audience can see the commands as they are entered. ChucK is especially good at handling concurrent processes, which is extremely important for real-time applications like live coding. I'm sure I'll have more to say on this later once I have more experience "ChucKing."
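For a taste of what that concurrency looks like, here's a tiny ChucK sketch of my own (not from the course materials, and the pitches and timings are arbitrary): each "voice" runs as its own shred that explicitly advances time, and the ChucK virtual machine keeps them all in sync.

```
// two sine oscillators patched to the speakers
SinOsc a => dac;
SinOsc b => dac;
0.3 => a.gain;
0.3 => b.gain;

// each voice is its own "shred": pick a random pitch, wait, repeat
fun void voice( SinOsc osc, dur step )
{
    while( true )
    {
        Math.random2f( 220, 880 ) => osc.freq;
        step => now;  // advance time; the other shreds keep running
    }
}

// spork two concurrent voices, then keep the parent shred alive
spork ~ voice( a, 250::ms );
spork ~ voice( b, 400::ms );
1::minute => now;
```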
This week for 220a we are building hydrophones (underwater microphones) and taking a field trip off the coast of Monterey to find and record whales. How cool is that?!?!? I will definitely update the blog with pictures and any recordings I obtain once we take that trip!
The rest of my classes - 320 - Intro to Digital Audio Signal Processing, 192a - Foundations of Sound Recording Tech, and 201 - Colloquium - have been equally interesting. 320 has been mostly review so far, but it will soon get much more complicated. By the end of the quarter I should know the basics of filter design and spectrum analysis. In 192a we've covered the principles of sound, basic audio electronics, and psychoacoustics (how we perceive sound). This week we should be getting into various microphone types and will be spending some time in one of the CCRMA studios.
At last week's colloquium, each of the Master's students (myself included) spoke for 5 minutes about their previous work and current interests. It was fantastic to get a better idea of my classmates' interests and to see the diversity of work they have produced. I spoke very briefly about my time at Metron and shared two recordings - an excerpt of a controlled feedback experiment and the first few minutes of "Pollen Chair," a piece I composed for a modern dance in NYC back in 2006 (see my portfolio for the full recordings). I was very happy to receive some positive comments from Dr. Chafe regarding the controlled feedback piece. It seems like we have very similar interests, so I am hoping to learn more about his research in the coming weeks!
This Tuesday, amid the chaos of the first round of assignments, I managed to free up enough time to make a trip up to San Francisco to see one of my musical heroes perform a set at the Swedish American Hall.
Fennesz is an Austrian guitarist/composer who generates enormous, overpowering soundscapes using his guitar, Max/MSP, and a suite of other software. I have been following his music for about 5 years but had never seen him live, and this tour is the first time he has been in the States in quite some time. Two of my classmates came along for the show, and it was an incredible evening. One of the best parts of the night was simply having the opportunity to discuss the music I love with people who are as crazy about it as I am. I have missed that type of interaction for a long time, so it is wonderful to finally experience it again!
[Photo: Fennesz melting the room in San Francisco.]
If you are interested in Fennesz but don't know where to start, this video provides a nice intro. The albums Endless Summer, Venice, and Black Sea are great entry points as well.
Finally, this Friday night the music department had its annual start-of-the-year BBQ on the CCRMA lawn. Here are a few pictures from the event.
[Photo: This is the amazing view out the back of the CCRMA building.]
[Photo: Free food is always a plus for poor grad students!]
Thanks for reading! Please check back soon for more updates!