Friday, June 1, 2012

Final Projects!


Each spring for the past four years, I've taught a class called 'Creating Sounds from Scratch' (SfS). SfS starts out exploring old synthesizer technology through a digital lens, looking at oscillators, filters, and MIDI to create entirely new sounds or to give a unique sheen to old ones. The bulk of the class time, however, is spent digging into the details of a program called MaxMSP. Max is a piece of software that lets the user build complex creative systems for sound and video out of data and numbers. In SfS, we use Max primarily as a data mapper, taking real-world input (pressing keys or twiddling knobs on a keyboard) and turning that data into an opportunity for creative control.
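To give a sense of the kind of mapping I mean (outside of Max itself), here's a tiny Python sketch of the idea: a MIDI controller value gets rescaled into a filter cutoff. The ranges and the exponential curve are just illustrative, not anything from the class patches.

```python
# Illustrative sketch (not Max itself): the kind of data mapping we build in
# class, written in Python for readability. A MIDI mod-wheel value (0-127) is
# rescaled into a filter cutoff in Hz, exponentially, so the knob feels
# musical rather than linear. All ranges here are invented for the example.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linear rescale, in the spirit of Max's [scale] object."""
    normalized = (value - in_lo) / (in_hi - in_lo)
    return out_lo + normalized * (out_hi - out_lo)

def cc_to_cutoff(cc_value, lo_hz=80.0, hi_hz=8000.0):
    """Map a 0-127 controller value onto an exponential cutoff curve."""
    normalized = cc_value / 127.0
    return lo_hz * (hi_hz / lo_hz) ** normalized

if __name__ == "__main__":
    for cc in (0, 32, 64, 96, 127):
        print(cc, round(cc_to_cutoff(cc), 1), "Hz")
```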

SfS is a required class for MFA Sound Designers here at UCI, but this year, I opened the class up to students of various backgrounds and aptitudes. It was a great challenge for me, as I found myself having to frame the class in a way that would engage both the students who were comfortable and versed in this kind of thinking and control and the students who were totally new to this kind of work. I must admit that I didn't think it had gone particularly well, but yesterday, my mind was changed. Final projects were due recently, and I was consistently delighted by the kind of work that the students did. So impressed, in fact, that I want to share them with you.

The prompt for the final project was simple: ‘Use digital tools to create a modifiable soundscape of an event.’ Over the past few weeks, we looked at ways to get real-world control data into Max and ways to use that data to create/control sonic events.  Let’s look at what the students came up with:



Bryan is an undergrad with a focus in sound for games. He was my student in a different class in the fall, and I thought that SfS would be useful to him in his game work. He created a warzone, using his Android phone running TouchOSC as a controller. Bryan did an excellent job of prioritizing energy and attention between foreground and background sounds (a requirement in the fast-paced world of game design). Here's a screenshot of his work:



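For the curious, TouchOSC simply sends OSC messages over the network; inside Max we catch them with [udpreceive]. A rough equivalent sketch in Python (using the python-osc package; the address /1/fader1 and port 8000 are only generic TouchOSC-layout examples, not Bryan's actual patch) looks like this:

```python
# Rough sketch of receiving TouchOSC data outside of Max, using the python-osc
# package. The OSC address (/1/fader1) and port (8000) are example values; the
# real ones depend on the TouchOSC layout and its network settings.
from pythonosc import dispatcher, osc_server

def on_fader(address, value):
    # A fader in TouchOSC arrives as a float from 0.0 to 1.0.
    print(f"{address} -> {value:.2f}")

disp = dispatcher.Dispatcher()
disp.map("/1/fader1", on_fader)

server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 8000), disp)
print("Listening for TouchOSC on port 8000...")
server.serve_forever()
```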

Dean (an MFA Stage Manager) and Jenna (an undergraduate Sound Designer) created a car-driving sequence using a Wii Remote as a controller. The user controls the speed of the car and the volume of the radio using the gyroscopic parameters of the controller, and the buttons on the remote take us through a sonic event, ending in a surprising car crash (spoiler alert!).
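To show the flavor of that mapping (my own illustrative sketch in Python, not their actual patch), tilt data from the remote might become speed and volume like so:

```python
# Hypothetical sketch of the mapping idea in Dean and Jenna's patch: tilt data
# from the Wii Remote (assumed here to arrive as floats in -1..1) becomes
# engine speed and radio volume. The value ranges are invented for illustration.

def tilt_to_speed(pitch, max_speed=120.0):
    """Tilting forward (negative pitch) pushes the accelerator."""
    throttle = max(0.0, -pitch)          # ignore backward tilt
    return throttle * max_speed          # "mph", purely illustrative

def roll_to_radio_volume(roll):
    """Rolling the remote left/right sweeps the radio volume from 0 to 1."""
    return min(1.0, max(0.0, (roll + 1.0) / 2.0))

if __name__ == "__main__":
    print(tilt_to_speed(-0.5))           # half throttle
    print(roll_to_radio_volume(0.25))    # radio a bit past halfway
```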




Elliot (an undergrad Sound Designer), Pablo (an MFA student in the ICIT program in the computer music wing of the Music Department), and Stephen (an MFA Sound Designer) worked together to create a comprehensive sonic event, tracking the result of a botched bank robbery during a Fourth of July celebration. A Wii Remote served as a master controller, an iPhone (running TouchOSC and FaceTime) streamed data and audio to the host computer, and the bank safe was cracked in real time using a cardboard box modded with various sensors.




Kat (an undergrad Sound Designer) and Phil (an MFA Stage Manager) created a very precise aural replication of Super Mario Brothers, to be played using a Wii Remote. They used chance operations to re-create the game's events, but they used brute-force programming to get the sounds to behave correctly.
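A tiny illustration of the chance side of that approach (my own Python sketch, with invented event names and weights, not their patch):

```python
# Hypothetical sketch of chance-based event re-creation: each game event gets
# a weight, and the patch rolls the dice on every beat. The event names and
# weights are invented for illustration only.
import random

EVENTS = {
    "jump": 0.4,
    "coin": 0.3,
    "stomp_goomba": 0.2,
    "fire_flower": 0.1,
}

def next_event():
    names = list(EVENTS)
    weights = list(EVENTS.values())
    return random.choices(names, weights=weights, k=1)[0]

if __name__ == "__main__":
    print([next_event() for _ in range(8)])
```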




Kristen (an MFA Stage Manager), Michael (another ICIT student), and Patricia (an MFA Sound Designer) created a space battle using a JazzMutant Lemur as a controller.  They implemented their battle using an eight-channel delivery system. Unfortunately, the user loses the space battle.
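For anyone wondering what an eight-channel delivery can involve, here's a rough Python sketch of one common approach: equal-power panning between the two nearest speakers in a ring. This is my own illustration of the general technique, not their implementation.

```python
# Hypothetical sketch of spreading a mono sound across an eight-channel ring of
# speakers with equal-power panning between the two nearest speakers. The
# position convention (0..1 around the circle) is invented for illustration.
import math

NUM_CHANNELS = 8

def channel_gains(position):
    """position: 0.0-1.0 around the speaker ring; returns one gain per channel."""
    spot = (position % 1.0) * NUM_CHANNELS
    lo = int(spot) % NUM_CHANNELS           # nearest speaker "behind" the sound
    hi = (lo + 1) % NUM_CHANNELS            # nearest speaker "ahead" of the sound
    frac = spot - int(spot)
    gains = [0.0] * NUM_CHANNELS
    gains[lo] = math.cos(frac * math.pi / 2)   # equal-power crossfade
    gains[hi] = math.sin(frac * math.pi / 2)
    return gains

if __name__ == "__main__":
    print([round(g, 2) for g in channel_gains(0.19)])
```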

You may, by now, have noticed a theme of catastrophic death in these projects. Don't fret – the theme continues.




Phillip, another undergrad game sound student, created a medieval sword fight for two Wii Remotes. He built comparison logic to trigger sonic events: if only one 'sword' was swung, that sword would hit, but if both 'swords' were swung, they would parry and clang off each other.
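In Python terms (a sketch of the logic only, not Phillip's actual Max patch), the comparison boils down to something like this:

```python
# Hypothetical sketch of Phillip's comparison logic: each short time window we
# check whether one or both "swords" were swung, and trigger a hit or a parry
# accordingly. Function and sound names are invented for illustration.

def resolve_swings(sword_a_swung, sword_b_swung):
    """Return which sound to trigger for this comparison window."""
    if sword_a_swung and sword_b_swung:
        return "parry_clang"        # both swung: blades meet
    if sword_a_swung:
        return "sword_a_hit"
    if sword_b_swung:
        return "sword_b_hit"
    return None                     # nothing to play this window

if __name__ == "__main__":
    print(resolve_swings(True, False))   # -> "sword_a_hit"
    print(resolve_swings(True, True))    # -> "parry_clang"
```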




Sinan, our final undergraduate Sound Designer, also created a sword fight, but his vocabulary was Star Wars lightsabers. Sinan built his sonic event using theme music, differently pitched sounds for light-side and dark-side fighters, and victory music that changed depending on which side won:





I was thoroughly impressed by all of these projects, and I’m proud of them all!  I’m looking forward to teaching the class again next spring!
