This project is a prototype system for performing music and sound design live, built with Max/MSP and Ableton Live and programmed around Numark’s Orbit wireless DJ controller. The system was originally created for a production of A Midsummer Night’s Dream, commissioned by the theater company New Place Players and performed in NYC from 2014 to 2016.
This sound design project started with the idea that the spirit of the Fairies exists inside a sound world: their doings and tricks would play out in the sound and music domain, so to speak. With a finger snap, for example, ‘other-worldly’ sounds would surround the spirits’ subjects and envelop them in their magic.
Initially, I envisioned the actors using special ‘wearable’ sensors – a special ring or gloves, say, or sensors attached to their clothing – an idea I didn’t get to realize in this production, but which I think is still worth pursuing. In the process of developing this particular rendition of the play, the director came up with Indonesian-style puppets to be played by the fairies, and my job was to follow their actions with sound and music, live.
While doing my research, I learned about Numark’s Orbit, a wireless DJ controller, which proved to be a good fit for this prototype, for the following reasons:
I programmed the Orbit around the following logic: the wireless controller connects to a Max for Live device, which routes its commands to various tracks in Ableton Live (the latter hosting all of the sounds and instruments used in the play).
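As a rough illustration of that routing, here is a minimal Python sketch. All note numbers, scene names, and track/clip names below are hypothetical stand-ins for the actual mapping inside the Max for Live device:

```python
# Hypothetical sketch of the routing logic: the Orbit sends MIDI notes,
# and a dispatcher (standing in for the Max for Live device) maps them
# to an Ableton Live track and clip, depending on the active scene.
# The note numbers and names are illustrative, not the production's map.

SCENE_MAP = {
    "1.1": {36: ("Cues", "thunder"), 37: ("Cues", "fairy_bells")},
    "2.2": {36: ("Layers", "forest_drone"), 38: ("Music", "puck_theme")},
}

def route(scene, note):
    """Return the (track, clip) a MIDI note triggers in the given scene,
    or None if the note is unmapped in that scene."""
    return SCENE_MAP.get(scene, {}).get(note)
```

The same physical button can thus do different work in different scenes, which is exactly what makes a small 16-pad controller viable for a full play.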
To start, there is a meta layer that specifies which act or scene is currently active (each scene has its own set of sound & music cues to be triggered and performed).
Then, depending on the scene, there are various categories of sound that are controlled:
1) Standard sound cues may be triggered.
2) Various layers of sound may be triggered at the same time, with various degrees of control as to how they will be performed:
   - Volume
   - Brightness
   - Wet/dry mix of special effects
3) Musical sequences may be controlled in various ways.
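The continuous controls above (volume, brightness, wet/dry mix) all come down to scaling a 7-bit MIDI value into some parameter range. A small sketch, with the ranges chosen purely for illustration:

```python
def cc_to_param(cc_value, lo=0.0, hi=1.0):
    """Scale a 7-bit MIDI value (0-127) linearly into the range [lo, hi].
    Used here to stand in for mapping a knob to volume, filter
    brightness, or an effect's wet/dry amount."""
    if not 0 <= cc_value <= 127:
        raise ValueError("MIDI values are 0-127")
    return lo + (hi - lo) * (cc_value / 127.0)
```

For example, the same incoming value could drive a wet/dry mix on a 0–1 scale and a volume fader on a dB scale, just by passing different `lo`/`hi` bounds.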
Here is an overview of the device itself: it has 4 banks (the "Pad Bank" black buttons '1 - 4' at the bottom of the device); each bank has its own set of 16 buttons (the white buttons on top, labeled '1 - 16'). It has a volume wheel in the center, which can control 4 different parameters (by selecting the "Virtual Knobs" buttons 'K1 - K4' at the top), and two buttons ('SB1' & 'SB2' at the very top) which activate the accelerometers (horizontal and vertical).
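One way to think of the 4-banks-by-16-pads layout is as 64 addressable buttons. A tiny sketch of that addressing (the flat indexing scheme is my own convenience, not anything defined by Numark):

```python
def pad_index(bank, pad):
    """Map (bank 1-4, pad 1-16) on the Orbit to a single flat index 0-63,
    so one lookup table can cover every button on the device."""
    if not (1 <= bank <= 4 and 1 <= pad <= 16):
        raise ValueError("bank must be 1-4, pad must be 1-16")
    return (bank - 1) * 16 + (pad - 1)
```

With this, switching banks during a scene simply shifts which 16 entries of the 64-slot table the white buttons address.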
Here is an excerpt of the control map for the 16 buttons, plus volume controls (K1 - K4) and accelerometer buttons (left and right), and what they’re programmed to perform in each particular scene:
For example, in scene 1.1, eight of the 16 white buttons have been programmed to trigger various sound cues:
In another example, scenes 2.2 through 3.1 have been mapped into three sections:
Here’s a screenshot of Ableton Live’s sampler, hosting the set of sound cues for scene 1.1:
This is a work in progress: a more thorough overview will be posted soon!