https://itp.nyu.edu/NIME/#session-2

Assignment

Instrument 2: sensors > MIDI > Ableton

Create an instrument with the following components:

  1. gestural interface with at least two interface controls: one with on/off states (like a push button), one with a range of values (like a potentiometer). Your interface can either be physical (sensors connected to Arduino), or digital (for example, skeleton tracking connected to p5).

  2. controller that reads the control values, maps / thresholds / smooths them if needed, and sends out MIDI messages (note on, note off, CC). The controller can be an Arduino sketch or a p5.js sketch.

  3. sound generator (Ableton Live) that receives the MIDI messages, and plays back, synthesizes, or modulates sounds in response.
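Step 2 above (reading, thresholding, smoothing, mapping) can be sketched as a few small helpers. This is a hedged illustration, not code from the assignment: the function names and threshold values are my own, and in a real p5.js or Arduino sketch the conditioned values would feed `noteOn`/`noteOff` and CC messages.

```javascript
// Hysteresis thresholding: turns a noisy analog reading into a clean
// on/off state (for the button-like control). Two thresholds prevent
// rapid flickering near a single cutoff.
function makeHysteresis(onAbove, offBelow) {
  let state = false;
  return (value) => {
    if (!state && value > onAbove) state = true;
    else if (state && value < offBelow) state = false;
    return state;
  };
}

// Exponential smoothing damps jitter in the ranged control.
// alpha near 1 tracks quickly; alpha near 0 smooths heavily.
function makeSmoother(alpha) {
  let y = null;
  return (x) => {
    y = y === null ? x : alpha * x + (1 - alpha) * y;
    return y;
  };
}

// Map a sensor range (e.g. a 0-1023 Arduino analog read) to the
// 0-127 MIDI CC range, clamped at both ends.
function toCC(value, inMin, inMax) {
  const t = (value - inMin) / (inMax - inMin);
  return Math.round(Math.min(1, Math.max(0, t)) * 127);
}
```

The hysteresis pair (e.g. on above 600, off below 400) is what makes a homemade conductive switch feel reliable, and the smoother keeps a flex or stretch sensor from spraying CC messages on every frame.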

Be intentional in picking your controls: consider making your own switches out of any conductive material (rather than a standard push button); consider using sensors like flex, stretch, vibration, accelerometer, gyroscope (rather than a standard potentiometer). What gestures and form factors might each control afford?

What I Did

I built a MIDI controller that you play with your hand via the web camera, and connected it to Logic Pro X to play a synthetic instrument from the Spitfire Audio Labs collection. I also started another project that uses my tennis racquet as a gestural interface with an Arduino, but I will come back to that one after working through the technical obstacles I hit.
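One way to turn a tracked hand position into MIDI values is sketched below. This is my assumption of the general approach, not the author's exact code: the scale, note range, and mappings are illustrative, and in the real sketch the normalized (x, y) would come from a webcam hand-tracking model inside p5.js before going out over MIDI to Logic Pro X.

```javascript
// Quantize normalized hand x (0 at left of frame, 1 at right) to a
// note in a C major scale, so sweeping across the frame "conducts"
// a melody. baseNote 60 is middle C.
const C_MAJOR = [0, 2, 4, 5, 7, 9, 11];
function xToNote(x, baseNote = 60, octaves = 2) {
  const steps = C_MAJOR.length * octaves;
  const i = Math.min(steps - 1, Math.max(0, Math.floor(x * steps)));
  const octave = Math.floor(i / C_MAJOR.length);
  return baseNote + 12 * octave + C_MAJOR[i % C_MAJOR.length];
}

// Normalized hand y (0 at top of frame) to velocity: raising the
// hand plays louder, like a conductor cueing a crescendo.
function yToVelocity(y) {
  return Math.round(Math.min(1, Math.max(0, 1 - y)) * 127);
}
```

Snapping to a scale rather than sending raw pitch values keeps the camera's tracking jitter musical: a wobble of a few pixels stays on the same note instead of sliding chromatically.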

p5.js Web Editor

A video of the interface, with no audio since the sound was routed via MIDI to Logic Pro X. An example recording captured in Logic Pro X is below.

An example recording captured in Logic Pro X, using the Moon Pluck instrument from Spitfire Audio Labs:

nime-week2.mp3

Hand Conductor Controller (idea 1, digital)

Gestural interface