https://itp.nyu.edu/NIME/#session-2
Create an instrument with the following components:
A gestural interface with at least two interface controls: one with on/off states (like a push button), one with a range of values (like a potentiometer). Your interface can either be physical (sensors connected to Arduino), or digital (for example, skeleton tracking connected to p5).
A controller that reads the control values, maps / thresholds / smooths them if needed, and sends out MIDI messages (note on, note off, CC). The controller can be an Arduino sketch or a p5.js sketch (a minimal sketch of this component appears after the prompt).
A sound generator (Ableton Live) that receives the MIDI messages, and plays back, synthesizes, or modulates sounds in response.
Be intentional in picking your controls: consider making your own switches out of any conductive material (rather than a standard push button); consider using sensors like flex, stretch, vibration, accelerometer, gyroscope (rather than a standard potentiometer). What gestures and form factors might each control afford?
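For reference, here is a minimal sketch of what the controller component could look like in p5.js, assuming a browser with Web MIDI support (such as Chrome) and a virtual MIDI port (like the macOS IAC Driver) routed to the sound generator. The mouse stands in for the two controls; the note number, CC number, and smoothing factor are all illustrative choices, not anything prescribed by the assignment.

```javascript
// A stand-in controller sketch: mouseX is the ranged control, a mouse
// press is the on/off control.
let midiOut = null;    // first available MIDI output port
let smoothed = 0;      // exponentially smoothed control value
let lastCC = -1;       // last CC sent, to avoid flooding the port
let noteIsOn = false;
const NOTE = 60;       // middle C
const ALPHA = 0.2;     // smoothing factor: higher = more responsive

function setup() {
  createCanvas(400, 200);
  navigator.requestMIDIAccess().then((access) => {
    midiOut = access.outputs.values().next().value; // grab the first output
  });
}

function draw() {
  background(noteIsOn ? 80 : 220);
  if (!midiOut) return;

  // Smooth the raw reading, then map it into the 0-127 MIDI range.
  smoothed += ALPHA * (mouseX - smoothed);
  const cc = constrain(floor(map(smoothed, 0, width, 0, 127)), 0, 127);
  if (cc !== lastCC) {
    midiOut.send([0xb0, 1, cc]); // control change: CC 1 on channel 1
    lastCC = cc;
  }

  // Threshold the on/off control into note on / note off messages.
  if (mouseIsPressed && !noteIsOn) {
    midiOut.send([0x90, NOTE, 100]); // note on, velocity 100
    noteIsOn = true;
  } else if (!mouseIsPressed && noteIsOn) {
    midiOut.send([0x80, NOTE, 0]);   // note off
    noteIsOn = false;
  }
}
```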
I built a MIDI controller that you play with your hand in front of a web camera, and connected it to Logic Pro X to play synthesized instruments from the Spitfire Audio Labs collection. I also started work on another project that uses my tennis racquet as a gestural interface together with an Arduino, but I will come back to that after working through the technical obstacles I hit.
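Here is a sketch of the general webcam-and-hand approach, assuming ml5.js's handPose model (v1 API) for tracking and the Web MIDI API for output, with a virtual port such as the macOS IAC Driver routed into Logic Pro X. The keypoint indices (4 = thumb tip, 8 = index fingertip) follow the MediaPipe hand landmarks that handPose exposes; the pinch threshold and mappings are illustrative, not necessarily what the finished controller uses.

```javascript
// Pinch (thumb tip near index tip) plays a note; the height of the
// index fingertip is mapped to CC 1.
let handPose, video, midiOut;
let hands = [];
let lastCC = -1;
let noteIsOn = false;
const NOTE = 60; // middle C

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, (results) => { hands = results; });
  navigator.requestMIDIAccess().then((access) => {
    midiOut = access.outputs.values().next().value; // e.g. the IAC Driver
  });
}

function draw() {
  image(video, 0, 0, width, height);
  if (!midiOut || hands.length === 0) return;

  const thumb = hands[0].keypoints[4]; // thumb tip
  const index = hands[0].keypoints[8]; // index fingertip

  // Raising the hand increases CC 1 (y grows downward, so invert the map).
  const cc = constrain(floor(map(index.y, height, 0, 0, 127)), 0, 127);
  if (cc !== lastCC) {
    midiOut.send([0xb0, 1, cc]);
    lastCC = cc;
  }

  // Pinch -> note on; release -> note off (40 px threshold is arbitrary).
  const pinched = dist(thumb.x, thumb.y, index.x, index.y) < 40;
  if (pinched && !noteIsOn) {
    midiOut.send([0x90, NOTE, 100]);
    noteIsOn = true;
  } else if (!pinched && noteIsOn) {
    midiOut.send([0x80, NOTE, 0]);
    noteIsOn = false;
  }
}
```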
Below is a video of the interface with no audio, since it was recorded while the MIDI signal was being sent to Logic Pro X. Following it is an example recording captured in Logic Pro X, using the Moon Pluck instrument from Spitfire Audio Labs.