February 20, 2019
We were asked to build a MIDI instrument for our second Tangible Interaction assignment: a tangible interface that a user could figure out, manipulate, and play with variation in a repeatable fashion. I was inspired by maracas and set out to build a MIDI version, using an accelerometer to detect shaking motions.
My first step in building a MIDI instrument was to learn more about the protocol and build a simple button-based instrument. I followed Tom’s MIDI USB and MIDI button instrument tutorial, which let me send MIDI commands from an Arduino MKR 1010 to a software synthesizer on my laptop.
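The heart of what I learned about the protocol: a MIDI “note on” is just three bytes — a status byte (0x90 plus the channel number), a pitch, and a velocity. A minimal sketch of that framing in plain C++ (the helper names are my own, not from the tutorial):

```cpp
#include <array>
#include <cstdint>

// A MIDI "note on" message: status 0x9n (n = channel 0-15), then pitch
// and velocity, both 7-bit values (0-127).
std::array<uint8_t, 3> noteOn(uint8_t channel, uint8_t pitch, uint8_t velocity) {
  return {static_cast<uint8_t>(0x90 | (channel & 0x0F)),
          static_cast<uint8_t>(pitch & 0x7F),
          static_cast<uint8_t>(velocity & 0x7F)};
}

// The matching "note off" uses status 0x8n. (A "note on" with velocity 0
// is conventionally treated as a note off too.)
std::array<uint8_t, 3> noteOff(uint8_t channel, uint8_t pitch) {
  return {static_cast<uint8_t>(0x80 | (channel & 0x0F)),
          static_cast<uint8_t>(pitch & 0x7F),
          static_cast<uint8_t>(0)};
}
```

On the MKR 1010, the MIDIUSB library wraps these bytes in a four-byte USB-MIDI event packet and sends it with `MidiUSB.sendMIDI(...)` followed by `MidiUSB.flush()`.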
Since I’m learning Max in the IDM Sound Studio course, I decided to send the Arduino’s MIDI commands to a Max patch that forwarded them to my laptop’s internal synth. Using a Max patch also let me debug different MIDI commands, parameters, and the sounds of my laptop’s synth. I constantly referenced the General MIDI standard and the list of MIDI events.
Once I had a working microcontroller-based MIDI instrument, I rented an ADXL335 accelerometer from the ITP shop and built a simple circuit that calibrated and printed to serial the g-forces registered by the X, Y, and Z axis inputs. I thought an accelerometer would be an appropriate sensor after Stefan received feedback on his hourglass-like kitchen timer project to look into an accelerometer or gyroscope for registering the turning-over motion/event.
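The conversion from a raw 10-bit analog reading to g-forces is simple ratiometric math. Here is a sketch of that step, assuming the ADXL335’s typical datasheet behavior — a zero-g output near half the 3.3 V supply and roughly 0.33 V per g when powered at 3.3 V; the exact numbers should come from per-axis calibration:

```cpp
// Convert a 10-bit ADC reading (0-1023) from one ADXL335 axis to g-forces.
// Assumed values: 3.3 V supply/reference, zero-g bias at VCC/2, and the
// ratiometric sensitivity of ~0.33 V per g at a 3.3 V supply.
const float VREF = 3.3f;
const float ZERO_G_VOLTS = VREF / 2.0f;  // ~1.65 V when the axis reads 0 g
const float VOLTS_PER_G = 0.33f;

float rawToG(int raw) {
  float volts = raw * VREF / 1023.0f;
  return (volts - ZERO_G_VOLTS) / VOLTS_PER_G;
}
```

In the actual circuit, the sketch would call `analogRead()` on each of the three axis pins every loop and `Serial.print()` the converted values.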
Working with the ADXL335 also taught me about using the AREF pin as an “analog reference” voltage for higher accuracy. One thing to note: on the MKR boards, a slightly different constant (AR_EXTERNAL) needs to be passed when activating the external AREF.
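In Arduino terms this is one line of board configuration in setup():

```cpp
void setup() {
  // Use the voltage on the AREF pin as the ADC's reference.
  // Classic AVR boards take the constant EXTERNAL here; the
  // SAMD-based MKR boards take AR_EXTERNAL instead.
  analogReference(AR_EXTERNAL);
}
```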
The first way I combined the MIDI circuit with the accelerometer circuit was to trigger a MIDI “note on” command when the g-force on the X axis exceeded 0 (video demo). I then experimented with the possible levels of musical interaction by adding a button that changed the MIDI instrument sound (video demo), giving the single X axis two musical dimensions.
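Two details of this step can be sketched in plain C++ (names are mine, not from the project code): the trigger should fire only on the transition across zero — otherwise “note on” would be resent on every loop iteration while the axis stays positive — and the button’s instrument change is a two-byte MIDI “program change” message.

```cpp
#include <array>
#include <cstdint>

// Fire only at the moment the X-axis g-force crosses above zero,
// not on every reading while it remains positive.
bool crossedPositive(float previousG, float currentG) {
  return previousG <= 0.0f && currentG > 0.0f;
}

// A MIDI "program change" (instrument select): status 0xCn, program 0-127.
std::array<uint8_t, 2> programChange(uint8_t channel, uint8_t program) {
  return {static_cast<uint8_t>(0xC0 | (channel & 0x0F)),
          static_cast<uint8_t>(program & 0x7F)};
}
```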
I continued to expand my instrument’s interactivity by incorporating the calibration axis values I had recorded from the sample accelerometer circuit. When the acceleration exceeded an absolute threshold (which happens when the circuit is moved quickly, i.e., shaken by the user), “note on” and “note off” MIDI commands would be sent for the affected axis (video demo of shaking, video demo with button option). Since the acceleration value varies with how fast the user moves the instrument, I decided to also use that value by mapping it proportionally to the MIDI note velocity. This created the effect that the faster the shake, the louder the note.
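The shake-to-velocity mapping can be sketched as a clamped linear map from acceleration magnitude onto the 1–127 MIDI velocity range. The threshold and full-scale constants below are placeholders; the real ones came from the recorded calibration values:

```cpp
#include <cmath>
#include <algorithm>

// Placeholder calibration: shakes register between roughly SHAKE_THRESHOLD
// and MAX_G; anything below the threshold sends no note at all.
const float SHAKE_THRESHOLD = 1.5f;  // g
const float MAX_G = 3.0f;            // g, treated as full scale

// Map |g| to a MIDI velocity: 0 means "no note", otherwise 1-127,
// so a faster shake produces a louder note.
int shakeVelocity(float g) {
  float mag = std::fabs(g);
  if (mag < SHAKE_THRESHOLD) return 0;
  float t = (mag - SHAKE_THRESHOLD) / (MAX_G - SHAKE_THRESHOLD);
  t = std::min(1.0f, std::max(0.0f, t));  // clamp overshoot past full scale
  return 1 + static_cast<int>(t * 126.0f);
}
```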
I returned to the ITP junk shelf to find an appropriate enclosure. There was a mailing tube that was large enough to fit the electronics, but small and round enough to be gripped by one hand. It also reminded me of a rainstick — an instrument I typically associate with percussive ensembles.
The tube’s bottom was made of metal, so to avoid conductivity issues and to provide padding for the electronics, I filled the tube with soft fabric scraps. I also ran the tube through the band saw to shorten it.
I then purposely drilled the hole for the instrument’s panel-mount button in the middle of the tube, to indicate that the instrument should be held with one hand and the button pressed with the thumb. The photos below are from testing different drill sizes on the tube’s unused other half.
During my testing, it became apparent that the MIDI note velocity cue wasn’t significant enough to signal the difference between shaking the instrument faster or slower. So I also incorporated an LED whose brightness was proportional to the acceleration.
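The LED uses the same mapping idea, scaled to the 0–255 PWM range instead of MIDI velocity (the full-scale value is again a placeholder calibration number):

```cpp
#include <cmath>
#include <algorithm>

// Map acceleration magnitude to an 8-bit PWM duty cycle for the LED.
// Assumed range: 0 g maps to off, FULL_SCALE_G and above to full brightness.
const float FULL_SCALE_G = 3.0f;

int ledBrightness(float g) {
  float t = std::fabs(g) / FULL_SCALE_G;
  t = std::min(1.0f, std::max(0.0f, t));
  return static_cast<int>(t * 255.0f);  // feed this to analogWrite(ledPin, ...)
}
```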
Some thoughts I had if I continue to iterate on this idea:
NYU ITP documentation blog.
Words are my own.