Collaborators: Sam Hu and Aven Zhou

This project aimed to create a collaborative relationship between a machine learning model and a human musician, built around an unconventional instrument: the theremin. We created a digital theremin that uses a Leap Motion controller for hand detection and tracking, chosen for its ease of input and because we lacked the tools and resources to build a physical theremin (though that's still on the to-do list). The performance is then fed as MIDI into a pretrained neural network from Google Magenta.
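A theremin controls pitch and volume continuously with the two hands, so the core of the digital version is a mapping from hand position to MIDI note and velocity. The sketch below is purely illustrative: the coordinate ranges, note range, and the choice of which axis drives which parameter are assumptions, not the project's actual calibration, and the real Leap Motion SDK frame objects are not shown.

```python
# Illustrative sketch: map palm coordinates (as a Leap Motion might report
# them, in millimetres) to a MIDI note and velocity. All ranges here are
# assumed defaults, not values from the project.

def position_to_midi(y_mm, x_mm,
                     y_range=(100.0, 500.0),   # assumed playable height band
                     note_range=(48, 84)):      # assumed C3..C6 pitch span
    """Map palm height to MIDI pitch and horizontal position to velocity."""
    y_lo, y_hi = y_range
    n_lo, n_hi = note_range
    # Clamp height into the playable band, then interpolate linearly to pitch.
    t = min(max((y_mm - y_lo) / (y_hi - y_lo), 0.0), 1.0)
    note = round(n_lo + t * (n_hi - n_lo))
    # Map x in an assumed [-200, 200] mm window to MIDI velocity 0..127.
    v = min(max((x_mm + 200.0) / 400.0, 0.0), 1.0)
    velocity = round(v * 127)
    return note, velocity
```

In a real loop this function would run once per Leap Motion frame, emitting a MIDI message whenever the note or velocity changes.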

The user simply waves their hands above the Leap Motion, and once they stop playing, the neural network generates a melody in response, played back to them with the tone and style of a theremin.
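Before the performance can be handed to a Magenta model, the per-frame pitch stream from the sensor has to be collapsed into discrete note events, the shape a MIDI file (or Magenta's `NoteSequence`) expects. The sketch below shows only that conversion step under assumed framing (one pitch sample per frame, `None` for silence); the actual Magenta generation call is not shown.

```python
# Illustrative sketch of the hand-off step: collapse a per-frame pitch
# stream into (pitch, start_time, end_time) note events. Frame rate and
# pitch values are assumptions for the example.

def samples_to_notes(samples, dt=0.05):
    """Collapse a per-frame pitch stream into (pitch, start, end) notes.

    samples: list of MIDI pitches, one per frame; None means silence.
    dt: assumed seconds between frames.
    """
    notes, current, start = [], None, 0.0
    for i, pitch in enumerate(samples + [None]):  # sentinel flushes last note
        t = i * dt
        if pitch != current:
            if current is not None:
                notes.append((current, start, t))
            current, start = pitch, t
    return notes
```

The resulting `(pitch, start, end)` triples map directly onto MIDI note-on/note-off pairs, which is the form the pretrained model consumes.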