Epic Jefferson

11 May 2015

Signal, a free-hand gesture-based instrument for sound manipulation and performance.


This project is an exploration of alternative ways to interact with sound editing and synthesis techniques, in this case granulation.

Gestures
Currently, two gestures are implemented: Selection and Triangulation.


For the Selection gesture (right hand), the distance between the thumb and index finger determines the size of the window, which in turn determines the area within the sample (the subsample) that is processed by the granulation engine.


For the Triangulation gesture (left hand), the distance between the thumb and index finger alters the pitch, and the distance between the thumb and middle finger sets the grain duration. That’s it. This is already a very sensitive and expressive setup, and the key is in the mapping. Which aspect of the gestures should control which synthesis parameters? And which function should I apply to the Leap data to provide the most interesting results? I think this will prove to be the great challenge of this project.
I’m happy with where it’s headed. And since I’ll be in Pittsburgh during most of the summer, I’ll have some time to work on it before next semester starts. Oh, right, this is going to be my Thesis Project for the Tangible Interaction Design program.

Sound Engine
Currently, I’m using Max for the audio engine, but it’s likely I’ll return to Pure Data so the engine can be embedded within the application itself, with nothing running separately. It seems that sending ALL of the Leap data over OSC is too much for Pd to handle (so far, this is only an issue on OS X). So, the obvious fix is to only send the necessary data, the minimum. There’s still the possibility of using a C++ lib like Maximilian. Here’s my previous post on the subject.

The Studio for Creative Inquiry interview.

Future work
The next thing to be integrated is the Gesture Recognition Toolkit (GRT) library, so people can teach the system their own gestures and possibly replace mine for specific tasks, like selection. Currently, the GRT library conflicts with openFrameworks 0.8.4 (which is what I’m using) over C++11; here’s the forum post. This seems to have been resolved for OF version 0.9, which will be released in a few weeks, I hope. For now, it’s recommended to use an earlier version of GRT, from before C++11 support was added.

On the interface side, I’ll be incorporating a textured surface for the right hand, to regain some tangibility in the interaction and give the hand a rest from muscle fatigue. This should also help with repeatability in the selection gesture.

For anyone interested, I’ll post future updates on epicjefferson.com

Get the code: github.com/epicjefferson/signal