Category Archives: CapstoneProposal

Epic Jefferson

28 Apr 2015

setbacks

Machine Learning – In the days before the final critique, most of my frustration has come from trying to get the GRT machine learning library to play nicely with openFrameworks. Even though there are tutorials such as this one, there is apparently an issue with C++11 that is still unresolved.

Pure Data – Since moving to OS X, I’ve not been able to process OSC data and audio DSP at the same time. This was a very strange glitch that I think has been solved. With Dan Wilcox’s help, we found that sending all of the Leap data over OSC is too much and clogs the port, so the immediate solution is to send only the data I need. This isn’t ideal, though, since I also want to release this app as a tool for publishing Leap data over OSC.
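As a quick illustration of that stopgap, here is a minimal ofxOsc sketch that sends only the few values the patch actually needs instead of the full Leap frame. The host, port, and OSC addresses are placeholders, not the app’s real ones:

```cpp
// Minimal sketch (not the actual app code): send only the values the
// Pd patch needs over OSC, rather than every field of every Leap frame.
#include "ofMain.h"
#include "ofxOsc.h"

class LeapOscBridge {
public:
    void setup() {
        sender.setup("localhost", 9000);   // placeholder host/port for the Pd patch
    }

    // Called once per Leap frame with just the values we actually use.
    void sendFrame(float palmX, float palmY, float palmZ, float pinchDistance) {
        ofxOscMessage palm;
        palm.setAddress("/leap/palm");     // placeholder address
        palm.addFloatArg(palmX);
        palm.addFloatArg(palmY);
        palm.addFloatArg(palmZ);
        sender.sendMessage(palm);

        ofxOscMessage pinch;
        pinch.setAddress("/leap/pinch");   // placeholder address
        pinch.addFloatArg(pinchDistance);
        sender.sendMessage(pinch);
    }

private:
    ofxOscSender sender;
};
```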

successes

The selection gesture is working relatively well and has a good feel. Although the GRT problem came up, it should be relatively easy to set a standard gesture, like a grab, to hold on to the selected area.

Mano1

Triangulation gesture – this gesture seemed very intriguing, and now that it’s implemented, I’ve found that it is. Although the map-to-what-parameter question is still very much up in the air, the basic interaction itself is quite satisfying.

Centroid

((x1+x2+x3)/3,(y1+y2+y3)/3,(z1+z2+z3)/3)
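As a small illustration, the centroid above reduces to a one-liner if the three fingertip positions come in as ofVec3f (as they would via ofxLeapMotion); this is just a sketch, not the project’s actual code:

```cpp
// Sketch: centroid of the three fingertips that define the triangulation gesture.
#include "ofMain.h"

ofVec3f triangleCentroid(const ofVec3f& a, const ofVec3f& b, const ofVec3f& c) {
    // ((x1+x2+x3)/3, (y1+y2+y3)/3, (z1+z2+z3)/3)
    return (a + b + c) / 3.0f;
}
```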

Mano6

Interface

It’s clear that I’m working on 2 distinct features here: an editing tool and a performance instrument. And so these 2 should be on different pages. For the most part, the selection interface looks like the right direction to go, but I have some ideas about the performance interface. Here are some sketches of what that could look like:

mmontenegro

28 Apr 2015

A Hand Game:

 

calibration_1

Calibration in Unity

 

For my final project I decided to make a puzzle game that lives in your hand. To do this, I am projecting onto the user’s hand and tracking its movements with the Leap Motion.

cube_hand

Init test of calibration in Unity

FullSizeRender

Init test of calibration in Unity

After a lot of troubleshooting to map the hand to the projector in Unity, I finally started thinking through and polishing the game idea. I want to embrace the fact that the puzzle game has to live in your hand. But what is a hand game?

After doing some research and talking to people, I realized that the lines in our hands are unique: everyone knows about them and we all have them. With this in mind, I decided to use them as the main interaction. The lines will form the walls of a maze, and the objective is to guide a ball to certain parts of the hand.

I did some research on the lines in our hands and selected the 4 main ones: (http://www.wikihow.com/Read-Palms)

  • Heart: “Selfish”, “Love Easily”, “Less interest in Romance”
  • Head: “Clear/Focused”, “Physical”, “Creative”
  • Life: “Strength & Enthusiasm”, “Manipulated by Others”, “Plenty …”
  • Fate: “External Forces”, “Family & Friends”, “Controlled by Fate”

 

For each of these lines, I selected 3 different meanings. This way the user gets to experience the different meanings of the lines.

So at the beginning of the experience, the hand will start off in one color representing one of the 4 categories. The user will have to choose a line for each category to construct their playground:

blue_hand  green_hand  red_hand

After testing it, I realized that the text was very hard to read, so I am thinking about how to iterate on it. Once the user has selected the lines, when they turn their hand over, the lines will be there:

init_test

Version 1

hand_test2

Version 2


I did some research on puzzle games done with projection mapping:

  • Meta Field Maze – Bill Keays
  • Marble Madness
  • Q-Bert


Epic Jefferson

22 Apr 2015

Ok, so the project has expanded to try to answer some additional questions:

  • “can 3d gesture interfaces provide an *actual* alternative to current sound editing techniques?”
  •  “what gestures can be used to improve sound editing?”
  •  “which gestures are appropriate and intuitive?”
  • “what are intuitive/appropriate gestures for the execution/performance of sound?”

This is essentially a gesture recognition and mapping problem. First, how do I accurately register that a specific gesture is being executed? Then, how does that gesture relate to the sound process, both conceptually and technically?

While researching gestures and sound I found Jules Francoise’s work on Machine Learning systems (Gaussian Mixture Regression) for continuous motion sonification.

Thanks to Shawn Xu from CMU’s HCI program for pointing me towards the Gesture Recognition Toolkit (GRT) for C++, with which I’ll be testing the Hidden Markov Model (HMM) classifier.
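For reference, here is a rough sketch of what a GRT HMM pipeline tends to look like, based on GRT’s own pipeline examples rather than my working code (which is exactly what the C++11 issue above is blocking). The dimensions, class labels, and sample counts are illustrative only:

```cpp
// Rough sketch of a GRT HMM pipeline, based on GRT's pipeline examples
// (not working project code). All data shown here is placeholder.
#include <GRT/GRT.h>
using namespace GRT;

int main() {
    // Each training sample is a time series of 3D hand positions (x, y, z).
    TimeSeriesClassificationData trainingData;
    trainingData.setNumDimensions(3);

    // One hypothetical recorded gesture: 10 frames of Leap Motion data.
    MatrixDouble gesture(10, 3);
    // ... fill `gesture` with recorded hand positions here ...
    trainingData.addSample(1, gesture);   // class label 1 = "select" gesture

    // Build a pipeline around the HMM classifier and train it.
    // (Depending on the GRT version, the discrete HMM may also want a
    //  quantization step, e.g. a KMeansQuantizer, added to the pipeline.)
    GestureRecognitionPipeline pipeline;
    pipeline.setClassifier(HMM());
    if (!pipeline.train(trainingData)) return 1;

    // At runtime, feed in a new time series and read back the predicted label.
    MatrixDouble liveGesture(10, 3);
    // ... fill `liveGesture` from the live Leap stream ...
    if (pipeline.predict(liveGesture)) {
        UINT label = pipeline.getPredictedClassLabel();
        (void)label;   // map the label to a sound-editing action
    }
    return 0;
}
```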

Here’s what I’ve found on machine learning for openFrameworks:

Visual

On the visual side of things, I found Kyle McDonald’s ofxAudioDecoder addon, which I’ve managed to blend into the Leap Motion world.

In the image above, we can see a black line across the waveform. This line relates to the selection gesture as previously illustrated. In this gesture, the space between the index finger and thumb determines the size of the selection window, and the position of the wrist sets the starting position of the window.
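To make that mapping concrete, here is a small sketch (not the app’s actual code) of how the two measurements could be turned into a selection window; the millimetre ranges are rough guesses for an adult hand in the Leap’s field of view:

```cpp
// Sketch: map thumb-to-index distance -> selection window size,
// and wrist x-position -> window start. Ranges are illustrative guesses.
#include "ofMain.h"

struct Selection {
    float start;   // normalized 0..1 position within the loaded sound file
    float size;    // normalized 0..1 length of the selection window
};

Selection gestureToSelection(const ofVec3f& thumbTip,
                             const ofVec3f& indexTip,
                             const ofVec3f& wrist) {
    Selection sel;
    // Pinch aperture in mm (roughly 10-120 mm) -> window size.
    float aperture = thumbTip.distance(indexTip);
    sel.size = ofMap(aperture, 10.0f, 120.0f, 0.0f, 1.0f, true);
    // Wrist x-position in mm across the Leap's field of view -> window start.
    sel.start = ofMap(wrist.x, -150.0f, 150.0f, 0.0f, 1.0f, true);
    return sel;
}
```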

Mano1

Sound

On the sound side of things, I’ve had to switch to Max/MSP because **Pure Data was unable to process audio and OSC messages at the same time**, WTF?
This changes things quite a bit, since I was planning on integrating Pd entirely into the app with ofxPd. Since Max can’t do this (to my knowledge), I’ll look for other alternatives a bit later on, something that will let me release a single standalone application that processes the Leap Motion controller data directly and performs the sound manipulation/synthesis, possibly a C++ library like Tonic or Maximilian. We’ll see what happens.

mmontenegro

22 Apr 2015

 Hand Projection Calibration

good_calibration_2

Calibrating the hand has two main stages: aggregating the right number of points, and then shifting the camera. To aggregate the points needed to successfully map your hand with the projector, we save the 3D point of the Leap Motion’s index finger together with the 2D point where the mouse says that index finger is. In the image below you can see the calibration process.

calibration_process
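For anyone curious what the actual fit can look like once those 3D/2D pairs are collected, here is a hedged sketch of one standard way to do it: a direct linear transform solved with OpenCV (reachable from OF via ofxCv). It is not necessarily the method used in this project, and the function names are mine:

```cpp
// Sketch (standard DLT, not necessarily this project's method): fit a 3x4
// projection matrix P so that P * [X Y Z 1]^T ~ [u v 1]^T from correspondences
// between Leap-space points and projector/screen points.
#include <opencv2/core.hpp>
#include <vector>

cv::Mat fitProjection(const std::vector<cv::Point3f>& leapPts,
                      const std::vector<cv::Point2f>& screenPts) {
    // Needs at least 6 well-spread (non-coplanar) correspondences.
    CV_Assert(leapPts.size() == screenPts.size() && leapPts.size() >= 6);
    cv::Mat A((int)leapPts.size() * 2, 12, CV_64F);
    for (size_t i = 0; i < leapPts.size(); ++i) {
        double X = leapPts[i].x, Y = leapPts[i].y, Z = leapPts[i].z;
        double u = screenPts[i].x, v = screenPts[i].y;
        double r0[12] = { X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u };
        double r1[12] = { 0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v };
        cv::Mat(1, 12, CV_64F, r0).copyTo(A.row((int)i * 2));
        cv::Mat(1, 12, CV_64F, r1).copyTo(A.row((int)i * 2 + 1));
    }
    cv::Mat p;                    // null-space vector of A, |p| = 1
    cv::SVD::solveZ(A, p);
    return p.reshape(1, 3);       // 3x4 projection matrix
}

// Project a new Leap point into projector/screen coordinates.
cv::Point2f project(const cv::Mat& P, const cv::Point3f& pt) {
    cv::Mat X = (cv::Mat_<double>(4, 1) << pt.x, pt.y, pt.z, 1.0);
    cv::Mat x = P * X;
    return cv::Point2f((float)(x.at<double>(0) / x.at<double>(2)),
                       (float)(x.at<double>(1) / x.at<double>(2)));
}
```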

 

This image shows what happens when you don’t aggregate the points correctly: the projection ends up offset from your hand.

bad_calibration

In contrast, these images show how it looks in OF when you have successfully calibrated the hand projection.

good_calibration_1      good_calibration_3

 

ST

21 Apr 2015

Hello all,

This post will update you on my capstone project progress. My original proposal is here.

The act of scrolling will drive these browser-based animations. Each animation focuses on perpetual vertical motion. These are the 4 designs that I will be implementing:

IACDMockup

These animations will be created using a mix of SVG, hand-drawn, and computationally generated elements. They are primarily a formal study, and my main focus is on their aesthetics.

As far as the programming goes, I have been able to get and send information between the scroll bar and p5.js, the library I am using to build the animations. I have skeleton code ready to go with two functions: scrollingUp() and scrollingDown(). Each animation will behave differently.

In the next week, I will be building the remainder of the animation assets and programming the behavior of each animation.