Robert Zacharias

14 Jan 2016

This demonstration project combines live data read from a Kinect with a realistic cloth animation engine in openFrameworks.

I am particularly drawn to it because of its interesting simulacrum nature: the user moves their hand in a way that would make an actual cloth billow and respond, except there is no cloth actually in front of their arm. The cloth is virtual, and the interaction with it is virtual as well; but my guess is that if this were reasonably well implemented in a VR system, many users could suspend their disbelief and enter the world of touching a piece of cloth. I wonder: in that moment, would they feel the cloth in their hand? Even ever so lightly?

This small demonstration video isn't especially great on its own, unfortunately, but it points to what I think is an interesting possibility. The author, Kamen Dimitrov, modified prior work on this same kind of representation by Victor Martins. Martins's work is obviously pretty close to the outcome that Dimitrov presents, though I believe Dimitrov's rendering is better, and his choice of virtual material a bit more realistic-looking.
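
For my own reference, here is a bare-bones sketch of how a cloth simulation like this is typically put together: a grid of particles stepped with Verlet integration and held together by distance constraints. This is my own guess at the general approach, not Dimitrov's or Martins's actual code, and the mouse here is just a stand-in for the Kinect hand data that would really be pushing the cloth around.

```cpp
// Minimal mass-spring cloth sketch for a stock openFrameworks project.
// Not Dimitrov's implementation; the mouse stands in for Kinect hand data.
#include "ofMain.h"

class ClothApp : public ofBaseApp {
public:
    const int cols = 30, rows = 20;
    const float spacing = 18.0f;

    std::vector<ofVec2f> pos, prevPos;
    std::vector<bool> pinned;

    int idx(int x, int y) const { return y * cols + x; }

    void setup() {
        for (int y = 0; y < rows; y++) {
            for (int x = 0; x < cols; x++) {
                ofVec2f p(120 + x * spacing, 60 + y * spacing);
                pos.push_back(p);
                prevPos.push_back(p);
                pinned.push_back(y == 0);              // pin the top row
            }
        }
    }

    void update() {
        ofVec2f gravity(0, 0.4f);
        ofVec2f mouse(ofGetMouseX(), ofGetMouseY());
        ofVec2f mouseVel(ofGetMouseX() - ofGetPreviousMouseX(),
                         ofGetMouseY() - ofGetPreviousMouseY());

        // Verlet step: velocity is implied by the previous position.
        for (size_t i = 0; i < pos.size(); i++) {
            if (pinned[i]) continue;
            ofVec2f vel = (pos[i] - prevPos[i]) * 0.98f;   // light damping
            prevPos[i] = pos[i];
            pos[i] += vel + gravity;
            // The "hand": push particles near the cursor along its motion.
            if (pos[i].distance(mouse) < 40.0f) pos[i] += mouseVel * 0.5f;
        }

        // Relax distance constraints a few times so the grid keeps its shape.
        for (int iter = 0; iter < 3; iter++) {
            for (int y = 0; y < rows; y++) {
                for (int x = 0; x < cols; x++) {
                    if (x + 1 < cols) satisfy(idx(x, y), idx(x + 1, y));
                    if (y + 1 < rows) satisfy(idx(x, y), idx(x, y + 1));
                }
            }
        }
    }

    void satisfy(int a, int b) {
        ofVec2f delta = pos[b] - pos[a];
        float len = delta.length();
        if (len < 1e-6f) return;
        ofVec2f correction = delta * ((len - spacing) / len) * 0.5f;
        if (!pinned[a]) pos[a] += correction;
        if (!pinned[b]) pos[b] -= correction;
    }

    void draw() {
        ofBackground(20);
        ofSetColor(230);
        for (int y = 0; y < rows; y++) {
            for (int x = 0; x < cols; x++) {
                ofVec2f& p = pos[idx(x, y)];
                if (x + 1 < cols) {
                    ofVec2f& q = pos[idx(x + 1, y)];
                    ofDrawLine(p.x, p.y, q.x, q.y);
                }
                if (y + 1 < rows) {
                    ofVec2f& q = pos[idx(x, y + 1)];
                    ofDrawLine(p.x, p.y, q.x, q.y);
                }
            }
        }
    }
};

int main() {
    ofSetupOpenGL(800, 480, OF_WINDOW);
    ofRunApp(new ClothApp());
}
```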


An openFrameworks addon I think is particularly interesting is ofxSatellite, by Christopher Baker. It's a "thin wrapper" around the libsgp4 library, which computes satellite positions from orbital element data published by NORAD. The addon can be used to track some of the many satellites orbiting the Earth, which I find really interesting, both because it can simply be edifying to see all the flying beeping tin cans we've managed to throw into orbit, and because I assume there are some satellites missing from the dataset because they're doing classified work in the sky.
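
Since the addon is described as a thin wrapper, here is a rough sketch of what using the underlying libsgp4 library looks like, modeled on that library's own examples rather than on ofxSatellite's API. You'd pass in a satellite name and a current two-line element set (TLE) published by NORAD, for example via Celestrak.

```cpp
// Minimal satellite-position sketch using libsgp4 directly (the library
// ofxSatellite wraps); follows the libsgp4 examples, not ofxSatellite itself.
// Pass a satellite name and a current NORAD two-line element set on the
// command line; libsgp4 will reject malformed TLE lines.
#include <iostream>

#include <Tle.h>
#include <SGP4.h>
#include <Eci.h>
#include <CoordGeodetic.h>
#include <DateTime.h>

int main(int argc, char* argv[]) {
    if (argc < 4) {
        std::cerr << "usage: " << argv[0]
                  << " \"NAME\" \"TLE LINE 1\" \"TLE LINE 2\"" << std::endl;
        return 1;
    }

    // Parse the two-line element set and set up the SGP4 propagator.
    Tle tle(argv[1], argv[2], argv[3]);
    SGP4 propagator(tle);

    // Propagate to "now" and convert the Earth-centered inertial position
    // to latitude / longitude / altitude.
    Eci eci = propagator.FindPosition(DateTime::Now(true));
    CoordGeodetic geo = eci.ToGeodetic();

    const double radToDeg = 180.0 / 3.14159265358979323846;
    std::cout << tle.Name() << " is currently over "
              << geo.latitude * radToDeg << " deg latitude, "
              << geo.longitude * radToDeg << " deg longitude, at about "
              << geo.altitude << " km altitude." << std::endl;
    return 0;
}
```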