Project3 – Statement

by Golan Levin @ 10:44 pm 30 January 2011

For the results of this assignment, please see the page: “New Interactions with Kinect and Computer Vision”.


Real-Time Augmentation / Interaction. Critiques on February 21, 2011.
This project is concerned with creating a system that responds interactively, in real time, to signals or information. It’s anticipated that your system will respond to signals from a human user/visitor/player/interactant. For example, you might make a tool, an instrument, a game, a toy, or some other kind of responsive experiential system. For this assignment, your system will probably respond to a camera of some kind.

Developing an interactive system will ask you to think about augmenting human experience through an exploration of interactive feedback in the context of a user’s high-bandwidth, continuous and real-time signals — whether intentional or inadvertent; gestural or spoken; or based on any other measurable property or behavior. You might develop, e.g. a gesture processor, a drawing program, a transformative mirror, an audiovisual instrument, or some other system which allows a participant to experience themselves and the world in a new way.
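
By way of illustration (not part of the assignment): a minimal “transformative mirror” can be built with simple frame differencing. The hypothetical Processing sketch below, written against Processing 1.x and its built-in video library, brightens only the pixels whose brightness changed since the previous frame, so the viewer appears only when they move. The 640×480 capture size is an arbitrary choice.

    import processing.video.*;

    Capture video;
    int[] prevFrame;  // pixels from the previous frame

    void setup() {
      size(640, 480);
      video = new Capture(this, width, height);
      prevFrame = new int[width * height];
    }

    void draw() {
      if (video.available()) {
        video.read();
        video.loadPixels();
        loadPixels();
        for (int i = 0; i < pixels.length; i++) {
          // how much this pixel's brightness changed since the last frame
          float diff = abs(brightness(video.pixels[i]) - brightness(prevFrame[i]));
          pixels[i] = color(diff);  // bright where something moved
          prevFrame[i] = video.pixels[i];
        }
        updatePixels();
      }
    }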

In the past two months, the space of interactive experiences has been blown wide open by the introduction of the Microsoft Kinect depth camera. Open-source toolkits for most arts-engineering environments (openFrameworks, Processing, Max/MSP, and, via OSC, many others) are now freely available. We have been fortunate to secure 12 Kinect cameras through a small grant from the CMU Computational Thinking Center. I will loan these dozen cameras to 2-person teams that don’t already own their own device. Although using the Kinect is not a requirement for the assignment, this is a great opportunity to work in a wide-open space.

* Is it OK if it’s not “body-interactive”? Sure! For the record, there’s no requirement that your system respond to human (e.g. body) signals; you could examine, e.g. car traffic, birds, pets, machines, microscopic amoebas, etc. Consider an ‘ambient’ surveillance system that responds to the movements of pedestrians many yards away. Or an automobile traffic-monitoring system that uses a public video feed. Or a system which responds to signals from a turtle in a terrarium, a fish in a fishtank, or some ants in an antfarm. The core requirement is that your project consume real-time image data – responding to unpredictable signals.

Please give some consideration to the potential of your software to operate as a cultural artifact. Can it attain special relevance by responding in ways which address a real human need or interest?


Details about this Project’s deadlines and deliverables:
(Please use the “Project3” category tag for all related posts.)

  • For Wednesday, February 2: Looking Outwards. Identify 3 interesting projects that employ computer vision or camera-based interaction in some way. Be sure to look at projects made with Kinect as well as projects made without Kinect. On Wednesday I’ll present my own favorites and many historical examples. To get started, have a look at MediaArtTube’s “Mirror” playlist; there are hundreds of other good projects.
  • Monday, February 7: Project 3 Sketch due. Have prepared, in a blog post, a sketch (napkin drawing and descriptive paragraph) of your project idea(s). We’ll workshop these in small groups in class.
  • Monday, February 21: Project 3 due.

Kinect Resources:

  • Processing. Daniel Shiffman wrote the Kinect library for Processing; you can get the latest version of it here. He also did a nice writeup and basic tutorial on how to use the Kinect with Processing. Place the Kinect library in the ‘libraries’ folder in your Processing directory. (A minimal depth-viewing sketch appears after this list.)
  • OpenFrameworks. Stick with OF v.0.062 for now. Theo Watson and others (including our TA, Dan Wilcox) created the OpenFrameworks addon, ofxKinect, which you can get here. It is Mac-only at present.
  • Cinder. Build Cinder from the GitHub repo; instructions are here. Similar to OF addons, Cinder has CinderBlocks; get the Kinect CinderBlock here. Cinder and Kinect seem to work with Windows as well as Mac.
  • Max/MSP/Jitter. Jean-Marc Pelletier has created jit.freenect.grab, which works in Max5 on Mac OSX 10.5 and later.
  • Flash. Bindings have been made available at the as3kinect project. Note that Flash must connect to Kinect data over a socket.
  • TuioKinect. Martin Kaltenbrunner has created TuioKinect, which produces OSC messages from Kinect capture. This allows the Kinect to be used with virtually any other environment that can receive OSC (see the receiving sketch after this list). Tested under OSX 10.6 only.
  • OpenKinect. Wrappers are provided for many additional languages, including Python, C, JavaScript, and LISP.
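
For the Processing route mentioned above, here is a minimal depth-image viewer, modeled on the basic example that ships with Shiffman’s library. The method names (enableDepth(), getDepthImage(), and so on) follow that example; check the library’s bundled examples if a newer release has changed them.

    import org.openkinect.*;
    import org.openkinect.processing.*;

    Kinect kinect;

    void setup() {
      size(640, 480);
      kinect = new Kinect(this);
      kinect.start();
      kinect.enableDepth(true);  // we only need the depth stream here
    }

    void draw() {
      // display the 640x480 grayscale depth image
      image(kinect.getDepthImage(), 0, 0);
    }

    void stop() {
      kinect.quit();  // shut the camera down cleanly
      super.stop();
    }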
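
And for the TuioKinect route: since TuioKinect speaks plain OSC, any environment with an OSC library can listen in. The hypothetical receiving sketch below uses Processing with the oscP5 library, and assumes TuioKinect’s default TUIO port (3333) and the standard TUIO 1.1 cursor profile (/tuio/2Dcur “set” messages carrying a session id followed by normalized x/y coordinates); adjust these if your setup differs.

    import oscP5.*;

    OscP5 osc;
    float cx = -1, cy = -1;  // last cursor position, normalized 0..1

    void setup() {
      size(640, 480);
      background(0);
      // TuioKinect sends TUIO (OSC) messages on UDP port 3333 by default
      osc = new OscP5(this, 3333);
    }

    void draw() {
      noStroke();
      fill(0, 20);                 // translucent overlay so trails fade
      rect(0, 0, width, height);
      if (cx >= 0) {
        fill(255);
        ellipse(cx * width, cy * height, 12, 12);
      }
    }

    // oscP5 delivers every incoming OSC message here
    void oscEvent(OscMessage msg) {
      // TUIO 1.1 cursor profile: "set" messages carry session id, x, y
      if (msg.checkAddrPattern("/tuio/2Dcur")
          && msg.get(0).stringValue().equals("set")) {
        cx = msg.get(2).floatValue();
        cy = msg.get(3).floatValue();
      }
    }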

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.