John Horstman – Final Project

by jhorstma @ 4:09 am 10 May 2011

My final project is an interactive video installation named Twilight Conductor. By using arm gestures in front of a webcam, the audience can “conduct” a sunset, changing the color of the atmosphere and flinging stars up into the evening sky.



We had freedom to set our own course for the final project, so I chose to create a webcam video installation. Many of the interactive video installations we had seen in class really impressed me, such as Utterback’s Text Rain, Liquid Time and Untitled 5, and Levin’s Interstitial Fragment Processor. I liked their ability to draw people in with their immediate responsiveness and sense of playfulness. Though the Kinect’s depth camera offers a potentially deeper interactive experience, I wanted to create a piece that anyone could download and run on their local machine; many people have access to a webcam, but very few folks have a Kinect available.

The idea for conducting a sunset came from the book/movie “The Phantom Tollbooth.” The character Chroma the Great is in charge of conducting the world’s sunset each day, controlling the unique sounds and colors and placing all of the stars in the sky.



I began by coding the color changes of the sky based on motion in front of the camera, because that was the most complex part. I chose to capture the motion with optical flow for two reasons. First, it captures both the magnitude and direction of the motion in the camera feed; this was important both for changing the sky color in the places corresponding to the participant’s motion and for throwing the stars with the matching direction and velocity. Second, an optical flow function already ships with the OpenCV library used by the openFrameworks OpenCV add-on. The add-on did not include a wrapper for this function, however, and since I’m a fairly inexperienced openFrameworks programmer, Prof. Levin had to give me some guidance before I could use the function in my code.

My code identifies the motion in the camera feed, divides the screen into 20 bins from left to right, and totals the motion in each bin. The sky is drawn as a triangle fan, with 20 triangles drawn radially from an origin at the bottom center of the screen. The total motion in each horizontal bin is used to calculate the color change at the triangle fan vertices: greater total motion creates a darker blue, eventually darkening all the way to black.
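The binning and darkening step above can be sketched roughly as follows. This is a simplified standalone version, not the project’s actual code: the function names, the saturation constant, and the flat magnitude array standing in for the optical flow field are all my own illustrative choices.

```cpp
#include <algorithm>
#include <vector>

const int NUM_BINS = 20;  // vertical strips, matching the 20-triangle fan

// Which horizontal bin does pixel column x fall into?
int binIndex(int x, int width) {
    return std::min(x * NUM_BINS / width, NUM_BINS - 1);
}

// Total the per-pixel flow magnitudes into NUM_BINS left-to-right bins.
std::vector<float> binMotion(const std::vector<float>& flowMag,
                             int width, int height) {
    std::vector<float> bins(NUM_BINS, 0.0f);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            bins[binIndex(x, width)] += flowMag[y * width + x];
    return bins;
}

// Map a bin's accumulated motion to a darkening factor in [0, 1]
// (1 = fully darkened to black at the corresponding fan vertex).
float darkness(float binTotal, float saturationPoint) {
    return std::min(binTotal / saturationPoint, 1.0f);
}
```

Each frame, the darkness factor for a bin would then scale that vertex’s blue channel toward black.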

The stars in the sky are Box2D objects, created using the Box2D openFrameworks add-on. If the amplitude of the motion in the scene totals above a certain threshold, the code fires stars into the sky in the direction of the motion. Each star has mass and friction, so it fires from the origin with an initial velocity, then slows and stops to hang in the sky. The direction of the stars is determined by summing all of the optical flow vectors into a total motion vector. Each star’s radius is calculated from a randomization function, and the stars twinkle by randomizing the value of each star’s alpha channel. The stars slowly move across the sky by applying a rotation matrix to the star positions and incrementing the angle of rotation.
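The flow-summing, firing threshold, and rotation drift might look like this in isolation. It’s a hedged sketch with hypothetical names; the real project works through ofxBox2d bodies rather than bare structs.

```cpp
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Sum every optical-flow vector into one total motion vector.
Vec2 totalFlow(const std::vector<Vec2>& flow) {
    Vec2 sum{0.0f, 0.0f};
    for (const Vec2& v : flow) { sum.x += v.x; sum.y += v.y; }
    return sum;
}

// Fire a star only when the total motion magnitude exceeds a threshold;
// the total vector itself then seeds the star's launch velocity.
bool shouldFire(const Vec2& total, float threshold) {
    return std::hypot(total.x, total.y) > threshold;
}

// Rotate a star position about the fan origin by `angle` radians,
// drifting the stars slowly across the sky each frame.
Vec2 rotate(const Vec2& p, float angle) {
    float c = std::cos(angle), s = std::sin(angle);
    return { c * p.x - s * p.y, s * p.x + c * p.y };
}
```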

The ambient sounds in the scene are two WAV files created by the user offthesky at The Free Sound Project. One sound is attached to each audio channel, and the volume of each sound file is set in proportion to the amount of motion in each half of the screen (left and right). A WAV file of a chime noise is played upon the creation of each new star.



As mentioned earlier, Prof. Levin helped me to get the optical flow function working, which was critical to the success of the project.

I would have liked to change the color of the sky in a manner more similar to a real sunset, fading from light blue to red to dark blue, but I had difficulty controlling the RGB channels in a way that mimicked this behavior.

One of the big hurdles that surfaced at installation time was tuning the code to behave well in the installation environment. Several coefficients in the equations controlling the motion of the stars and the changing color of the sky had to be adjusted to create a pleasant, predictable interaction.

In its current state, the code crashes after a certain number of stars are fired into the sky, which was happening about every 3–5 minutes at the final project exhibition. I think this could be resolved by adding logic to destroy stars that fall outside the bounds of the screen; the Box2D add-on currently has a function for this, though it is not working as of this writing.
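The cleanup I have in mind is essentially the erase–remove idiom over the star list. A minimal sketch, with hypothetical types (in the real project each star wraps a Box2D body, which would also need to be destroyed in the physics world before the star is erased):

```cpp
#include <algorithm>
#include <vector>

struct Star { float x, y; };

// Is the star outside the screen, plus a little margin?
bool outOfBounds(const Star& s, float w, float h, float margin) {
    return s.x < -margin || s.x > w + margin ||
           s.y < -margin || s.y > h + margin;
}

// Remove every out-of-bounds star. With real ofxBox2d stars, the body
// would be destroyed here as well so the physics world doesn't fill up.
void cullStars(std::vector<Star>& stars, float w, float h, float margin) {
    stars.erase(std::remove_if(stars.begin(), stars.end(),
                    [&](const Star& s) { return outOfBounds(s, w, h, margin); }),
                stars.end());
}
```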


Future Revisions

To improve upon the project, I’d like to revisit the color calculation algorithm to make the sunset behave in a more realistic way.
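One way to get the light-blue-to-red-to-dark-blue fade without fighting the individual RGB channels would be to interpolate along a small list of keyframe colors indexed by a single "sunset progress" value. This is just a sketch of that idea; the colors and names are illustrative, not values from the project.

```cpp
#include <algorithm>
#include <vector>

struct Color { float r, g, b; };

// Linear interpolation between two colors.
Color lerp(const Color& a, const Color& b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Blend through an ordered list of keyframe colors
// (e.g. light blue -> red -> dark blue) as progress runs 0 -> 1.
Color sunsetColor(float progress, const std::vector<Color>& keys) {
    progress = std::max(0.0f, std::min(1.0f, progress));
    float scaled = progress * (keys.size() - 1);
    int i = std::min((int)scaled, (int)keys.size() - 2);
    return lerp(keys[i], keys[i + 1], scaled - i);
}
```

The per-bin motion totals would then only need to advance each vertex’s progress value, rather than steering three channels independently.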

There are a few features that I had wanted to add earlier, but ran out of time before the exhibition. For example, I was thinking about adding generative sound synthesis to the piece rather than just playing back WAV files; adding a more responsive audio dimension could create a richer interactive experience. I’d also like to add some physics to make the stars generate constellations; if I could get my hands on a data file with coordinates for the stars that appear in major constellations, I don’t think this would be too difficult.


Source code

Download the package

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2018 Interactive Art & Computational Design / Spring 2011