Project 4: Cutback Remix – Deren Guler

by deren @ 12:31 am 27 March 2012

This is a new version of a project that I started a while ago but haven’t had time to really push through. It’s pretty pointless, but I am extremely amused by household objects that respond to you. Chambers Judd has a series of small projects about giving inanimate household objects emotions, such as the “antitouch lamp” that backs away from you and the sneezing radio that cleans itself. Other motivations to keep going include Blendie and Noisy Jello.

I set out to make a cutting board that responds to you when you cut on it. The first version used 4 piezo sensors under a layer of acrylic that played a tone programmed using the Arduino tone melody example. Basically, the more you cut on it, the more of the song you would play. Another version used the Auduino library, and homemade force sensors acted as input to a synth, allowing you to make crazy noises while you cut. I got some interesting feedback from the initial attempt. I thought that the cutting board should respond in pain to being cut on, the way a human would, but many people said they would prefer if it made a more pleasant noise. Then came the question of “what if the cutting board knew what was on it and responded accordingly?” I have yet to think of an easy way to sense raw meat vs. vegetables, or a really practical reason to try and figure it out, so if you have any ideas please let me know! (The only thing I would want to do is make a cutting board that knows when you are cutting a vegetable and then fills with blood-looking liquid, or squirts it at you.)
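
For reference, the tone-based first version can be sketched roughly like this (the pin numbers, threshold, and melody are assumed values for illustration, not the original ones):

// Rough sketch of the first tone-melody version (assumed pins and threshold).
// A piezo on A0 senses a cut; every hit above the threshold plays the next
// note of a short melody on a speaker attached to pin 8.
#include "pitches.h"   // note constants from the Arduino toneMelody example

const int piezoPin = A0;    // assumed analog pin for the piezo sensor
const int speakerPin = 8;   // assumed digital pin for the speaker
const int threshold = 100;  // assumed knock threshold (0-1023)

int melody[] = { NOTE_C4, NOTE_G3, NOTE_G3, NOTE_A3, NOTE_G3, NOTE_B3, NOTE_C4 };
int noteIndex = 0;

void setup() {
}

void loop() {
  if (analogRead(piezoPin) > threshold) {
    tone(speakerPin, melody[noteIndex], 200);  // play the next note for 200 ms
    noteIndex = (noteIndex + 1) % 7;           // the more you cut, the more of the song you hear
    delay(250);                                // crude debounce so one cut plays one note
  }
}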

So! In this version I changed the design for embedding the sensors (because the last one broke after less than an hour of use) and decided to try to interface with MAX/MSP. I used a Serial patch in MAX to read the analog input data from the piezo sensors in the cutting board. I also decided to try sticking the sensors in the wood, rather than sandwiching them between two layers of wood. I experimented with pouring epoxy into some milled-out channels in the wood with the sensors in the channel and found that they still had some sensitivity. I was pretty excited about this, because I think the resin-and-wood embedding mix can look really nice if you do it well. I’m still working on that part, but I think the latest version is not so bad. I also experimented with where to put the sensors, because I wanted to be able to differentiate between chopping and slicing.

Here is the Arduino code for detecting the different kinds of cuts with the previous board:

void setup() {
  // open the serial port that the MAX/MSP patch listens on
  Serial.begin(9600);
}

void loop() {
  // read the three piezo sensors embedded in the board
  int sensorValue1 = analogRead(A0);
  int sensorValue2 = analogRead(A1);
  int sensorValue3 = analogRead(A3);
  int senseTotal = sensorValue1 + sensorValue2 + sensorValue3;

  // big combined reading = pounding on the middle sensor
  if (senseTotal > 200) {
    Serial.print(sensorValue3, DEC);
    Serial.print(" ");
  }

  // end sensor reading = slicing
  if (sensorValue2 > 0) {
    Serial.print(sensorValue2, DEC);
    Serial.print(" ");
  }

  // remember this pass's readings; static so they survive into the next loop()
  // (not used anywhere else in this sketch yet)
  static int sensorValue1old, sensorValue2old, sensorValue3old;
  sensorValue1old = sensorValue1;
  sensorValue2old = sensorValue2;
  sensorValue3old = sensorValue3;
}

and for sending data to MAX/MSP:

The MAX patch is adapted from a tutorial by Jason Safir.

I have the slicing input playing a weird violin file, and the chopping playing some bongo drums. The nice part about the MAX/MSP setup is that you can easily switch out the sounds; maybe a child could mess around with that part while the mom plays the sounds as she is making dinner? This would require some Bluetooth action, but I think it is doable once I get a little more comfortable with serial communication and MAX/MSP.

Evan Sheehan | Project 4 | Silhouettes

by Evan @ 11:40 pm 26 March 2012

[vimeo https://vimeo.com/39245630]

My goal with this project was to create an interactive piece where the interaction was less participatory and more destructive. The notion being that the viewer would have to periodically leave the piece while it repaired itself. The plan was to have several little shadow spiders running around on the wall building their webs. As people walked by, the shadows they cast would destroy the webs and send the spiders scurrying. Not until the spiders had been left alone for a period of time would they gain the courage to return and rebuild their webs.

The code is available on Github.

Inspiration

[vimeo https://vimeo.com/38704159]

I can’t say this animated retelling of Little Red Riding Hood served as inspiration, exactly, since I discovered it after I’d settled on doing something with shadows. But I think it is illustrative of the other-worldliness shadows can take on. I wanted the piece to seem as if it were a glimpse into a world that existed beside the one in which we’re accustomed to living. Plus, I just think it’s a cool animation and wanted to share it.

[vimeo https://vimeo.com/22219563]

I had a lot of help with the visuals from this project by IDEO. I got the method for rendering the silhouettes directly from their code, with a few modifications. Their project also served as a model for creating the interactions between the Kinect data and the Box2d world.

Tests

[vimeo https://vimeo.com/39245036]

Step 1: Use the Kinect to interact with springs in Box2d. It took several tries to get a project up and running with both ofxKinect and ofxBox2d included. This was due either to a possible bug in Xcode or to my own misunderstanding of its interface to the build tools.

[vimeo https://vimeo.com/39245259]

Step 2: Construct a spider web from Box2d components. After getting the Kinect integrated with the Box2d code, I set about figuring out how to construct a breakable web of threads. Randomly constructed webs (as seen in the final video) worked, but didn’t hold their shape as well as I wanted. I tried this octagonal construction, which worked well in this test but less well in the subsequent test with the Kinect as input instead of the cursor.
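
The web itself boils down to Box2d bodies connected by soft, springy joints. A minimal sketch of one thread in plain Box2D 2.x (not the project’s actual ofxBox2d code; node size, stiffness, and damping are assumed values):

// One springy "thread" of web: two small dynamic nodes joined by a soft
// distance joint (Box2D 2.x, as bundled with ofxBox2d at the time).
#include <Box2D/Box2D.h>

b2Joint* makeThread(b2World& world, const b2Vec2& a, const b2Vec2& b) {
  b2BodyDef nodeDef;
  nodeDef.type = b2_dynamicBody;

  b2CircleShape shape;
  shape.m_radius = 0.05f;            // assumed node size

  nodeDef.position = a;
  b2Body* nodeA = world.CreateBody(&nodeDef);
  nodeA->CreateFixture(&shape, 1.0f);

  nodeDef.position = b;
  b2Body* nodeB = world.CreateBody(&nodeDef);
  nodeB->CreateFixture(&shape, 1.0f);

  b2DistanceJointDef jd;
  jd.Initialize(nodeA, nodeB, a, b); // rest length = initial distance
  jd.frequencyHz = 4.0f;             // assumed spring stiffness
  jd.dampingRatio = 0.5f;            // assumed damping
  return world.CreateJoint(&jd);
}

A web is then just many such threads chained between randomly placed (or octagonally placed) nodes.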

[vimeo https://vimeo.com/39245320]

Step 3: Break spiderwebs using data from the Kinect. The octagonal web construction was apparently too strong for me. No matter how I tuned the parameters, the web would either not break at all or fall apart instantly. Ultimately, I decided to go with the randomly generated webs.
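
The breaking itself comes down to a single tuned threshold. A common way to do it (again only a sketch; the container name and threshold are assumptions) is to check each joint’s reaction force every physics step and destroy any thread that is being pulled harder than the threshold:

// Destroy any web joint whose reaction force exceeds a tuned threshold.
// 'webJoints' and 'maxForce' are assumed names/values.
#include <Box2D/Box2D.h>
#include <cstddef>
#include <vector>

void breakOverstretchedThreads(b2World& world,
                               std::vector<b2Joint*>& webJoints,
                               float maxForce, float timeStep) {
  float invDt = 1.0f / timeStep;
  for (std::size_t i = 0; i < webJoints.size(); ) {
    if (webJoints[i]->GetReactionForce(invDt).Length() > maxForce) {
      world.DestroyJoint(webJoints[i]);        // the thread snaps
      webJoints.erase(webJoints.begin() + i);  // stop tracking the broken joint
    } else {
      ++i;
    }
  }
}

Setting maxForce too low makes the web tear at the slightest shadow; setting it too high makes it effectively unbreakable, which matches the all-or-nothing behavior described above.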

Kaushal Agrawal & Eli Rosen | Project 4 | Wall Fish

by kaushal @ 6:46 pm

Wall Fish

Wall Fish is a collaborative project between Kaushal Agrawal and Eli Rosen.

Inspiration
[vimeo=vimeo.com/1007230 width="640" height="400"]
[youtube http://www.youtube.com/watch?v=IvbgAmwEX_A]

The Project

[youtube http://www.youtube.com/watch?v=JSWBENZBFqE&w=640&h=360]

Concept
We decided on the concept from the beginning. We envisioned a wall projected seascape that would become activated by the presence of a person. We wanted to support both passive and active engagement with the installation. Each person that moves past the installation is pursued by a school of fish. If the person is in a hurry they may not even notice their contribution to the seascape, as their fish “shadow” follows behind them. A more curious individual can engage actively with their school of fish. Standing in front of the aquarium environment allows your school of fish to catch up. They circle curiously around you but are startled by sudden changes in direction. Touching the seascape generates a small electric shock. Any nearby fish will be electrocuted. The escaping fish will flee from the electricity. Of course, fish have no memory. They’ll soon return to your side so you can shock them again.

The Technical Stuff
The wall fish installation uses Processing and the Kinect to accomplish blob tracking and touch detection using OpenCV. A big thanks to Asim Mittal, who helped enormously with a nice piece of code for detecting touch with the Kinect. Eli worked on creating the art, the fish behaviors, and the interactions. The behaviors and interactions were developed on the computer in response to mouse movements; Eli adapted a flocking algorithm by Daniel Shiffman. Meanwhile, Kaushal developed the framework for tracking multiple people’s movement across the installation and for detecting a touch, using Shiffman’s openKinect library. Kaushal then adapted Eli’s code, replacing mouse movement with blob tracking and the mouse click with a touch on the projected display.
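
The follow-and-flee behavior is essentially Reynolds-style steering, which is what Shiffman’s flocking example builds on. A rough illustration in C++ (the actual project is written in Processing; every name and constant here is an assumption):

// Seek/flee steering in the Reynolds/Shiffman style. Illustration only:
// the real fish use Shiffman's Processing flocking code, with blob positions
// from the Kinect standing in for the mouse.
#include <cmath>

struct Vec2 { float x, y; };

static Vec2 sub(Vec2 a, Vec2 b) { return {a.x - b.x, a.y - b.y}; }
static float len(Vec2 v) { return std::sqrt(v.x * v.x + v.y * v.y); }
static Vec2 withLength(Vec2 v, float s) {
  float l = len(v);
  return (l == 0) ? Vec2{0, 0} : Vec2{v.x / l * s, v.y / l * s};
}

// Steering force that pulls a fish toward the tracked person, or pushes it
// away when a nearby touch ("shock") is detected.
Vec2 steer(Vec2 fishPos, Vec2 fishVel, Vec2 person,
           bool shocked, float maxSpeed, float maxForce) {
  Vec2 desired = withLength(sub(person, fishPos), maxSpeed);  // head for the person
  if (shocked) desired = {-desired.x, -desired.y};            // flee instead
  Vec2 force = sub(desired, fishVel);                         // steering = desired - velocity
  if (len(force) > maxForce) force = withLength(force, maxForce);
  return force;
}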

Challenges
One of the challenges was detecting touch on a wall surface with a skewed perspective, which required calibration between the Kinect and the projection. Because we decided not to use OpenNI for detecting people, it was difficult to track multiple people in cases of collision and overlap, so we ended up using a probabilistic approach that matches each blob to the nearest blob from the previous frame.
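
A minimal sketch of that matching step (names and the distance cutoff are assumptions; the real version lives in the Processing sketch):

// Match one tracked person to the nearest newly detected blob, rejecting
// matches that jump farther than 'maxJump'. Names are assumed.
#include <cmath>
#include <cstddef>
#include <vector>

struct Point { float x, y; };

static float dist(Point a, Point b) {
  return std::hypot(a.x - b.x, a.y - b.y);
}

// Returns the index of the blob closest to the person's previous position,
// or -1 if none is close enough.
int matchToPrevious(const Point& previous,
                    const std::vector<Point>& blobs, float maxJump) {
  int best = -1;
  float bestDist = maxJump;
  for (std::size_t i = 0; i < blobs.size(); ++i) {
    float d = dist(previous, blobs[i]);
    if (d < bestDist) {
      bestDist = d;
      best = static_cast<int>(i);
    }
  }
  return best;
}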

Trying to accomplish an interesting set of behaviors for the school of fish was also a challenge. Working only with human position and speed, we wanted to create a personality for the fish that was both curious and tentative. In order to create this effect we experimented with a number of behaviors and interactions.

Eli Rosen and Kaushal Agrawal – Project 4 – Wall Fish

by eli @ 6:44 pm

Wall Fish

Wall Fish is a collaborative project between Kaushal Agrawal and Eli Rosen.

Inspiration
[vimeo=vimeo.com/1007230 width="640" height="400"]
[youtube http://www.youtube.com/watch?v=IvbgAmwEX_A]

The Project

[youtube http://www.youtube.com/watch?v=JSWBENZBFqE&w=640&h=360]

Concept
We decided on the concept from the beginning. We envisioned a wall projected seascape that would become activated by the presence of a person. We wanted to support both passive and active engagement with the installation. Each person that moves past the installation is pursued by a school of fish. If the person is in a hurry they may not even notice their contribution to the seascape, as their fish “shadow” follows behind them. A more curious individual can engage actively with their school of fish. Standing in front of the aquarium environment allows your school of fish to catch up. They circle curiously around you but are startled by sudden changes in direction. Touching the seascape generates a small electric shock. Any nearby fish will be electrocuted. The escaping fish will flee from the electricity. Of course, fish have no memory. They’ll soon return to your side so you can shock them again.

The Technical Stuff
The wall fish installation uses Processing and the Kinect to accomplish blob tracking and touch detection using OpenCV. A big thanks to Asim Mittal, who helped enormously with a nice piece of code for detecting touch with the Kinect. Eli worked on creating the art, the fish behaviors, and the interactions. The behaviors and interactions were developed on the computer in response to mouse movements; Eli adapted a flocking algorithm by Daniel Shiffman. Meanwhile, Kaushal developed the framework for tracking multiple people’s movement across the installation and for detecting a touch, using Shiffman’s openKinect library. Kaushal then adapted Eli’s code, replacing mouse movement with blob tracking and the mouse click with a touch on the projected display.

Challenges
One of the challenges was detecting touch on a wall surface with a skewed perspective, which required calibration between the Kinect and the projection. Because we decided not to use OpenNI for detecting people, it was difficult to track multiple people in cases of collision and overlap, so we ended up using a probabilistic approach that matches each blob to the nearest blob from the previous frame.

Trying to accomplish an interesting set of behaviors for the school of fish was also a challenge. Working only with human position and speed, we wanted to create a personality for the fish that was both curious and tentative. In order to create this effect we experimented with a number of behaviors and interactions.

Mahvish Nagda – Project 4 Proposal – Tangible Feedback

by mahvish @ 8:54 am 20 March 2012

For this project, I wanted to explore something wearable with haptic feedback and how that could interact with the Kinect.

My idea right now is to create a Kinect hadouken with haptic feedback using a glove, but ideally I’d want it to be more of a shirt/dress and make the interaction richer in some way.

A quick Google search showed me that I wasn’t the first to think of this:
Hug Shirt Site
[youtube http://www.youtube.com/watch?v=VHp8XcSaRTs]

Haptic Glove to Help Blind Navigate
Audio + Haptic

Hadouken:

Madeline Gannon – Looooooking

by madeline @ 8:26 am

Treachery of Sanctuary | Chris Milk, Creator’s Project
[youtube http://www.youtube.com/watch?v=_2kZdl8hs_s?rel=0&w=560&h=315]

Ice Angel | Dominick Harris and Cinimod Studio
[vimeo 37726421]

Nir Rachmel | Looking Outwards | Interactive

by nir @ 12:23 am

Color Gun

http://youtu.be/iBVsIf1XxZ8?t=6s

This is a pretty cool project done in another interactive arts class, back in 2007. What I like about this project is the element of surprise. When the student was about to pull the trigger at his head in front of the projector, I didn’t know what to expect. Then, when he pulled it and the splash of color “sprayed” out of his head, I laughed, just like the crowd in the classroom. When an artifact like that triggers a spontaneous reaction from people, I think it hits the spot and does exactly the right thing, as simple as it is (the simpler the better!).

SXSW 2012 Interactive Movie winner!

This “movie” is one of the SXSW 2012 winners. It’s pretty weird, and you interact with it using the mouse. What I like about it is that you are not given any instructions whatsoever, or any feedforward as to what you are supposed to do. But once you start clicking, you interact with the shapes (and later this funny figure) and get a lot of different feedback, both audio and visual, that guides you through this “movie”. I could not make any sense of what exactly is going on there, but it is pretty cool! Too bad it’s using Flash.

Here’s the link http://blabla.nfb.ca/.

Aquatypes

Another winner of the SXSW 2012 awards. I really liked this one: users send text messages, and those are translated into fictional creatures according to predetermined rules, which are then displayed on the screen. What I like about this piece is the simplicity of interaction: everyone can do it with their phone. In addition, I like the artists’ attitude in saying that this is a metaphor for words in general: once the words are out of your mouth, they are independent in a way and act on their own.

Joe Medwid – Looking Outwards – 4

by Joe @ 10:49 pm 6 March 2012

Zombie Frog Drum Kit

[vimeo=http://vimeo.com/31923751]

The name… says it all? A MIDI drum kit is used to coordinate electrical shocks directed at the bisected bodies of preserved frogs. Morbid? Sure. An ingenious interaction? Undoubtedly.

Rope Revolution

[vimeo=http://vimeo.com/26217095]

A collaboration between the MIT Media Lab and the Harvard School of Design led to this little gem, a series of video games based on rope-related activities throughout the world. In the age of Kinect and Wii, it’s refreshing to see an input method so thoroughly grounded in a tactile medium.

Spyro The Dragon: Skylanders

[vimeo=http://vimeo.com/30858589]

You need to fast-forward quite a bit to get the point of this clever video game / real life mashup. The gimmick here is that you can buy physical action figures which are then “Transported” into a video game, unlocking the figurine as a playable in-game avatar. Not only is it an ingenious marketing strategy, it redefines the concept of “downloadable content” and playfully blurs the line between game and reality. If I were ten years younger, I would be eating this stuff up.
