Xing Xu – Project 4 – Pingpong Game Using QR Code

by xing @ 1:11 am 27 March 2012

This is a very simple project using QR codes (via zxing) in Processing. The zxing library is an open-source, multi-format 1D/2D barcode image-processing library implemented in Java, with ports to other languages. It is available at http://code.google.com/p/zxing/

It is a pingpong game, and it is easy to understand how to play. In the future, I could make it a multi-player game, add more visual and sound effects, and/or change the interaction design of the game a little bit. For example, the path of the ball might not be a straight line, or the shape of the “ball” could become a rectangle…
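The basic building block is reading a QR code out of a live camera frame with zxing. Below is a minimal Processing sketch of just that step (a sketch of the general approach, not the game's actual code); it assumes zxing's core and javase jars have been dropped into the sketch's code folder and that a webcam is attached.

import processing.video.*;
import java.awt.image.BufferedImage;
import com.google.zxing.*;
import com.google.zxing.common.HybridBinarizer;
import com.google.zxing.client.j2se.BufferedImageLuminanceSource;

Capture cam;
MultiFormatReader reader = new MultiFormatReader();
String lastMessage = "";

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();  // Processing 2.x; older versions start the capture automatically
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);

  // copy the current frame into a BufferedImage so zxing can read it
  cam.loadPixels();
  BufferedImage frame = new BufferedImage(cam.width, cam.height, BufferedImage.TYPE_INT_RGB);
  frame.setRGB(0, 0, cam.width, cam.height, cam.pixels, 0, cam.width);

  try {
    BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(new BufferedImageLuminanceSource(frame)));
    Result result = reader.decode(bitmap);
    lastMessage = result.getText();  // the decoded payload, e.g. a player id
  } catch (NotFoundException e) {
    // no QR code visible in this frame -- keep the previous reading
  }

  fill(255, 0, 0);
  text(lastMessage, 10, 20);
}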

[youtube http://www.youtube.com/watch?v=WhQXynBr40k&feature=youtu.be]

Project 4 – Cutback Remix – Deren Guler

by deren @ 12:31 am

This is a new version of a project that I started a while ago, but haven’t had time to really push through. It’s pretty pointless, but I am extremely amused by household objects that respond to you. Chambers Judd has a series of small projects about giving inanimate household objects emotions, such as the “antitouch lamp” that backs away from you and the sneezing radio that cleans itself. Some motivation to keep going comes from blendie and noisy jello.

I set out to make a cutting board that responds to you when you cut on it. The first version used four piezo sensors under a layer of acrylic that, when activated, played a tone programmed using the Arduino tone melody example. Basically, the more you cut on it, the more of the song you would play. Another version used the Arduino Auduino library, with homemade force sensors acting as input to a synth, allowing you to make crazy noises while you cut. I got some interesting feedback from the initial attempt. I thought that the cutting board should respond in pain to being cut on, the way a human would, but many people said they would prefer it to make a more pleasant noise. Then came the question of “what if the cutting board knew what was on it and responded accordingly?” I have yet to think of an easy way to sense raw meat vs. vegetables, or a really practical reason to try and figure it out, so if you have any ideas please let me know! (The only thing I would want to do is make a cutting board that knows when you are cutting a vegetable and then fills with blood-looking liquid, or squirts it at you.)

So! In this version I changed the design for embedding the sensors (because the last one broke after less than an hour of use) and decided to try to interface with MAX/MSP. I used a serial patch in MAX to read the analog input data from the piezo sensors in the cutting board. I also decided to try sticking the sensors in the wood, rather than sandwiching them between two layers of wood. I experimented with pouring epoxy into some milled-out channels in the wood with the sensors in the channels, and found that they could still sense. I was pretty excited about this, because I think the resin-and-wood embedding can look really nice if you do it well. I’m still working on that part, but I think the latest version is not so bad. I also experimented with where to put the sensors, because I wanted to be able to differentiate between chopping and slicing.

Here is the Arduino code for detecting the different kinds of cuts with the previous board:

void setup() {
  Serial.begin(9600);
}

void loop() {
  // read the three piezo sensors
  int sensorValue1 = analogRead(A0);
  int sensorValue2 = analogRead(A1);
  int sensorValue3 = analogRead(A3);
  int senseTotal = sensorValue1 + sensorValue2 + sensorValue3;

  // a large combined reading means a hard hit on the middle sensor: pounding/chopping
  if (senseTotal > 200) {
    Serial.print(sensorValue3, DEC);
    Serial.print(" ");
  }

  // any reading on the end sensor means slicing
  if (sensorValue2 > 0) {
    Serial.print(sensorValue2, DEC);
    Serial.print(" ");
  }
}

For sending the data on to MAX/MSP, the MAX patch is adapted from a tutorial by Jason Safir.

I have the slicing input playing a weird violin file, and the chopping playing some bongo drums. The nice part about the MAX/MSP setup is that you can easily switch out the sounds; maybe a child could mess around with that part and the mom could play the sounds while she is making dinner? This would require some Bluetooth action, but I think it is doable once I get a little more comfortable with serial communication and MAX/MSP.
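While getting comfortable with the serial side of things, a tiny Processing sketch can stand in for the Max serial patch and print which kind of cut it thinks it is seeing. This is only a hypothetical sanity-check sketch, not part of the project; the CHOP_THRESHOLD value is made up, and the port index will differ from machine to machine.

import processing.serial.*;

Serial port;
final int CHOP_THRESHOLD = 200;  // hypothetical split between chop and slice readings

void setup() {
  size(200, 200);
  println(Serial.list());  // pick the right port index for your machine
  port = new Serial(this, Serial.list()[0], 9600);  // same baud rate as the Arduino sketch
}

void draw() {
  String token = port.readStringUntil(' ');  // the Arduino prints space-separated readings
  if (token != null) {
    int reading = int(trim(token));
    if (reading > CHOP_THRESHOLD) {
      println("chop: " + reading);   // would trigger the bongo sample in Max
    } else {
      println("slice: " + reading);  // would trigger the violin sample in Max
    }
  }
}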

Evan Sheehan | Project 4 | Silhouettes

by Evan @ 11:40 pm 26 March 2012

[vimeo https://vimeo.com/39245630]

My goal with this project was to create an interactive piece where the interaction was less participatory and more destructive, the notion being that the viewer would have to periodically leave the piece while it repaired itself. The plan was to have several little shadow spiders running around on the wall building their webs. As people walked by, the shadows they cast would destroy the webs and send the spiders scurrying. Not until the spiders had been left alone for a period of time would they gain the courage to return and rebuild their webs.

The code is available on Github.

Inspiration

[vimeo https://vimeo.com/38704159]

I can’t say this animated retelling of Little Red Riding Hood served as inspiration, exactly, since I discovered it after I’d settled on doing something with shadows. But I think it is illustrative of the other-worldliness shadows can take on. I wanted the piece to seem as if it were a glimpse into a world that existed beside the one in which we’re accustomed to living. Plus, I just think it’s a cool animation and wanted to share it.

[vimeo https://vimeo.com/22219563]

I had a lot of help with the visuals from this project by IDEO. I got the method for rendering the silhouettes directly from their code, with a few modifications. Their project also served as a model for creating the interactions between the Kinect data and the Box2d world.

Tests

[vimeo https://vimeo.com/39245036]

Step 1: Use the Kinect to interact with springs in Box2d. It took several tries to get a project up and running with both ofxKinect and ofxBox2d included. This was due to a possible bug in Xcode, or else just my own misunderstanding of its interface to the build tools.

[vimeo https://vimeo.com/39245259]

Step 2: Construct a spider web from Box2d components. After getting the Kinect integrated with the Box2d code, I set about figuring out how to construct a breakable web of threads. Randomly constructed webs (as seen in the final video) worked, but didn’t hold their shape as well as I wanted. I tried this octagonal construction, which worked well in this test, but less well in the subsequent test with the Kinect as input, instead of the cursor.

[vimeo https://vimeo.com/39245320]

Step 3: Break spiderwebs using data from the Kinect. The octagonal web construction was apparently too strong for me. No matter how I tuned the parameters the web would either not break at all, or it would fall apart instantly. Ultimately, I decided to go with the randomly generated webs.
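The actual webs are built from ofxBox2d joints, but the tuning problem above comes down to choosing a break threshold for each thread. Below is a much simplified, non-Box2d illustration of that idea in Processing; the breakStretch value is hypothetical, standing in for the joint-strength parameters being tuned, and the mouse plays the role of the Kinect silhouette.

// Each thread is a midpoint particle tied to two fixed anchors; the mouse shoves the midpoint
// around, and a thread snaps once it is stretched past breakStretch times its rest length,
// much like destroying an over-stressed Box2d joint.
ArrayList<WebThread> web = new ArrayList<WebThread>();

void setup() {
  size(600, 400);
  for (int i = 0; i < 12; i++) {
    web.add(new WebThread(50 + i * 45, 50, 50 + i * 45, 350));
  }
}

void draw() {
  background(255);
  for (int i = web.size() - 1; i >= 0; i--) {
    WebThread t = web.get(i);
    t.update();
    if (t.broken()) web.remove(i);
    else t.display();
  }
}

class WebThread {
  PVector a, b, mid, vel;
  float restLen;
  float breakStretch = 1.6;  // hypothetical: snaps when ~60% longer than at rest

  WebThread(float x1, float y1, float x2, float y2) {
    a = new PVector(x1, y1);
    b = new PVector(x2, y2);
    mid = new PVector((x1 + x2) / 2, (y1 + y2) / 2);
    vel = new PVector();
    restLen = a.dist(b);
  }

  void update() {
    // push the midpoint away from the mouse (the "silhouette")
    if (dist(mouseX, mouseY, mid.x, mid.y) < 40) {
      mid.x += (mid.x - mouseX) * 0.2;
      mid.y += (mid.y - mouseY) * 0.2;
    }
    // spring pulling the midpoint back toward the thread's center
    PVector rest = new PVector((a.x + b.x) / 2, (a.y + b.y) / 2);
    PVector pull = PVector.sub(rest, mid);
    pull.mult(0.05);
    vel.add(pull);
    vel.mult(0.9);
    mid.add(vel);
  }

  boolean broken() {
    return a.dist(mid) + mid.dist(b) > restLen * breakStretch;
  }

  void display() {
    stroke(0);
    line(a.x, a.y, mid.x, mid.y);
    line(mid.x, mid.y, b.x, b.y);
  }
}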

Kaushal Agrawal & Eli Rosen | Project 4 | Wall Fish

by kaushal @ 6:46 pm

Wall Fish

Wall Fish is a collaborative project between Kaushal Agrawal and Eli Rosen.

Inspiration
[vimeo=vimeo.com/1007230 width="640" height="400"]
[youtube http://www.youtube.com/watch?v=IvbgAmwEX_A&width=640;height=auto;]

The Project

[youtube http://www.youtube.com/watch?v=JSWBENZBFqE&w=640&h=360]

Concept
We decided on the concept from the beginning. We envisioned a wall-projected seascape that would become activated by the presence of a person. We wanted to support both passive and active engagement with the installation. Each person that moves past the installation is pursued by a school of fish. If the person is in a hurry they may not even notice their contribution to the seascape, as their fish “shadow” follows behind them. A more curious individual can engage actively with their school of fish. Standing in front of the aquarium environment allows your school of fish to catch up. They circle curiously around you but are startled by sudden changes in direction. Touching the seascape generates a small electric shock. Any nearby fish will be electrocuted, and the escaping fish will flee from the electricity. Of course, fish have no memory. They’ll soon return to your side so you can shock them again.

The Technical Stuff
The Wall Fish installation uses Processing and the Kinect to accomplish blob tracking and touch detection using OpenCV. A big thanks to Asim Mittal, who helped enormously with a nice piece of code for detecting touch with the Kinect. Eli worked on creating the art, the fish behaviors, and the interactions. The behaviors and interactions were developed on the computer in response to mouse movements. Eli adapted a flocking algorithm by Daniel Shiffman. Meanwhile, Kaushal developed the framework for tracking multiple people’s movement across the installation and for detecting a touch, using Shiffman’s openKinect library. Kaushal then adapted Eli’s code, replacing mouse movement with blob tracking, and replacing the mouse click with a touch on the projected display.
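The flocking itself comes from Shiffman's code, so the snippet below is only a stripped-down, hypothetical illustration of the seek-then-flee idea in Processing: the mouse stands in for a tracked blob, and a mouse press stands in for the touch “shock”.

int NUM_FISH = 30;
PVector[] pos = new PVector[NUM_FISH];
PVector[] vel = new PVector[NUM_FISH];
int fleeUntil = 0;  // millis() timestamp until which the fish keep fleeing

void setup() {
  size(640, 360);
  for (int i = 0; i < NUM_FISH; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = new PVector(random(-1, 1), random(-1, 1));
  }
}

void draw() {
  background(0, 40, 80);
  PVector target = new PVector(mouseX, mouseY);
  boolean fleeing = millis() < fleeUntil;

  for (int i = 0; i < NUM_FISH; i++) {
    // steer toward the target, or directly away from it while fleeing
    PVector desired = PVector.sub(target, pos[i]);
    desired.normalize();
    desired.mult(fleeing ? -3 : 2);
    PVector steer = PVector.sub(desired, vel[i]);
    steer.limit(0.1);
    vel[i].add(steer);
    vel[i].limit(3);
    pos[i].add(vel[i]);

    noStroke();
    fill(255, 200, 0);
    ellipse(pos[i].x, pos[i].y, 10, 6);
  }
}

void mousePressed() {
  fleeUntil = millis() + 1500;  // the "electric shock": scatter for 1.5 seconds, then return
}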

Challenges
One of the challenges was detecting touch on a wall surface with a skewed perspective, which required calibration between the Kinect and the projection. Because we decided not to use OpenNI for detecting people, it was difficult to track multiple people in cases of collision and overlap, so we ended up probabilistically matching each detected blob to the nearest previous blob position.
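A rough sketch of that matching step (a hypothetical helper, not the actual tracking code): each tracked person keeps whichever newly detected blob centroid is closest to their previous position.

// Hypothetical helper: assign each tracked person the nearest blob centroid from the new frame,
// so identities persist between frames without OpenNI-style skeleton tracking.
PVector[] matchToPrevious(PVector[] previous, ArrayList<PVector> detected) {
  PVector[] updated = new PVector[previous.length];
  for (int i = 0; i < previous.length; i++) {
    PVector best = previous[i];        // if no blob is found, keep the old position
    float bestDist = Float.MAX_VALUE;
    for (PVector blob : detected) {
      float d = previous[i].dist(blob);
      if (d < bestDist) {
        bestDist = d;
        best = blob;
      }
    }
    updated[i] = new PVector(best.x, best.y);
  }
  return updated;
}

// called once per frame, e.g.:  people = matchToPrevious(people, blobCentroids);
// where blobCentroids is the (hypothetical) list of centroids coming out of the OpenCV step.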

Trying to accomplish an interesting set of behaviors for the school of fish was also a challenge. Working only with human position and speed, we wanted to create a personality for the fish that was both curious and tentative. In order to create this effect we experimented with a number of behaviors and interactions.


Luke Loeffler – Looking Outwards

by luke @ 8:56 am 22 March 2012

Here are a number of projects from robotic artist David Bowen. He often makes robotic sculptures that extend the capabilities of plants or animals and allow them to interact in different ways. I’m interested in extending the realm of interaction between different species.

Fly Blimps:

IR Drawing Device:

Networked bamboo:

David Bowen Portfolio

Looking Outwards

by blase @ 7:09 am

soak

Soak, Dye in light. by everyware (2011)

In this project, there is a large, elastic canvas. When an observer pokes it, a Kinect notices the distortion, and a GPU then calculates a pattern of paint “soaking” based on an accelerated cellular-automaton algorithm. It is touted as a new way to paint a canvas.
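The real piece runs an accelerated cellular automaton on the GPU and gets its pokes from the Kinect; the sketch below is only a very rough CPU-side illustration of the same family of algorithm in Processing, with the mouse standing in for a poke: each cell of an ink grid repeatedly relaxes toward the average of its neighbors, so color “soaks” outward from wherever you press.

float[][] ink;

void setup() {
  size(400, 400);
  ink = new float[width][height];
}

void draw() {
  // a mouse press stands in for a poke: deposit ink in a small patch
  if (mousePressed) {
    for (int dx = -3; dx <= 3; dx++) {
      for (int dy = -3; dy <= 3; dy++) {
        ink[constrain(mouseX + dx, 0, width - 1)][constrain(mouseY + dy, 0, height - 1)] = 255;
      }
    }
  }

  // cellular-automaton step: each cell moves toward the average of its four neighbors
  float[][] next = new float[width][height];
  for (int x = 1; x < width - 1; x++) {
    for (int y = 1; y < height - 1; y++) {
      float avg = (ink[x-1][y] + ink[x+1][y] + ink[x][y-1] + ink[x][y+1]) / 4.0;
      next[x][y] = lerp(ink[x][y], avg, 0.5);
    }
  }
  ink = next;

  // draw: darker where more ink has soaked in
  loadPixels();
  for (int x = 0; x < width; x++) {
    for (int y = 0; y < height; y++) {
      pixels[y * width + x] = color(255 - ink[x][y]);
    }
  }
  updatePixels();
}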

While I do think the paintings made with it look cool, I’m not sure that the person interacting with this artwork really has the fine-grained control over the result that it implies. I think this would be a cool project for someone who doesn’t want much control over the colors on screen but wants to make something cool, e.g. a kid, but I’m not sure someone who likes painting would really enjoy it. In that sense, it works as an art project, but maybe not as an interaction project.


The Poking Machine by Jasper van Loenen & Bartholomäus Traubeck (2012)

Continuing my poke-themed Looking Outwards, this project converts Facebook pokes (wait, is that still a thing?) into IRL pokes. The project consists of an ATtiny microcontroller programmed using the Arduino software, a servo (to move the poking nub), a battery, and a Bluetooth module inside a box. The Bluetooth module connects to an Android phone running Processing, a combination of platforms that is surely free of bugs.

The demo video, in the art-hacker aesthetic with upbeat music and lots of camera jumps focusing on the construction of the device, is nice to watch. However, I’m not sure it really shows us much about the design decisions, like the choice of the yellow lasercut box, which seems a bit unsightly. Also, I question the choice of platforms, and whether Facebook pokes are still part of the public consciousness.


Van Gogh’s Starry Night Interactive by Petros Vrellis (2012)

This interactive version of Starry Night uses openFrameworks to animate the famous Van Gogh painting. When a person pokes the painting, they deform it; the painting is animated as if each stroke were a particle in a fluid simulation. The multitouch tracking is made possible using the ofxKinect library.

While also mostly poke-themed, I think this project succeeds more than the other two projects because it’s simply really cool to look at. I think the animation brings out the iconic characteristics of the original painting (the long strokes), making it appear in new ways. I like it.

HeatherKnight – LookingOutwards5

by heather @ 8:56 am 20 March 2012

“No telepresence robots allowed”: How machines transform our relationships with each other

http://bleekercomics.com/?p=1251

The above comic was used as the title slide for a panel on telepresence robots at last week’s annual Int’l Conference on Human-Robot Interaction. Panelists included Leila Takayama from Willow Garage, a representative from VGo, and Hiroshi Ishiguro, the guy who created a robot that looks like himself.

“This is an opportunity to enhance the human identity,” said Ishiguro, speaking about having various doppelganger telepresence robots.

“Invisible Mercedes” A very shy vehicle.

[youtube=”http://www.youtube.com/watch?v=ZIGzpi9lCck”]

“This Amazing Device Just Made Wheelchairs Obsolete for Paraplegics” GIZMODO


http://gizmodo.com/5894489/segway+style-device-for-paraplegics-puts-wheelchairs-to-shame

AND FOR FUN…

“Quadrotors play James Bond Theme Song”

[youtube=”http://www.youtube.com/watch?v=_sUeGC-8dyk”]

Mahvish Nagda – Project 4 Proposal – Tangible Feedback

by mahvish @ 8:54 am

For this project, I wanted to explore something wearable with haptic feedback and how that could interact with the Kinect.

My idea right now is to create a Kinect hadouken with haptic feedback using a glove, but ideally I’d want it to be more of a shirt/dress and make the interaction richer in some way.

A quick Google search showed me that I wasn’t the first to think of this:
Hug Shirt Site
[youtube=www.youtube.com/watch?v=VHp8XcSaRTs]

Haptic Glove to Help Blind Navigate
Audio + Haptic

Hadouken:

Luci Laffitte – Project 4 – Looking Outwards

by luci @ 8:44 am

[youtube http://www.youtube.com/watch?v=aGsmFLCcDSw&w=560&h=315]

Make a Chemical Reaction

Create a Chemical Reaction is an interactive exhibit in the Science Storms wing of the Museum of Science and Industry in Chicago. Using specially-tagged pucks, visitors can grab atoms from the periodic table and combine them to cause chemical reactions.

 

 

[youtube http://www.youtube.com/watch?v=cbEKAwCoCKw&w=560&h=315]

Fun Theory – The World’s Deepest Bin

“To throw rubbish in the bin instead of onto the floor shouldn’t really be so hard. Many people still fail to do so. Can we get more people to throw rubbish into the bin, rather than onto the ground, by making it fun to do?”

 

 

Google Maps-enabled exploration:

State Parks- http://naturevalleytrailview.com/

Museums- http://www.googleartproject.com/
