Andre Le

16 Mar 2014

I’m still working out the details for my final project, but I’ve always been fascinated by the ability to gain “superpowers” from technology — for example, the ability to perceive something that coexists with us in the real world but is undetectable by human senses.

The following projects have inspired me to see how else we can map the invisible world, such as electromagnetic fields, radiation, or air quality. What if we used the Oculus Rift and a sensor array to map, overlay, and experience real-time sensor data from around the world?

What can these technologies tell us about ourselves? From a quantified-self approach, what if a wearable heart rate or galvanic skin response sensor could detect your stress or excitement level and relay it to your Pebble watch?

Does knowing this undetectable information change your behavior? Does the behavior change last even without the augmentation? Is it possible for wearables to re-wire our brains and act as extensions to our bodies?
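As a rough illustration of the quantified-self idea above, a simple sketch might combine heart rate and galvanic skin response into a single stress estimate before relaying it to a watch. Everything here — the function name, the baseline values, and the equal weighting of the two signals — is a hypothetical assumption for illustration, not a validated stress model:

```python
def stress_score(heart_rate_bpm, gsr_microsiemens,
                 resting_hr=60.0, baseline_gsr=2.0):
    """Estimate stress in [0, 1] from heart rate and skin conductance.

    Assumes per-user baselines (resting_hr, baseline_gsr) are known;
    deviations above baseline are normalized and averaged.
    """
    # How far each signal sits above its baseline, as a fraction of baseline.
    hr_component = max(0.0, (heart_rate_bpm - resting_hr) / resting_hr)
    gsr_component = max(0.0, (gsr_microsiemens - baseline_gsr) / baseline_gsr)
    # Equal weighting, clamped to [0, 1].
    return min(1.0, 0.5 * hr_component + 0.5 * gsr_component)


# At rest, the score is zero; an elevated reading raises it.
print(stress_score(60, 2.0))   # 0.0
print(stress_score(90, 4.0))   # 0.75
```

In a real wearable, a score crossing some threshold would trigger a notification (e.g. via the Pebble's messaging API), which is exactly the kind of feedback loop the questions above are asking about.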

EIDOS
(http://timbouckley.com/work/design/eidos.php)

Eidos Vision is a project by Tim Bouckley, Millie Clive-Smith, Mi Eun Kim, and Yuta Sugawara that overlays visual echoes on top of the wearer’s vision, letting users perceive time visually and become aware of their temporal surroundings.


The Creators Project: Make it Wearable: Becoming Superhuman
(http://thecreatorsproject.vice.com/blog/make-it-wearable-part-4-becoming-superhuman)

The Creators Project has a great blog post on several other wearable technologies that allow people to sense the world in ways that were previously impossible. A notable example is Neil Harbisson, who compensates for his colorblindness with a device that maps color to sound.

Spider Sense Suit

The Spider Sense Suit is a collection of ultrasonic distance sensors and servos attached at various locations on the body to provide feedback on the proximity of objects in the wearer’s environment. The project was created by Victor Mateevitsi and showcased at the Augmented World Expo 2013, where I witnessed a live demo. Aesthetically, it wasn’t much to look at, but the possibilities were impressive. By mapping distance readings to pressure, the wearer’s body could quickly and automatically adapt to the stimuli around it.
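The core mapping the suit demonstrates — nearer objects press harder on the skin — can be sketched in a few lines. This is my own minimal interpretation, not Mateevitsi’s implementation; the function names and the 200 cm maximum range are assumptions for illustration:

```python
def distance_to_pressure(distance_cm, max_range_cm=200.0):
    """Map an ultrasonic distance reading to a normalized servo
    pressure in [0, 1]: the closer the object, the harder the press.

    Assumes a hypothetical sensor range of 0..max_range_cm.
    """
    # Clamp the raw reading into the sensor's usable range.
    d = min(max(distance_cm, 0.0), max_range_cm)
    # Invert: 0 cm -> full pressure (1.0), max range -> no pressure (0.0).
    return 1.0 - d / max_range_cm


# One reading per body-mounted sensor, e.g. chest, back, left arm, right arm.
readings_cm = [30.0, 200.0, 100.0, 0.0]
pressures = [distance_to_pressure(d) for d in readings_cm]
print(pressures)  # [0.85, 0.0, 0.5, 1.0]
```

Each pressure value would then drive the servo at that sensor’s location, so an object approaching from behind is literally felt before it is seen.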