Category Archives: Final Project

MacKenzie Bates

12 May 2014

Finger Launchpad


Tweet:

Launch your fingertips at your opponent. Think Multi-Touch Air Hockey. A game for MacBooks.

________________________________________________________________________________

Blurb:

Launch your fingertips at your opponent. Think Multi-Touch Air Hockey. Using the MacBook’s touchpad (which is the same as that of an iPad), use up to 11 fingers to try to lower your opponent’s health to 0. Hold a finger on the touchpad for a second; once you lift it, it is launched in the direction it was pointing. When fingertips collide, the bigger one wins. Large fingertips do more damage than small ones. Skinny fingertips go faster than wide ones. Engage in the ultimate one-on-one multi-touch battle.
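As a rough illustration of those rules (and nothing more), here is a toy sketch in C++; the names, constants, and formulas are all invented for illustration and are not the game’s actual code.

// Toy sketch of the size rules described above; constants and formulas
// are made up, not taken from the game.
struct Fingertip {
  float width, height;                    // dimensions of the touch ellipse
  float area() const { return width * height; }
};

// When fingertips collide, the bigger one wins.
const Fingertip& winner(const Fingertip& a, const Fingertip& b) {
  return a.area() >= b.area() ? a : b;
}

// Large fingertips do more damage; skinny fingertips travel faster.
float damage(const Fingertip& f)      { return 0.1f * f.area(); }
float launchSpeed(const Fingertip& f) { return 500.0f / f.width; }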

________________________________________________________________________________

Gameplay Video:

________________________________________________________________________________

Photos:

________________________________________________________________________________

Narrative:

In Golan’s IACD studio, he told me all semester that I would get to make a game. Then the final project came around, and it was time to make one. But what game? I was paralyzed by the possibilities, and by the fear that, after a semester of anticipation, I wouldn’t make a game that lived up to my or Golan’s expectations.

After we talked about what I should base my game on, Golan gave me a tool made by a previous IACD student that makes it easy to capture the multi-touch interactions that occur on a MacBook trackpad. That meant I could build a multi-touch game without jumping through hoops to make it on a mobile device.

So I sat there pondering what game to make with this technology, and the ideas instilled in me by Paolo’s Experimental Game Design – Alternative Interfaces course popped into my mind. If a game is multi-touch, then it should truly be multi-touch at its core: using multiple fingers at once should be central to gameplay. The visuals should be simple and minimalist; there is no need for some random theme that masks the game. This is the game I came up with, and it is a combination of what I have learned from Paolo and Golan. I think it might be the best-designed game I have made yet, and it is certainly the one I am most proud of having made.

________________________________________________________________________________

Links:

View/Download Code @: GitHub
Download Game @: MacKenzie Bates’ Website
Download SendMultiTouches @: Duncan Boehle’s Website
Read More About Game @: MacKenzie Bates’ Website

Austin McCasland

12 May 2014

Abstract:

Genetically Modified Tree of Life is an interactive display for the Center for Postnatural History in Pittsburgh. “The PostNatural refers to living organisms that have been altered through processes such as selective breeding or genetic engineering.” [www.postnatural.org]

Model organisms are the building blocks of these altered organisms, which are also known as Genetically Modified Organisms (GMOs).

The app shows the tree of life ending in every model organism used to make these GMOs, and it allows people to select organisms to read the stories behind them.

 

Description:

History museums are a fun and interesting avenue for people to experience things that existed long ago. If people want to experience things that have happened more recently, however, there is one outlet: the Center for Postnatural History. “The PostNatural refers to living organisms that have been altered through processes such as selective breeding or genetic engineering.” [www.postnatural.org] Children’s imaginations light up at the prospect of mammoths walking the earth, or terrifyingly large dinosaurs from millions of years ago, but today is no less exciting. Mutants roam the earth, large and small, some ordinary and some fantastic.

 

Take, for example, the BioSteel Goat. These goats have been genetically modified with spider genes so that spider-silk fibers are produced in their milk. The goats are milked, and that milk is processed to yield large amounts of incredibly strong fiber, stronger by weight than steel.

The Genetically Modified Tree of Life is an interactive display that I created for the Center for Postnatural History under the advisement of Richard Pell. In its final form, the app will be an interactive installation on a touch screen that lets visitors come up and learn more about particular genetically modified organisms in a fun and informative way. The app visualizes the tree of life as seen from the perspective of genetically modified organisms, showing the genetic path of every model organism from the root of all life to the modern day in the form of a tree. These model organisms’ genes are what scientists use to create all genetically modified organisms, as they are representative of a wide array of genetic diversity. Visitors to the exhibit can drag around the tree, mixing up the branches of the model organisms, and can select individual genetically modified organisms from the lower portion of the screen to learn more about them. These entries are pulled from the Center for Postnatural History’s database. The objective of the piece is to be educational and fun in its active state, and visually attractive in its passive state.
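As a rough sketch of the underlying data, the tree could be stored as simple nodes running from the root of life down to the model organisms, with the GMO stories attached at the leaves. The structure below is an assumption for illustration, not the app’s actual code.

#include <string>
#include <vector>

// Illustrative node type: internal nodes are taxonomic groups, leaves are
// model organisms carrying the stories of the GMOs derived from them.
struct TaxonNode {
    std::string name;                     // e.g. "Mammalia" or a model organism
    std::vector<TaxonNode> children;      // empty for a model organism (leaf)
    std::vector<std::string> gmoStories;  // database entries shown on selection
};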

 

Tweet:

Visualization of the tree of life as seen by GMOs.

 


Spencer Barton

29 Apr 2014

Looking Glass: Little Owl Lost

Young readers bring storybook characters to life through the Looking Glass.

I wanted to explore augmented storytelling, so I created a device that adds content to a book. The reader guides this ‘magnifying glass’ device over the pages of a picture book. Animations appear on the display based on where the magnifying glass is on the page. These animations add to the content of the story and let the reader explore new interactions.

I used the book Little Owl Lost by Chris Haughton.

[Image: Little Owl Lost]

I used the main character, Owl, as the focal point for the animations.

[Image: Owl sleeping animation]

One of the animations is triggered as the magnifying glass device is brought to the correct position:

[Image: animation triggered by the magnifying glass]

Here is another example with before and after:

[Images: before and after an animation]

These are some of the animations for Little Owl Lost.

[Images: Owl animation frames – owl, owlCry, owlFall]

How it works

Hardware Specs

This project is composed of only a few parts. Control of the interface happens through the Arduino. The OLED screen has its own processor and an SD card that stores all of the animations. The two processors communicate via serial. There are also three Hall effect sensors, an on/off switch, and batteries.

Prior tests of the OLED are in this previous post.

[Image: the Arduino]

OLED Display

The OLED (organic LED) display comes from 4D Systems. I used their uTOLED_20_G2 display, which is no longer in production. Animations were loaded as GIFs onto an SD card that lived on the display and were then triggered via the display’s supported serial interface.
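To make that link concrete, here is a minimal sketch of how the Arduino side might trigger an animation over serial. The command byte, pins, and baud rate are invented for illustration; the actual 4D Systems protocol is documented by the vendor.

#include <SoftwareSerial.h>

// Assumed wiring: Arduino pin 10 to display TX, pin 11 to display RX.
SoftwareSerial displaySerial(10, 11);

// Hypothetical command byte; the real protocol differs.
const byte CMD_PLAY_ANIMATION = 0x01;

// Ask the display to play the GIF stored at the given index on its SD card.
void triggerAnimation(byte animationId) {
  displaySerial.write(CMD_PLAY_ANIMATION);
  displaySerial.write(animationId);
}

void setup() {
  displaySerial.begin(9600);  // assumed baud rate
}

void loop() {
  // In the real device an animation plays when a tag is read,
  // e.g. triggerAnimation(tagId);
}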

Hall Effect Sensors and Magnet Tags

I use Hall effect sensors and magnets to detect where the device is in the book. I created a series of tags, each consisting of a group of three magnets arranged in an L. Each magnet can have either a positive or negative polarity facing upwards, which gives a total of 8 unique tag combinations. The L shape of the tag enables me to determine orientation.

[Image: magnet tags]

I then placed the tags inside the front and back cover of a book. The magnetic field can be detected through multiple pages.

[Image: tags inside the book cover]

I use the Hall effect sensors to measure magnetic polarity. There are three sensors, one for each of the three magnets in a tag. The sensors are highly accurate and only respond when they are directly over the magnets.
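Here is a minimal sketch of how the three polarity readings could be packed into a 3-bit tag ID. It assumes analog Hall effect sensors that read near a midpoint with no magnet present; the pins, midpoint, and deadband are illustrative, not the calibrated values from the actual device.

// Sketch of tag decoding under assumed analog sensors: above the midpoint
// means one polarity, below it the other, near it means no magnet.
const int SENSOR_PINS[3] = {A0, A1, A2};
const int MIDPOINT = 512;  // assumed no-field reading (calibrated in practice)
const int DEADBAND = 50;   // readings this close to the midpoint mean no magnet

// Pack the three polarities into a 3-bit tag ID (0-7); -1 means no tag.
int readTagId() {
  int id = 0;
  for (int i = 0; i < 3; i++) {
    int v = analogRead(SENSOR_PINS[i]);
    if (v > MIDPOINT + DEADBAND) {
      id |= (1 << i);               // one polarity contributes a 1 bit
    } else if (v >= MIDPOINT - DEADBAND) {
      return -1;                    // no magnet under this sensor
    }                               // the opposite polarity contributes a 0 bit
  }
  return id;
}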

[Image: Hall effect sensors]

Placement Magnet

To help the reader find the animations and trigger them at the correct locations, I added larger placement magnets to both the book and the magnifying glass object. These magnets hold the display in place while the animation plays.

[Image: placement magnet and sensor]

 

Flaws

The most glaring flaw of the current design is that the Looking Glass never actually knows which page it is on. The magnetic field passes through all of the pages, so the same tag reads identically no matter which page the Looking Glass is actually over.

A solution would involve additional sensors. For example, color sensors could sample the colors on the current page and take an educated guess as to which page the Looking Glass is over. I did test basic color sensing but did not get far enough with this project to add that feature.
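As a sketch of what that educated guess could look like, a nearest-neighbor match against stored per-page reference colors would work; everything below (types, table, page count) is an assumption rather than tested project code.

// Sketch of page guessing by nearest-neighbor color matching.
struct Color { int r, g, b; };

const int NUM_PAGES = 8;      // assumed page count
Color pageColors[NUM_PAGES];  // reference colors captured once per page

// Squared RGB distance (long avoids 16-bit int overflow on AVR).
long colorDistance(const Color &a, const Color &b) {
  long dr = a.r - b.r, dg = a.g - b.g, db = a.b - b.b;
  return dr * dr + dg * dg + db * db;
}

// Guess the page whose reference color is closest to the live reading.
int guessPage(const Color &reading) {
  int best = 0;
  long bestDist = colorDistance(reading, pageColors[0]);
  for (int i = 1; i < NUM_PAGES; i++) {
    long d = colorDistance(reading, pageColors[i]);
    if (d < bestDist) { bestDist = d; best = i; }
  }
  return best;
}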

Software

All of my code lives on GitHub. The main Arduino file is StoryBoard.ino. Connecting to the Arduino via USB enables a calibration mode for the sensors.

Feedback

  • Comments
    • Form factor
      • Ideally the box would be smaller
      • Hallmark cards may be a good model
    • Leah Buechley at MIT has good examples of similar work
    • Jie Qi – http://technolojie.com/circuit_sketchbook/
    • Story Clip – http://highlowtech.org/?p=2923
    • Living Wall – http://highlowtech.org/?p=27
  • Alternate idea
    • Choosing where you go?
    • What about board and card games?
    • What if it had a wireless transmitter so you don’t know what it will show?
    • What about surprise?

Based on this feedback, the project has many possible future directions. As a first prototype it is fine, but further iterations will need to be smaller. That is well within the realm of possibility, especially if I create a custom circuit board. I would also like to add audio. If further prototypes can be made more robust, I hope to make the Looking Glass available to the Carnegie Library of Pittsburgh.

Spencer Barton

28 Mar 2014

Looking Glass: Idea and Update

I want to explore non-linear storytelling. The reader will guide a character through the story using a display. The character will know where it is on the page and behave accordingly.

The user will control a see-through display that shows the character. The display will know where it is over the page so that the character can be animated to interact with the page.


OLED Display Tests

I got animations working on the OLED display; more details on this piece of hardware are in a separate post.

[Photo: OLED display test]

In this case the character is the caterpillar from The Very Hungry Caterpillar by Eric Carle.

Moving Forward

The key details at this point are:

  • Localization – I’ve got a few ideas
    • Capacitive plate under a page with learning
    • Microphones attached to page to listen for movement with learning
    • Camera mounted above to track
    • IR grid/ id tags on the page
    • Look at color beneath display
  • The story – I can use an existing story or create a new one. I am able to develop a story idea but would not be able to create a story graphically. I need to talk to others in the class and see if anyone would be interested in working on this part of the project with me.
  • Building the module piece – The current OLED display is flimsy and uninteresting. The reader deserves to hold a more interesting object. I need to create an object that relates to the reading process and contains the display, a processor, batteries, and sensors; it should be a nice hand-sized object. As a default, I will 3D print an object with a form similar to a computer mouse.

I am currently exploring the color detection option. With two or more color sensors it could be possible to determine exact location. That said, the sensors are noisy, so detection will likely happen at the level of regions, for example determining that the caterpillar is over the leaf or over a blank part of the page. The key advantage of the color sensors is that the display can become a completely independent object. All of the other ideas involve modifying the book or setting up external hardware. A self-contained display lets the reader still engage with the book in a semi-natural manner; external hardware makes the reading situation more contrived and forces the reader to conform to my project set-up.

This project may be nice for the Pittsburgh Children’s Museum if ruggedized enough.