Weija-PlaceProposal

My vision for this project is to take “holographic” photos of objects that represent my room. The idea for my place was inspired by Golan’s lectures on the artists who made light art using long-exposure photography. I want to capture certain objects in my room that are “distinguishable,” so they can symbolize the room. The technique I want to use is to capture the objects with photogrammetry, render them in Unity, and then render their cross-sections the way the people are rendered in the Making Future Magic video on Vimeo.
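
As a rough sketch of what the slicing step could look like (in Python, outside Unity), here is a minimal example that bins a photogrammetry point cloud by height and writes one image per slice; the filename scan.xyz, the plain-XYZ export format, and the slice count are placeholder assumptions.

```python
# Minimal sketch: cut a photogrammetry point cloud into horizontal
# cross-sections, one image per slice (for the sliced "hologram" effect).
import numpy as np
import matplotlib.pyplot as plt

points = np.loadtxt("scan.xyz")            # N x 3 array of x, y, z (assumed export)
z_min, z_max = points[:, 2].min(), points[:, 2].max()
n_slices = 60                              # how many cross-sections to render

edges = np.linspace(z_min, z_max, n_slices + 1)
for i in range(n_slices):
    in_slice = (points[:, 2] >= edges[i]) & (points[:, 2] < edges[i + 1])
    xy = points[in_slice, :2]
    plt.figure(figsize=(4, 4))
    plt.scatter(xy[:, 0], xy[:, 1], s=1, c="black")
    plt.axis("equal")
    plt.axis("off")
    plt.savefig(f"slice_{i:03d}.png", dpi=150)   # one frame per cross-section
    plt.close()
```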

Ideally, once I get this process down, it should be easy to reproduce on multiple objects. It would be nicer to turn it into code so that it works like a black box, but given the time constraints I’ll probably just do it all by hand.

supercgeek-PlaceProposal

Virtual/Augmented/Mixed Reality Historical Exploration of Carnegie Mellon University

HistoryPin Android App

I’ve been itching to do something with mixed-reality methods where I can really focus on execution and craft in either VR or AR/MR. I’m not sure yet whether I’ll explore full presence with live video (i.e., HoloLens) or something where I pre-record a number of locations with 360 video, map them into VR (i.e., Rift/Vive), and then place the historical imagery on top. The mainstay of the idea is that I’ll mix historical photographs with VR/AR/MR exploration, with a time-rift effect to start and end viewing the images.

Historical Material

I’ve been really inspired by looking through plenty of the historical photos from around CMU; I definitely think I’ll have far more material than I actually have time to work with.

Reality Tear Effect

I’ve been looking into different ways to ‘tear’ into the historical layers of reality. I really liked the example from the New York Times 360 piece on Malcolm X, but it doesn’t directly address the issue of having a dynamic viewpoint (i.e., the imagery is painted and placed around a fixed camera position).



 

Other References

 

* Thanks to Austin Lee for the references.

mikob – Place Proposal

I have a couple of rough ideas in mind for the place project.

1. Map of password security questions

Password security questions often ask something related to “where” (e.g., Where were you when you had your first kiss? In what town or city did your mother and father meet?). These security questions are designed to be intimate, personal questions that would be hard for anyone else to guess, so I thought it would be interesting to make a simple application that creates a personal security map from the answers.
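
A minimal sketch of what that application could look like, assuming Python with the geopy and folium libraries; the questions and place answers below are placeholders.

```python
# Sketch of the "personal security map": geocode the place-based answers to
# security questions and pin them on a web map.
import folium
from geopy.geocoders import Nominatim

answers = {
    "In what town or city did your mother and father meet?": "Seoul, South Korea",
    "Where were you when you had your first kiss?": "Portland, Oregon",
}

geolocator = Nominatim(user_agent="security-map-sketch")
security_map = folium.Map(location=[20, 0], zoom_start=2)

for question, place in answers.items():
    loc = geolocator.geocode(place)
    if loc is None:
        continue  # skip answers that can't be geocoded
    folium.Marker([loc.latitude, loc.longitude], popup=question).add_to(security_map)

security_map.save("security_map.html")   # open in a browser to view the map
```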

2. Mini dollhouse of a room

We have all played with miniature dollhouses as children. Looking back at it now, I think it might be interesting to 3D scan my room or some other place and 3D print it, to preserve it as an artifact of a place I resided in.

3. In terms of capturing a unique place, La Hutte Royal is one of the most mesmerizing and fascinating “places” I’ve been to (and also one of the lesser known). I’m not sure what the best method to capture this place would be, but it’s definitely a great candidate to look into.

Bernie-PlaceProposal

For my Place project, my difficulty is that the robot arm can’t really go anywhere besides the studio.  So I have to somehow bring a place to the arm.

I think it would be interesting to have the arm generatively light-paint different “versions” of the northern lights. I could start by attaching a strip of about 8 RGB LEDs to the end of the arm, changing their colors and fading them in and out as the arm paints the northern lights.

My plan to start is to make an openFrameworks app that I can draw a squiggle in; the arm will then follow a diagonal trajectory along the squiggle, fading the top lights and twisting randomly, to create generative light paintings of the northern lights.
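
As a rough sketch of the trajectory-and-fade logic (in Python rather than openFrameworks, and independent of any particular robot arm): the squiggle points, the 8-LED strip, and the sine-based fade below are all placeholder assumptions.

```python
# Turn a drawn squiggle into arm waypoints with per-LED fade values.
import math
import random

squiggle = [(0.0, 0.0), (0.1, 0.3), (0.25, 0.2), (0.4, 0.5), (0.6, 0.45), (0.8, 0.7)]
NUM_LEDS = 8

waypoints = []
for i, (x, y) in enumerate(squiggle):
    t = i / (len(squiggle) - 1)          # 0..1 progress along the squiggle
    z = t                                 # diagonal rise: height follows progress
    twist = random.uniform(-30, 30)       # random wrist twist, in degrees
    # fade the whole strip in and out along the stroke; higher LEDs stay dimmer
    brightness = [max(0.0, math.sin(math.pi * t) * (1 - led / NUM_LEDS))
                  for led in range(NUM_LEDS)]
    waypoints.append({"pos": (x, y, z), "twist": twist, "leds": brightness})

for wp in waypoints:
    print(wp)
```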

Quan—PlaceProposal

I have not yet reached a definitive conclusion about which place I want to represent or the method I want to employ to portray it. I suppose the method should be decided based on the place itself, but I have a few ideas for both that I want to try out.

Place—

Physical Places:

  • Church/ University Center Chapel. I come here a lot and this room holds great meaning to me. I want to capture the essence of calmness and peace that this place gives me.
  • Emergency Room. I was recently at the ER and noticed that it has a very peculiar environment; the interactions that happen there seemed very odd, and I would like to capture that oddity in some way.
  • Margaret Morrison/ Design Studio. For obvious reasons, but I would probably focus on the competitive culture.
  • Room/ Drawers/ Bookshelf. I think my bookshelf is very important to me, and I think it would be interesting to understand certain underlying patterns that bring all these seemingly random books to the same collection.

Abstract “Places”:

  • Time. I think it would be interesting to see how we perceive time as a place. I think there could be an interesting abstraction when we visualize time as a volumetric space, with certain hours being more important than others, or even representing time as greater than just 24 hours.
  • Conversation. I wonder what conversation as a place would look like.
  • Broken Hand. I recently injured my hand, and I was able to snag the X-Ray files, and I wonder if there are any cool things I could do with those.

Methods—

  • Steel Wool Photography. I’ve done this in the past, and I wonder if I can expand its use to represent the space it’s performed in. The particles follow the trajectory of the swing, so if the swing is planned out, we could highlight or hide certain aspects of the physical space.
  • I am interested in creating a light pole similar to the one that visualized Wi-Fi signals.
  • I am interested in creating a 3D light painting, perhaps as a method of 3D annotation.
  • I’d be interested in taking a stab at doing something with Light-Field cameras.
  • I want to learn how to use computer vision as a medium; I don’t yet fully understand the extent of its capabilities.

sayers- Place Project Proposal

Ground Penetrating Radar in Cemeteries

When we first saw the ground-penetrating radar (GPR) in class, I knew that I wanted to experiment with it. GPR doesn’t just tell you about a place; it also tells you about the history of that place through the strata of the soil. It gives more context to a place by showing its evolution through objects buried underground. I think exploring this could be particularly interesting because we can see the objects spatially in relation to one another without disturbing them. While archaeology is interesting, one thing that is often lacking for me is the context of how things were found, and where, in relation to other historical objects found in similar areas.

One thing that I found particularly interesting during my research is that you can create 3D models from GPR. GPR usually produces a 2D slice of the ground by sending radar waves downward and recording how the reflections come back. By moving the radar slowly and horizontally, however, taking more scans and combining all the data, you can build a 3D model.
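
A minimal sketch of that stacking step, assuming Python/NumPy and that each 2D profile has been exported as its own array file (a placeholder assumption about the GPR software).

```python
# Combine parallel 2D GPR profiles (B-scans) into a 3D volume and pull out a
# horizontal "depth slice". Assumes every line has the same number of samples.
import glob
import numpy as np

# each file: a 2D array, rows = depth samples, cols = positions along the line
scan_files = sorted(glob.glob("line_*.npy"))
bscans = [np.load(f) for f in scan_files]

volume = np.stack(bscans, axis=0)         # (n_lines, n_depth_samples, n_positions)

depth_index = 120                          # pick one depth (in samples) to inspect
depth_slice = volume[:, depth_index, :]    # a map view at that depth
print(volume.shape, depth_slice.shape)
```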

I have two possible ideas that I would like to pursue with this in mind:

  1. Harmony Society Cemetery: As a child who grew up around Pittsburgh, and in particular near the area of Harmony, I learned the story of the Harmonites. They were a small religious group, started in the 1800s, who believed that the Second Coming of Christ would happen during their lifetimes. Because of that belief they required celibacy, and since they did not do much converting, they died off rather quickly. Also because they believed Christ was coming soon, their cemetery is a walled-off grass field with no headstones; they considered the graves a temporary arrangement. I grew up passing it in the car and wanting to visit because of the story and the enormous revolving stone door. I would love to see how the unmarked cemetery looks underground so that I could get more of the story. I am unsure how I would present this, but probably as some form of 3D annotated(?) map using something like openFrameworks.
  2. Last Songs: I also think it could be very interesting to see/hear what the dead sound like. Because I would be gathering signals from the GPR, I could put these into a program like Max/MSP or Pure Data and synthesize what the maps would sound like (a rough sketch of this follows below). Because the signals change when there is something there, I would essentially be able to hear where objects are (for example, bones). I think it could be particularly interesting to do this for deceased musicians, as their coffins/bones/resting places would then become their last song. I would probably try to show this in a head-tracking VR scenario (possibly the Vive). I would create a Unity project that plays the synthesized audio for wherever you are in the 3D space. I could also make a representation of what the underground 3D map looks like, to give people a visual as they move through the space. Using the Vive would be particularly interesting for me, as I feel I could get a one-to-one relationship between ground space and VR area (Vive play areas and grave plots are relatively similar in size). Although this would technically be a place, it also creates a portrait of the deceased person below. If I were to do this, I would probably start with Stanley Turrentine or Lillian Russell, as they are both buried at Allegheny Cemetery (close by).
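
That sonification step might look something like this in Python (rather than Max/MSP or Pure Data): one GPR trace becomes the loudness envelope of a tone, so stronger reflectors get louder. The trace here is synthetic placeholder data.

```python
# Map one GPR trace (amplitude vs. depth) to the loudness envelope of a tone.
import numpy as np
from scipy.io import wavfile

sample_rate = 44100
duration = 5.0                               # seconds of audio per trace
t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)

trace = np.abs(np.random.randn(512))         # stand-in for one GPR trace
envelope = np.interp(t, np.linspace(0, duration, trace.size), trace)
envelope /= envelope.max()                   # normalize loudness to 0..1

tone = np.sin(2 * np.pi * 220 * t)           # 220 Hz carrier
audio = (envelope * tone * 0.8 * 32767).astype(np.int16)
wavfile.write("gpr_trace.wav", sample_rate, audio)
```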

fourth-PlaceProposal

When I think of the places I go during a day, one doesn’t quite qualify as a place: “commute”. In my head it’s a single place, but it might better be categorized as the place between places.

For my project, I would like to create a single photo of my commute. The image would have to be very long, achieved through stitching software. I would use a GPS-connected intervalometer to automatically take an image every x feet of travel, and a rig to keep the camera at a consistent height and direction while I move, probably attached to a bicycle or simply a monopod strapped to my backpack. If the rig is successful, I would like to create a few of these images.
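
A sketch of the intervalometer logic, under the assumption that it ends up being custom-built: fire the shutter every x feet of GPS-measured travel. The read_gps() and trigger_camera() functions are hypothetical stand-ins for whatever GPS module and shutter-release hardware get used.

```python
# Fire the shutter every TRIGGER_FEET of travel, based on GPS positions.
import math
import time

TRIGGER_FEET = 20.0
EARTH_RADIUS_FT = 20_902_231  # mean Earth radius in feet

def haversine_ft(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in feet."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_FT * math.asin(math.sqrt(a))

def run(read_gps, trigger_camera):
    """read_gps() -> (lat, lon); trigger_camera() releases the shutter."""
    last_lat, last_lon = read_gps()
    while True:
        lat, lon = read_gps()
        if haversine_ft(last_lat, last_lon, lat, lon) >= TRIGGER_FEET:
            trigger_camera()
            last_lat, last_lon = lat, lon
        time.sleep(0.2)
```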

Long panoramas have been achieved before and are nothing particularly special. I want to make my panorama… longer than that. A single image as long as possible. The aspect ratio should be absurd.

I hope through a refined rig I can capture something unique and impressive, creating a single high quality stitched image that captures a scene for over a mile.

For delivery, I would like to print the image a few inches high, however many feet long, on a single strip of roll paper I would purchase.


The second idea I am playing with is going on a hike and using satellite-style stitching techniques, and a microscope, to create an ultra-high-resolution image of my boots after the hike, to see what sorts of stories are hidden in the parts of places that get stuck to our shoes.

DMGordon-PlaceProposal

I want to use point cloud data to describe a space in multiple moments simultaneously. I can do this using Unity as a display environment and a Kinect as the capture tool. I want to capture a couple of spots of constant motion that are dear to me: one is the freeway next to my house, and the other is a creek in Frick Park. The code portion is based on a previous project I’ve done, so it will not take as long as it would if I were starting from scratch.
The main problem I’m dealing with is how to get the Kinect to the locations I want to capture, which are far from any electrical outlet. I will most likely end up making a DIY battery.
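
As a tiny illustration of the “multiple moments” idea (in Python rather than Unity): layer point cloud captures from several moments into one cloud, tagging each point with its capture index so the moments can be tinted or toggled in the viewer. The .npy capture files are a placeholder for however the Kinect frames get saved.

```python
# Merge point clouds from several moments, keeping a per-point moment index.
import glob
import numpy as np

clouds = []
for i, path in enumerate(sorted(glob.glob("capture_*.npy"))):
    pts = np.load(path)                    # N x 3 points from one moment
    t = np.full((pts.shape[0], 1), i)      # tag every point with its capture index
    clouds.append(np.hstack([pts, t]))

merged = np.vstack(clouds)                 # N_total x 4: x, y, z, moment
np.save("merged_moments.npy", merged)
```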

caro-PlaceProposal

Echolocation Depth Map of the Electrical Engineering Lab

We experience and understand environments primarily through sight. But what if we could see using sound? I was inspired by bats, which get around using high-frequency sound waves to map their environments, without needing to see. I was also inspired by Ben Snell’s LIDAR project; LIDAR is a technology that measures distance by illuminating its target with laser light. In a way, I’m attempting to create my own echolocation-based LIDAR. The location I’ve chosen is the Electrical Engineering lab at CMU, which is very personal to me; I’ve spent many long hours there.

 

Turns out, there’s this thing called an Ultrasonic Sensor:

It emits a high-frequency pulse and then waits for the echo to come back. From the round-trip time, it can tell how far away something is (distance = speed of sound × round-trip time / 2).

Giant 2×4 covered with Ultrasonic Sensors

I hypothesize that by covering a big stick with Ultrasonic Sensors, I can construct a rough depth map of the EE lab. I want to do this with the lights off.

By placing the sensors at regular intervals, I know the location of each sensor in the height direction (y). If I stand in one spot with the pole, I know where I am on the floor (x). The only remaining question is “how far away is all the other stuff?” (z).

I think that if I spin around in a circle and time how long it takes me, I’ll be able to create a 180-degree image of the lab (basically a cylinder). I bet there’s a way to do this more precisely with motors, but honestly I’ll probably just wind up spinning around in a circle.
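
A sketch of the geometry, with made-up sensor spacing, spin duration, and readings; the math is just cylindrical coordinates plus the speed of sound.

```python
# Each reading is (sensor index up the pole, time during the spin, echo time).
import math

SENSOR_SPACING_M = 0.15       # vertical gap between sensors on the 2x4
SPIN_SECONDS = 10.0           # how long one timed spin takes
SPEED_OF_SOUND = 343.0        # m/s

def reading_to_point(sensor_index, t_seconds, echo_seconds):
    """Convert one ultrasonic reading into an (x, y, z) point in the room."""
    distance = SPEED_OF_SOUND * echo_seconds / 2          # one-way distance
    angle = 2 * math.pi * (t_seconds / SPIN_SECONDS)      # how far I've turned
    height = sensor_index * SENSOR_SPACING_M              # position up the pole
    return (distance * math.cos(angle), height, distance * math.sin(angle))

# e.g. sensor #4, 2.5 s into the spin, 12 ms round trip -> about 2 m away
print(reading_to_point(4, 2.5, 0.012))
```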

But will the sensors interfere with each other? No, because I’m going to do the math and make sure that doesn’t happen.

Data Visualization

I’ll have all of this data about the room, but it will still be sparse because of how far apart the sensors are. I’ll write Processing code to interpolate between the sensor values, so I get a smooth depth map. It won’t be hyper-accurate, but it will give a vague sense of the location.
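
To illustrate the interpolation step (shown in Python/SciPy here rather than Processing), scattered (angle, height) depth readings can be resampled onto a dense grid; the readings below are random stand-ins.

```python
# Interpolate scattered depth readings onto a dense grid for a smooth depth map.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
samples = rng.uniform(0, 1, size=(200, 2))      # (angle, height), normalized 0..1
depths = rng.uniform(0.5, 4.0, size=200)         # measured distances in meters

grid_a, grid_h = np.mgrid[0:1:200j, 0:1:100j]    # dense angle x height grid
depth_map = griddata(samples, depths, (grid_a, grid_h), method="linear")
print(depth_map.shape)                           # (200, 100) smooth-ish depth map
```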

Ultimately I’d like to put the 180-degree image in a Google Cardboard or some other viewer, so people can have a “physical” experience of a room that was captured entirely with sound.

 

blue-PlaceProposal

I am interested in intimate spaces. I’m interested in places where we stash things, hide things, forget things. I’m interested in the topography of our top drawers.

Gaston Bachelard wrote, in his book The Poetics of Space (1957/1964), about the “topography of our intimate being.” He wrote about the phenomenology of attics, basements (cellars), nests, cabinets, drawers, and the house as a whole.

I would like to create a system with which to immerse a viewer in the landscape of someone’s top drawer. This could be a top drawer of a desk, or dresser, or kitchen. I know, personally, that the back of the top drawer is often a place of hidden things, and whenever I move homes (which in the past decade has been all too often), I am always surprised at what I find there when packing up my belongings.

Technically, I am interested in figuring out a way to create a 3D scan of the contents and space of a top drawer, and I would like to place this in a 3D viewer, ideally a Google Cardboard, where a person could have an intimate, immersive encounter with this often overlooked but richly revealing space.