Sayers-Final

Premise/Summary

I started with the idea that I was going to go to a graveyard to get scans of graves to turn into music. This eventually evolved into scanning the Numbers Garden at CMU with ground-penetrating radar, with thanks to Golan Levin, the Studio for Creative Inquiry, Jesse Stiles, and Geospatial Corporation. I then compiled these scans into music, which I placed into a spatialized-audio VR experience.

Ground Penetrating Radar Overview

In short, GPR works by emitting pulses of radar energy from a surface antenna. These pulses travel outward into the ground as waves. If an object is buried below, the energy reflects off it rather than just the surrounding soil, and travels back to the receiving antenna at a different time (measured in nanoseconds). The most important type of data you receive from GPR is called a reflection profile, which looks like this:

Essentially, by finding the aberrations in the scan, one can figure out where underground objects are.
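For the curious, the arithmetic behind those nanoseconds is simple: the radar records the two-way travel time of each reflection, and if you assume a propagation velocity for the soil, depth is just velocity times time divided by two. A rough sketch (the 0.1 m/ns velocity is an assumed typical value for dry soil, not a measured one; real surveys calibrate it per site):

```
using System;

class GprDepth
{
    // Assumed typical radar velocity in dry soil (~1/3 the speed of light).
    // Real GPR surveys calibrate this value per site.
    const double VelocityMetersPerNs = 0.1;

    // The pulse travels down to the reflector and back up,
    // so the one-way depth is half of velocity * time.
    static double DepthFromTravelTime(double twoWayTimeNs) =>
        VelocityMetersPerNs * twoWayTimeNs / 2.0;

    static void Main()
    {
        // A reflection arriving 20 ns after the pulse leaves the antenna:
        Console.WriteLine($"{DepthFromTravelTime(20.0)} m"); // ~1 m deep
    }
}
```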

History of CMU/Scanning With Geospatial

One of the things that we scanned was the buried artwork Translocation by Magdalena Jetelová, an underground room installed beneath the Cut in 1991. I talked with the lovely Martin Aurand (CMU's architectural archivist), who told me some of the stories about this piece. In the late 80s/early 90s, a CMU architecture professor who was beloved by many of the staff died in a plane crash on her way to Paris. To honor her, the artist Magdalena Jetelová created a room beneath the Cut inside a shipping container, with lights and a partition. A large piece of acrylic sat on top of it so that you could actually walk around above it. Around 2004, however, the artwork was buried, as water had started to leak in, ruining the drywall and fogging the acrylic. Most people on campus don't know that it exists.

Another area that I explored was the area by Hunt Library now known as the Peace Garden. This used to be a building called Langley Laboratory (although it was often labeled Commons on maps). I visited Julia Corrin, one of the other archivists on campus, to look through the archives for old pictures of CMU. One part of Langley Laboratory in particular caught my eye: a small portion jutting off the end that appeared in no photographs except aerial photos and plans. Julia did not actually know what that part of the building was for and asked me to explore it. After looking through the GPR data, I don't believe any remnants of it remain. It is likely that the building's foundations were temporary or were completely removed for the creation of Hunt Library.

The last big area I explored was the Numbers Garden behind CFA. This area was interesting because the Purnell Center is immediately below it. Scanning it let us see how the underground ceiling sloped beneath the ground we were walking on, as well as the random pipes and electrical infrastructure between the sidewalk and the ceiling.

The people at Geospatial were amazing partners in this project and went above and beyond to help me and our class learn a lot about GPR and its uses.

Hearing the Ground

After scanning, I used SubSite's 2550GR GPR software to convert the scans from .scan files into a more standard image format. I went through them all and organized each swath of scans into folders based on which part of the scan it was, whether it was the shallow or deep radar scan, pitch range, etc. I then took the swaths into Photoshop and edited the curves and levels to filter out most of the noise and irrelevant data. I fed these edited photos into a Max MSP patch, which takes an array of pixels and, depending on each pixel's color/brightness, assigns a pitch to that pixel. I did this for both the deep and shallow scans, which I used as bass and treble respectively. I then combined all the different swaths' audio clips for the deep and shallow layers into two separate pieces, which I joined together at the end in Audacity.
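The real mapping lives inside the Max patch, but the core idea is easy to sketch in a few lines of C# (the MIDI ranges and the fake pixel column below are illustrative stand-ins, not the values I actually tuned by ear):

```
using System;

class PixelPitch
{
    // Map a pixel's brightness (0..1) to a frequency within a MIDI range.
    // Brighter pixels (stronger reflections) become higher pitches.
    static double BrightnessToHz(double brightness, double lowMidi, double highMidi)
    {
        double midi = lowMidi + (highMidi - lowMidi) * brightness;
        return 440.0 * Math.Pow(2.0, (midi - 69.0) / 12.0); // MIDI note -> Hz
    }

    static void Main()
    {
        // A fake column of brightnesses standing in for one slice of a scan.
        double[] column = { 0.05, 0.8, 0.3, 0.95 };
        foreach (double b in column)
        {
            double bassHz = BrightnessToHz(b, 24, 48);   // deep scan -> bass
            double trebleHz = BrightnessToHz(b, 60, 96); // shallow scan -> treble
            Console.WriteLine($"{b:0.00} -> bass {bassHz:0.0} Hz, treble {trebleHz:0.0} Hz");
        }
    }
}
```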

Spatialized Audio with Scans in VR

One of the later portions of the project was putting this audio into a spatialized VR project. I used splines in Unity to map our path through the Numbers Garden, then attached the audio source to them so that the audio would travel through the Vive space in the same pattern we did while scanning. I put a cube on the audio source so that it would be easier to find. I created two splines (one for the bass/deep and one for the treble/shallow) and placed them accordingly in the space, with the bass being lower to the ground. I then used the Oculus Audio SDK so that the participant could find the audio source merely by moving their head. I finished by writing a few scripts that let me use the scans as spinning, slightly pulsating skyboxes; another script changed the scan when that swath's sound had ended.
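Stripped down, the two behaviors look something like this in C# (the field names are my own placeholders, and I used a spline asset rather than the raw waypoint list shown here):

```
using UnityEngine;

// Sketch of the VR piece's two behaviors: an AudioSource that retraces
// our scanning path, and a skybox/clip pair that advances per swath.
public class ScanPathAudio : MonoBehaviour
{
    public AudioSource source;      // spatialized via the Oculus Audio SDK
    public Transform[] waypoints;   // stand-in for the spline through the garden
    public AudioClip[] swathClips;  // one audio piece per swath, in order
    public Material[] scanSkyboxes; // the matching edited scan images
    public float speed = 0.5f;      // meters per second along the path

    int segment;
    int swath = -1;

    void Update()
    {
        // Slide the source (and its visible cube) along the path.
        if (waypoints.Length > 1)
        {
            Transform target = waypoints[(segment + 1) % waypoints.Length];
            transform.position = Vector3.MoveTowards(
                transform.position, target.position, speed * Time.deltaTime);
            if (transform.position == target.position)
                segment = (segment + 1) % waypoints.Length;
        }

        // When a swath's clip finishes, play the next one and swap the skybox.
        if (!source.isPlaying && swathClips.Length > 0)
        {
            swath = (swath + 1) % swathClips.Length;
            source.clip = swathClips[swath];
            source.Play();
            if (swath < scanSkyboxes.Length)
                RenderSettings.skybox = scanSkyboxes[swath];
        }
    }
}
```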

Continuation

I am really hoping to continue this project over the coming summer and next year. I hope to scan more places so that I can create isosurface profiles. I could then use these so that every single position in the Vive area has a separate sine wave corresponding to that point in the isosurface model. By moving around the space, participants could physically hear when their head is near an object.
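A hypothetical sketch of what that could look like, assuming the isosurface data has been resampled into a plain 3D array (the array, its scale, and the amplitude-to-pitch mapping are all guesses at this point):

```
using UnityEngine;

// Hypothetical continuation sketch: attach next to an AudioSource and
// drive a sine wave from the GPR amplitude at the listener's position.
public class IsosurfaceTone : MonoBehaviour
{
    public float[,,] volume;        // assumed: GPR amplitudes (0..1) on a grid
    public float metersPerVoxel = 0.05f;

    float frequency = 220f;
    double phase;

    void Update()
    {
        // Look up the voxel under the head (this transform).
        Vector3 p = transform.position / metersPerVoxel;
        int x = Mathf.Clamp((int)p.x, 0, volume.GetLength(0) - 1);
        int y = Mathf.Clamp((int)p.y, 0, volume.GetLength(1) - 1);
        int z = Mathf.Clamp((int)p.z, 0, volume.GetLength(2) - 1);
        // Stronger reflections -> higher pitch (this mapping is a guess).
        frequency = Mathf.Lerp(110f, 880f, volume[x, y, z]);
    }

    // Unity calls this on the audio thread; fill the buffer with a sine.
    void OnAudioFilterRead(float[] data, int channels)
    {
        double step = 2.0 * Mathf.PI * frequency / AudioSettings.outputSampleRate;
        for (int i = 0; i < data.Length; i += channels)
        {
            float s = Mathf.Sin((float)phase) * 0.1f;
            for (int c = 0; c < channels; c++) data[i + c] = s;
            phase = (phase + step) % (2.0 * Mathf.PI);
        }
    }
}
```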

sayers-ForCatalog

Title: GroundSounds

Description: GroundSounds is a VR tour through Carnegie Mellon’s Numbers Garden by way of spatialized audio created from Ground Penetrating Radar data.

Since this is primarily sound but also a bit of VR, I am unsure what exactly to put as my image. This is some of my edited data that I fed into the Max patch and that becomes the skybox.

sayers-GPRData

As many of you know, I decided to explore ground-penetrating radar heavily this semester. I have been interested in combining geology with artwork for quite a while, although I'm not completely sure why. A lot of my art has to do with strata of soil/minerals and their makeup. Although GPR is primarily a tool for archeology and civil engineering/construction, I found it gave me some really cool data.

I started with the idea that I was going to go to a graveyard to get scans of graves to turn into music (and was in the process of being set up with Geospatial). Golan was kind enough to get Geospatial to come to the CMU campus, though, and I decided to look for some of the images/things on campus that most people never see. By far the coolest thing was the hidden artwork Translocation by Magdalena Jetelová, an underground room installed beneath the Cut in 1991. I talked with the lovely Martin Aurand (CMU's architectural archivist), who told me some of the stories about this piece. In the late 80s/early 90s, a CMU architecture professor who was beloved by many of the staff died in a plane crash on her way to Paris. To honor her, the artist Magdalena Jetelová created a room beneath the Cut inside a shipping container, with lights and a partition. A large piece of acrylic sat on top of it so that you could actually walk around above it. Around 2004, however, the artwork was buried, as water had started to leak in, ruining the drywall and fogging the acrylic. Most people on campus don't know that it exists. We were lucky enough to get a scan of this area in a grid-like pattern so that I could turn it into an isosurface rendering (more on this later).

Another area that I wanted to explore was the area by Hunt Library now known as the Peace Garden. This used to be a building called Langley Laboratory (although it was often labeled Commons on maps). I visited Julia Corrin, one of the other archivists on campus, to look through the archives for old pictures of CMU. One part of Langley Laboratory in particular caught my eye: a small portion jutting off the end that appeared in no photographs except aerial photos and plans. Julia did not actually know what that part of the building was for and asked me to explore it. After looking through the GPR data, I don't believe any remnants of it remain. It is likely that the building's foundations were temporary or were completely removed for the creation of Hunt Library.

The last big area I wanted to explore was the Numbers Garden behind CFA. This area was interesting because the Purnell Center is immediately below it. Scanning it let us see how the underground ceiling sloped beneath the ground we were walking on, as well as the random pipes and electrical infrastructure between the sidewalk and the ceiling.

I also did a lot of research on how GPR works, particularly on the hardware side and which antennas to use. In short, GPR works by emitting pulses of radar energy from a surface antenna. These pulses travel outward into the ground as waves. If an object is buried below, the energy reflects off it rather than just the surrounding soil, and travels back to the receiving antenna at a different time (measured in nanoseconds). There are two main types of GPR images. The first is a reflection profile: the long image you get immediately from a single scan pass. It shows the bumps in the ground and looks like this:

The next is an isosurface rendering, which is basically what you get from a lot of scans taken in a grid. If you line up a bunch of the scans, you essentially get horizontal slices that you can turn into a 3D model. It looks something more like this:
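Conceptually, the grid-to-volume step is just stacking the parallel profiles into a 3D array, which you can then slice horizontally or render as an isosurface. A minimal sketch with made-up dimensions:

```
using System;

class IsosurfaceStack
{
    static void Main()
    {
        // Pretend we ran 10 parallel passes, each giving a reflection
        // profile that is 256 samples deep and 512 traces long.
        int passes = 10, depth = 256, traces = 512;
        var volume = new float[passes, depth, traces];

        for (int p = 0; p < passes; p++)
        {
            float[,] profile = LoadProfile(p);       // one 2D scan image
            for (int d = 0; d < depth; d++)
                for (int t = 0; t < traces; t++)
                    volume[p, d, t] = profile[d, t]; // stack along the grid axis
        }

        // A horizontal "time slice": every pass and trace at one depth.
        Console.WriteLine($"Amplitude at pass 0, depth 100, trace 0: {volume[0, 100, 0]}");
    }

    // Placeholder: a real version would read pixels from the exported scans.
    static float[,] LoadProfile(int pass) => new float[256, 512];
}
```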

In some ways, as far as events go, my event was helping to get Geospatial involved, doing research to find interesting places to go, learning a lot about GPR so I could ask educated questions, and then the day that we scanned. The act of scanning itself is an event which can also be captured.

Because the data was slightly difficult to read at first (thank you, Golan, for going through the strange Photoshop raw files with me and guessing at the bit settings), and because I got very sick, I am slightly more behind than I would like. I have the data and will be meeting with Jesse Stiles on Tuesday to get opinions on how I could turn it into a 3D soundscape. This is a very difficult project for me because it is big, involves people outside of CMU, and every part of it is completely outside my normal wheelhouse. My next big difficulty is going to be learning how to synthesize this into sound, as I very rarely work with audio. I feel like I am still learning a lot throughout this, though. I really want to thank Geospatial for being so kind and sharing their time and software with us!

Golan also showed me this super cool artwork made by Benedikt Gross in which he uses computational tractors to create enormous earthworks.  These tractors/bulldozers can be programmed to go in set patterns and can act like a 3D printer/CNC router! 

If you are interested in seeing any of the raw data, reach out to me. Unfortunately, I cannot share the SubSite software, as Google Drive will only allow me to share it with people at Geospatial.

sayers-Event Proposal

For my event project, I would like to focus on the event of rain and the ideas of erosion that come from it. (I don't know why I have a thing about rocks and the underground.) I would like to create a custom-controller game that would be played by the weather.

The game will be played on a cellphone with a custom interface: a small sensor that attaches to the phone (with a waterproof phone case). The sensor will be a small square with four quadrants, and it will pick up when a raindrop hits one of them. The game will also get your GPS coordinates and check online to see if it is actually raining (no cheating with an eyedropper). When a raindrop hits one of the four quadrants, an in-game raindrop will fall in that direction. You could try to watch the raindrops and move your phone so they land on the desired quadrant, or you could let nature do its thing. When enough raindrops hit a quadrant, it will begin to disintegrate on screen, uncovering objects in the strata of sediment.
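A rough sketch of the core loop, assuming the sensor simply reports a quadrant index per detected drop (everything here is hypothetical; a real build would also do the GPS/weather check first):

```
using System;

// Hypothetical core loop for the rain game: each detected drop erodes
// its quadrant, and a fully eroded quadrant reveals the buried object.
class RainGame
{
    const int DropsToErode = 20;  // assumed: drops before a quadrant breaks
    readonly int[] hits = new int[4];

    // Called whenever the phone-mounted sensor reports a drop (0..3).
    public void OnRaindrop(int quadrant)
    {
        hits[quadrant]++;
        if (hits[quadrant] == DropsToErode)
            Console.WriteLine($"Quadrant {quadrant} eroded: reveal the strata below");
        else
            Console.WriteLine($"Drop in quadrant {quadrant} ({hits[quadrant]}/{DropsToErode})");
    }

    static void Main()
    {
        // Simulate a shower for testing, since I can't schedule real rain.
        var game = new RainGame();
        var rng = new Random();
        for (int i = 0; i < 50; i++)
            game.OnRaindrop(rng.Next(4));
    }
}
```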

***********************************************************

Other Idea:

Surfing-type game that you play as a child in the car.

Pressing at Jump points or else crash.

Holo-lens project using edge detection.

UPDATE:

I have gotten a cube to appear on the Hololens and now understand much more of how to develop for that.

I have also been doing mostly research on how I might do edge detection quickly on the Hololens. It doesn't seem like there is one clear option. The main thing the Hololens uses is spatial mapping (specifically, I could use the low-level Unity spatial mapping API). This is very computationally intensive, though, and I believe it would probably only work in an already-mapped area (so not out of a car window). The other option I could explore would be to get the camera feed out of the Hololens and put it into a Processing/openFrameworks sketch that would give me the coordinates of the edges in a silhouette (using some kind of edge detection for video). I would then have to send the data back to the Hololens and compute where the figure should be. Also, since this is mixed reality, everything would have to happen in real time with next to no lag. I'm not completely sure whether I have the technical ability, or whether the technology is there yet, to get this done quickly and efficiently.
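To make that second option more concrete: the desktop side would boil down to an edge pass over each camera frame. A minimal Sobel filter in plain C# (standing in for the Processing/openFrameworks sketch I described; streaming frames in and coordinates back out is the part I haven't solved):

```
using System;

class SobelEdges
{
    // Minimal Sobel edge magnitude over a grayscale frame (0..255).
    static float[,] Sobel(byte[,] gray)
    {
        int h = gray.GetLength(0), w = gray.GetLength(1);
        var mag = new float[h, w];
        for (int y = 1; y < h - 1; y++)
            for (int x = 1; x < w - 1; x++)
            {
                // Horizontal and vertical gradients from the 3x3 neighborhood.
                int gx = -gray[y - 1, x - 1] + gray[y - 1, x + 1]
                         - 2 * gray[y, x - 1] + 2 * gray[y, x + 1]
                         - gray[y + 1, x - 1] + gray[y + 1, x + 1];
                int gy = -gray[y - 1, x - 1] - 2 * gray[y - 1, x] - gray[y - 1, x + 1]
                         + gray[y + 1, x - 1] + 2 * gray[y + 1, x] + gray[y + 1, x + 1];
                mag[y, x] = MathF.Sqrt(gx * gx + gy * gy);
            }
        return mag;
    }

    static void Main()
    {
        // Fake 120x160 frame with a vertical edge down the middle.
        var frame = new byte[120, 160];
        for (int y = 0; y < 120; y++)
            for (int x = 80; x < 160; x++) frame[y, x] = 255;
        Console.WriteLine(Sobel(frame)[60, 80]); // strong response at the edge
    }
}
```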

If this proves too difficult, one thing I may do is use Vuforia within the Hololens to create small creatures/people hanging from street signs. For example, if I saw a stop sign, Vuforia would recognize the general shape/look of a stop sign and attach a 3D model (in various forms) to it. This, too, creates a little animated world.

Can I screenshot the holograms in the Hololens?

sayers- Place Project Proposal

Ground Penetrating Radar in Cemeteries

When we first saw the ground-penetrating radar (GPR) in class, I knew that I wanted to experiment with it. GPR doesn't just tell you about a place, but also about the history of that place through the strata of soil. It gives more context to a place by showing its evolution through the objects buried underground. I think exploring this could be particularly interesting because we can see the objects spatially in relation to one another without disturbing them. While archeology is interesting, one thing that is often lacking for me is the context of how things were found, and where, in relation to other historical objects found in similar areas.

One thing that I found particularly interesting during my research is that you can create 3D models from GPR. GPR usually creates a 2D slice of the ground by sending a radar wave underground and seeing how the reflections return. By slowly moving the radar horizontally, however, taking more scans and combining all the data, you can get a 3D model.

I have two possible ideas that I would like to pursue with this in mind:

  1. Harmony Society Cemetery - As a child who grew up around Pittsburgh, and in particular the area of Harmony, I learned the story of the Harmonites. They were a small religious group that started in the 1800s and believed that the Second Coming of Christ would happen during their lifetimes. Because of that belief they required celibacy, and since they did not really do much converting, they died off rather quickly. Because they believed that Christ was coming soon, their cemetery is a walled-off grass field with no headstones, as they believed the graves to be a temporary arrangement. I grew up passing it in the car and wanting to visit because of the story and the enormous revolving stone door. I would love to see how the unmarked cemetery looks underground so that I could get more of the story. I am unsure how I would present this, but probably as some form of annotated(?) 3D map using something like openFrameworks.
  2. Last Songs - I also think it could be very interesting to see/hear what the dead sound like. Because I would be gathering signals from the GPR, I could put them into a program like Max MSP or Pure Data and synthesize what the maps would sound like. Because the signals change when something is there, I would essentially be able to hear where objects (for example, bones) are. I think it could be particularly interesting to do this for deceased musicians, as their coffins/bones/resting places would then become their last song. I would probably try to show this in a head-tracking VR scenario (possibly the Vive). I would create a Unity project that plays the synthesized audio for wherever you are in the 3D space. I could also make a representation of what the underground 3D map looks like, to give people a visual as they move throughout the space. Using the Vive would be particularly interesting for me, as I feel I could get a one-to-one relationship between ground space and the VR area (Vive play areas and grave plots are relatively similar in size). Although this would technically be a place, it also creates a portrait of the deceased person below. If I were to do this, I would probably start with Stanley Turrentine or Lillian Russell, as they are both musicians buried at Allegheny Cemetery (close by).

sayers-Portrait

I decided for my portrait to create a maze out of my subject's fingerprint. I took her fingerprint with a normal ink pad and paper, then scanned it at a fairly high DPI. I brought that into Rhino and slowly traced every single line and mark with vector curves. After making all of these closed curves and adding an outside edge, I extruded the curves as a group to create a maze-like design. I had wanted the floor to be curved on the bottom so it would look like you were in the groove of the fingerprint; however, this was much more difficult than I had thought, and traditional lofting and filleting behaved strangely with this many objects. After spending a long time on this, I decided that I could get a wavy effect in other ways, so I used a vertex shader that twisted the view of the camera instead of changing the mesh. I made the floor white and the ceiling black, so that it would be like being stuck inside the actual ink print.

I tried multiple other techniques before this, including using the ink scans as heightmaps in the terrain editor and writing various simple image-processing sketches in Processing. I considered doing photogrammetry, but I wanted to explore methods of capturing that I hadn't tried yet.

I was primarily inspired by the idea of hedge mazes and the lines of sand dunes. I wanted to pose the question: could a fingerprint be a solvable maze? What would be the goal? To get to the outside, or the inside?

Although this project is not what I had planned, I think it may get interesting results.  I am not very good at judging success immediately after making something.  I need time to process it.

I wanted to explore how a very personalized thing (a fingerprint) is actually very abstract and doesn't truly tell a lot about who the person is. It is a portrait, but it tells you almost nothing. It is completely unique to my subject, just as her fingerprint is, although it could be recreated for anyone (similar to the normal process of fingerprinting).

Mac Download

PortraitPlan-sayers

My general plan is to allow people to walk around / fly around their fingerprints. I find that looking at people very closely (as in, close enough to observe the pulse under their skin) can create a more intimate experience between people. Generally, people only observe one another at this scale if they are extremely emotionally close, so learning about a person like this would be a really interesting experience. By making this into a landscape, however, I put distance between myself and the subject. I may make it into a desert, with the ridges being sand dunes (shaders?).

I could possibly get a fingerprint on glass and go back to the SEM, or use photogrammetry. I may also simply need to squish a finger onto a scanner, as I really just need a heightmap that I can use in the terrain editor to get at least the base effect that I want.
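If I go the scanner route, getting the print into Unity's terrain editor is mostly a matter of copying the scan's brightness into the terrain's height array. A rough sketch, assuming the scan is imported as a readable Texture2D (the names here are mine):

```
using UnityEngine;

// Rough sketch: turn a scanned fingerprint (grayscale Texture2D imported
// with Read/Write enabled) into a Unity Terrain heightmap.
public class FingerprintTerrain : MonoBehaviour
{
    public Terrain terrain;
    public Texture2D fingerprintScan;

    void Start()
    {
        TerrainData data = terrain.terrainData;
        int res = data.heightmapResolution;
        var heights = new float[res, res];

        for (int y = 0; y < res; y++)
            for (int x = 0; x < res; x++)
            {
                // Sample the scan; darker ink becomes a taller ridge.
                Color c = fingerprintScan.GetPixelBilinear(x / (float)res, y / (float)res);
                heights[y, x] = 1f - c.grayscale;
            }

        data.SetHeights(0, 0, heights);
    }
}
```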

sayers-SEM

I decided to scan a small piece of frit (a glass shard). The SEM machine was malfunctioning while it was my turn (the joystick for moving the piece around wasn't working), but I still got a few interesting images out of the process. I feel like I learned more about what an SEM is, and also just how cool things look when you zoom in close. Even the most mundane objects look fascinating. I really enjoyed looking at the stress marks where the glass was broken. They give almost a history of the piece of glass: when you get that close, the stress marks give you enough information to speculate on how the piece was broken and by what.

Sydney-About Me

Hi everyone,

I'm a Junior Art Major with a Game Design Minor. I primarily create my work using physical computing, various VR/AR programs, and Unity. Come talk to me about any of those!

-Syd