“[Sound can be described as] oscillations of air which keep vanishing the moment it is produced, materiality at its most intangible and hence in its most tenacious form.”
– Mladen Dolar, A Voice and Nothing More, p. 59.
In this collaborative work, Andrew Bueno and Caroline Record explore themes of permanence and production by recording a sound walk (a record of simultaneous audio and GPS data) and playing it back on a machine configured to precisely reproduce their path and speed. The machine thereby performed the experience of the walk and created a physical artifact of it. We wanted to question what it means for a machine to produce an experience rather than simply a product. This idea fits into our larger theme, directly discussed in the content of the sound walk and evident in the scenario presented in the video: the dichotomies that exist within the artist as engineer.
Sound, as a medium, can create a window into reality (a quality usually associated with vision). It has a special intimacy, embodied in our willingness to let it constitute our own internal dialogue. The experience we craft is connotative of museum tours, listening to music, talking on the phone, or perhaps even of our own thoughts given voice.
Here, we wished to juxtapose presence and absence. Maps obey the same laws of semiotics as anything else. Take away the visual details, the keys and scales, and what is there? A signifier without its signified.
In this specific sound walk we explore the dichotomies that exist within the artist-engineer’s approach, and what is lost, or gained, in contrast to a traditional art practice.
We faced a twofold challenge in bringing this project to fruition. How could we efficiently record an audio track alongside regularly updated global positioning? And, once we had this data, how could we translate it into directions a CNC router could actually follow?
What ensued was an intense feeding frenzy of learning. We quickly acquainted ourselves with iOS app development, G-code documentation, and coordinate projection methods.
LittleBrother, our iOS app, is quick and dirty but gets the job done. The app uses the AVFoundation framework for audio recording and the Core Location framework to grab the GPS position of the phone. When recording, this GPS data is saved into a plist organized as a dictionary, where each key is the time a pair of coordinates was recorded (in Unix time) and each value is the coordinates themselves. All data is saved in timestamped directories in the app’s Documents folder. Thanks to code from Dan Wilcox and the CocoaHTTPServer library, the iPhone running the app can act as a WebDAV server, which is how we access the recorded files for the next step.
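The plist layout described above can be sketched as follows. This is a minimal stand-in written in Python (using the standard-library `plistlib`) rather than the app’s actual Objective-C/Swift code; the function names and the `[lat, lon]` value layout are our assumptions for illustration, not LittleBrother’s exact schema.

```python
import plistlib

def save_gps_samples(samples, path):
    """Write GPS samples as a plist dictionary: Unix-time keys -> [lat, lon].

    `samples` is an iterable of (unix_time, latitude, longitude) tuples.
    Plist dictionary keys must be strings, so timestamps are stringified.
    """
    record = {str(int(t)): [lat, lon] for t, lat, lon in samples}
    with open(path, "wb") as f:
        plistlib.dump(record, f)

def load_gps_samples(path):
    """Read the plist back into (time, lat, lon) tuples, sorted by time."""
    with open(path, "rb") as f:
        record = plistlib.load(f)
    return sorted((int(k), v[0], v[1]) for k, v in record.items())
```

Keying the dictionary by timestamp means the walk’s pacing survives the round trip: sorting the keys recovers both the path and how long each segment took.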
GcodeCreator, a Processing sketch, takes things from there. Given a plist, the program reads all coordinates and their corresponding times into parallel arrays, converting the latitude/longitude coordinates to Universal Transverse Mercator (UTM) coordinates in the process, so that our points lie on a plane instead of on a rough sphere. The UTM coordinates are then mapped to a 48 by 48 area (the maximum size of a piece of wood the CNC can take). Using the timestamps attached to the points, the sketch calculates the speed the CNC must move at to match how long it actually took us to walk from one point to the next in real life. These speeds are in turn mapped to a depth cuttable by the router tool. With this information, GcodeCreator lives up to its name, generating a G-code file that tells the router how to draw our sound walk path.
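The pipeline above can be sketched end to end. This is a hedged Python outline, not the Processing sketch itself: it substitutes a simplified equirectangular projection for a true UTM conversion, and the depth range, function names, and G-code dialect (absolute `G1` moves with a feed rate) are assumptions for illustration.

```python
import math

def project_to_plane(points):
    """Flatten (t, lat, lon) tuples to (t, x, y) in meters.

    Simplified equirectangular projection; a stand-in for real UTM conversion.
    """
    lat0 = sum(p[1] for p in points) / len(points)  # reference latitude
    R = 6371000.0  # mean Earth radius, meters
    return [(t,
             math.radians(lon) * R * math.cos(math.radians(lat0)),
             math.radians(lat) * R)
            for t, lat, lon in points]

def fit_to_bed(points, size=48.0):
    """Scale and translate planar points into a size-by-size work area."""
    xs = [p[1] for p in points]
    ys = [p[2] for p in points]
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    s = size / span  # uniform scale so the path keeps its shape
    return [(t, (x - min(xs)) * s, (y - min(ys)) * s) for t, x, y in points]

def to_gcode(points, shallow=-0.05, deep=-0.25):
    """Emit G1 moves whose feed rate preserves the walking pace and whose
    cut depth (Z) encodes relative speed between consecutive samples."""
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(points, points[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        speeds.append(dist / max(t1 - t0, 1e-9) * 60.0)  # units per minute
    lo, hi = min(speeds), max(speeds)
    lines = ["G90", f"G0 X{points[0][1]:.3f} Y{points[0][2]:.3f}"]
    for (t, x, y), v in zip(points[1:], speeds):
        frac = (v - lo) / (hi - lo) if hi > lo else 0.0
        z = shallow + frac * (deep - shallow)
        lines.append(f"G1 X{x:.3f} Y{y:.3f} Z{z:.3f} F{v:.1f}")
    return "\n".join(lines)
```

A true UTM projection would preserve scale more faithfully over long walks, but for a path a few blocks wide the planar approximation above errs by far less than the width of a router bit.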
Many Thanks To:
Can Ozbay for his Xcode expertise
Dan Wilcox for Xcode tutelage and technical advice
Robb Godshaw for his help with G-code