“26-track Sequester” is a compositional tool that organizes sounds in terms of text. It uses a keylogger to record the sounds one makes while typing. In the process, it associates those sounds with the letters typed: A to the clack of a keystroke, B to an exasperated sigh, etc. To start 26-track Sequester, position its transparent area above a region of text. It will detect the text, and play back the sounds associated with each letter. There are + and – buttons to control the speed of playback.
The addons were necessary to house the device in a transparent window, to log systemwide keystrokes, to record and play sounds, and to detect text using optical character recognition.
It might just be me so far, but I haven’t had a good experience using openFrameworks. It seems to me that almost nothing works! Documentation for installing openFrameworks and getting it running on Windows is sparse at best, and documentation for the contributed addons is virtually nonexistent. Compile times are long, even after tweaking minor settings, and the output log is hard to understand. I hold much respect for the people who make and maintain openFrameworks for the world to use for free, but personally I have to wonder: why on earth would I ever use openFrameworks when software like Processing is around?
Anyway, the two addons I chose were ofxFlock and ofxBlur, by mummey and kylemcdonald, respectively.
I originally wanted to use the ofxASR addon by kitschpatrol with ofxExplodingString by armadillu to make a project that hears what you say, prints your speech, and then explodes it. However, installing either of them proved really, really difficult, and I couldn’t get them working. ofxASR wants a bunch of other libraries installed alongside it, and ofxExplodingString expects an iPhone (which I don’t have). So I went with the Flock and Blur addons, primarily because they worked on Windows without much hassle. In any case, we have a working project with a blurry flock of triangles.
My original intention for this project was to get a head start on the hardware platform I wanted to use for my final project (see info viz sketch). The first steps would entail combining Kyle McDonald’s ofxCv with the ARDrone addon. This would give the drone a more precise self-awareness and guidance protocol. The end goal would be to use a CV frame to guide a drone to specific points within that frame.
The first challenge was to get the addons to compile on their own. There were some disparities in software versions which made this step alone quite difficult. Once this was achieved, a bare-bones combination compiled with relative success. However, many hardware and communication protocols were left unresolved close to the due date, so the project was shifted.
// ofxCv + ofxRollCam
The next iteration was an attempt to combine ofxCv with a screen-based output instead. The process of creating a compilable version went a bit more smoothly; however, this app was much more dependent on my knowledge of C++. The main challenge was deciphering the correct protocol for blob tracking within ofxCv. With the introduction of my own code, the compilability of the app began to suffer. Currently the app does not yet function, although it is getting close.
I wanted to get some Voronoi action going, with walls that jittered as though there were an electric current running through them. Beyond that, I wanted to place objects of varying opacities within the Voronoi cells and watch them bounce around and hit the walls and stuff. It could have been cool and simple. Unfortunately, I was having trouble compiling anything, even the example code, and I was not able to go to the help session yesterday, so I hardly made any progress on this one :(
Here is a sketch of what I had in mind. Will post the finished results later on.
My parametric object became sort of a hybrid of the OF Addons assignment and the Parametric Object assignment, so I decided to do something a little simpler for my OF app which was to finish a project I had started earlier.
Last semester I was working on a system for YouTube users that adds a virtual Minecraft-themed mask to hide their faces. I got this idea after noticing how different the attitude towards sharing personal information online has become: when I was a kid on the internet, my friends and I were all very careful, wary of strangers and of what we shared, never even giving out our real names.
The app hides your face, but in a more expressive, ‘cooler’ way than just pixelating or blurring it – you can import your Minecraft character and use its head as your mask. I thought that maybe this program would help bring back the feeling of wanting to be anonymous, which I believe is important in an age of mass data collection and surveillance on the internet.
The original version required the user to record their own screen, and could only use the default Minecraft character’s head. This version uses ofxVideoRecorder to record and save .mov files, and also lets users use their own Minecraft skin.