TocattaTouch allows the user to manipulate a simulated sheet of fabric, applying forces to drag it through the space in which it resides while producing audio feedback in response to the user’s actions. I find this project interesting because it invites imprecise interaction, in contrast to the precision typical of computing. The audio feedback continues in this vein, conveying only a general sense of what is happening to the cloth rather than precise measurements. In its present form, TocattaTouch seems to operate within a narrow range of parameters on a single cloth object. As a result, the project feels more like a conceptual demo than an idea-generating tool. I could see a more fleshed-out iteration gaining traction as a conceptual design tool.
Photon (Jayson Haebich)
Photon explores the transformation of 2D information into an ephemeral 3D object which is nonetheless responsive to real objects in its vicinity. This project doubtless drew inspiration from Anthony McCall’s 1973 Line Describing a Cone, in which a film projector was used to project a cone into a foggy room. Unfortunately, Photon does not make a more significant departure from the earlier piece, which is disappointing given the strides made in the underlying technology since then. The cone’s only response to objects intersecting its surface is to split away from them, reinforcing its untouchable nature. It would be interesting to see Haebich explore more complex responses to intrusion in the work, which could add personality to a light object that at present seems rather rigid.
Painting with a Digital Brush (Teehan+Lax)
Painting with a Digital Brush is an extension of a longstanding field in computer art: text-mode graphics. A painter working in white paint on a black canvas is tracked in real time by the software, which produces an ASCII-art rendering of the painting and overlays it onto the original with a projector. This blurring of the distinction between working in the real world with traditional materials and producing a work digitally is intriguing, as is the notion that only through the (comparatively) vast computing resources of today have we become able to live-generate works in a simple art medium that hasn’t been in widespread use for decades.
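The core of such a live ASCII rendering is a mapping from pixel brightness to characters of increasing visual density. The sketch below illustrates the general technique only; the character ramp and function names are my own illustrative choices, not anything from the Teehan+Lax implementation.

```python
# Illustrative sketch of brightness-to-ASCII conversion, as used in
# text-mode rendering generally. The ramp and names are hypothetical.

RAMP = " .:-=+*#%@"  # characters ordered from darkest to brightest


def pixel_to_char(value, ramp=RAMP):
    """Map a grayscale value in 0-255 to a character in the ramp."""
    index = value * (len(ramp) - 1) // 255
    return ramp[index]


def frame_to_ascii(frame, ramp=RAMP):
    """Convert a 2D grid of grayscale values into lines of ASCII art."""
    return ["".join(pixel_to_char(v, ramp) for v in row) for row in frame]


# A tiny 2x4 "frame": dark-to-bright gradient and its mirror.
frame = [[0, 64, 128, 255],
         [255, 128, 64, 0]]
for line in frame_to_ascii(frame):
    print(line)
```

A real-time system would run this conversion on each downsampled camera frame, but the per-pixel mapping is the same.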
Daily Demo: Lick Weed (Brainstorm)
While not written in openFrameworks, Lick Weed does give a good introduction to the modern text-mode demo. All of the objects and effects in the video are generated in real time as the demo runs, then rendered into terminal-printable characters for display. Text-mode demos began on the earliest personal computers, some of which had no graphics capabilities beyond writing text to the screen. Due to the overhead of converting to text-mode rendering and the lower effective resolution, these demos lack the visual complexity typical of modern demos. Nonetheless, the complex reflections and distortions seen in Lick Weed represent a significant step forward in this field.