{"id":423,"date":"2023-10-23T05:48:26","date_gmt":"2023-10-23T05:48:26","guid":{"rendered":"https:\/\/golancourses.net\/fall23\/?page_id=423"},"modified":"2023-10-25T02:37:29","modified_gmt":"2023-10-25T02:37:29","slug":"full-body-interactive-art","status":"publish","type":"page","link":"https:\/\/golancourses.net\/fall23\/daily-notes\/october\/10-23\/full-body-interactive-art\/","title":{"rendered":"Full-Body Interactive Art"},"content":{"rendered":"<p><strong>Erkki Kurenniemi<\/strong>,\u00a0<em>DIMI-O<\/em>\u00a0(1971)<br \/>\nUsed the body, as tracked by the camera, as the basis of a\u00a0musical instrument.<br \/>\n<em>Dimi-O (1971) is based on an optical interface, the purpose of which was to read sheet music graphically. The instrument was played by means of a video camera. DIMI-O was also performed by a dancer, whose movements were transformed into music.<\/em> [<strong><a href=\"https:\/\/www.youtube.com\/watch?v=d-yHULQ2V5c\">YouTube<\/a><\/strong>]<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/erkki_kurenniemi_dimi_o.gif\" \/><\/p>\n<p><strong>Myron Krueger<\/strong>,\u00a0<em>Videoplace<\/em> (1974-1989)<br \/>\nKrueger represented the body in a virtual environment, endowed with new powers. This is not only some of the first full-body computer-based interactive art to use a camera; it is some of the first interactive computer art, period,and some of the first telematic (networked) art as well. Krueger sought to make computing a full-body activity. Keep in mind that in 1974, <em>the mouse<\/em>\u00a0had not even come into widespread use.<br \/>\n<em>Two people in different rooms, each containing a projection screen and a video camera, were able to communicate through their projected images in a \u201cshared space\u201d on the screen. 
<\/em>[<strong><a href=\"https:\/\/www.youtube.com\/watch?v=dmmxVA5xhuo\">YouTube1<\/a><\/strong>, <strong><a href=\"https:\/\/www.youtube.com\/watch?v=WAA9uYxgSbg\">YouTube2<\/a><\/strong>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/myron_krueger_videoplace_critter.gif\" width=\"351\" height=\"266\" \/> <img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/myron_krueger_videoplace_fingeranimation.gif\" width=\"354\" height=\"266\" \/><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/myron_krueger_videoplace_drawing.gif\" width=\"708\" height=\"442\" \/><\/p>\n<p><strong>David Rokeby<\/strong>,\u00a0<em><a href=\"http:\/\/www.davidrokeby.com\/vns.html\" target=\"_blank\" rel=\"noopener\">Very Nervous System<\/a><\/em>\u00a0(~1986-1990)<br \/>\nAnother use of the body in a musically instrumental way,\u00a0but Rokeby eliminates the screen entirely, and instead surrounds the body with a tightly\u00a0responsive sound-environment.<\/p>\n<blockquote><p><em>\u201cI created the work for many reasons, but perhaps the most pervasive reason was a simple impulse towards contrariness. The computer as a medium is strongly biased. And so my impulse while using the computer was to work solidly against these biases. Because the computer is purely logical, the language of interaction should strive to be intuitive. Because the computer removes you from your body, the body should be strongly engaged. Because the computer\u2019s activity takes place on the tiny playing fields of integrated circuits, the encounter with the computer should take place in human-scaled physical space. 
Because the computer is objective and disinterested, the experience should be intimate.\u201d<\/em><\/p><\/blockquote>\n<p><iframe loading=\"lazy\" src=\"https:\/\/player.vimeo.com\/video\/8120954\" width=\"620\" height=\"411\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><strong>Rafael Lozano-Hemmer<\/strong>,\u00a0<em>Surface Tension<\/em>\u00a0(1992)<br \/>\nPosition is mapped to position: simple and compelling. [<strong><a href=\"https:\/\/www.youtube.com\/watch?v=JXLoLPkzdto\">YouTube<\/a><\/strong>]<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/rafael_lozano_hemmer_surface_tension.gif\" \/><\/p>\n<p><strong>Camille Utterback &amp; Romy Achituv<\/strong>,\u00a0<em>Text Rain<\/em>\u00a0(1999)<br \/>\nThe body surrounded by responsive virtual objects. [<strong><a href=\"https:\/\/www.youtube.com\/watch?v=f_u3sSffS78\">YouTube<\/a><\/strong>]<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/camille_utterback_romy_achituv_text_rain.gif\" \/><\/p>\n<p><strong>Daniel Rozin<\/strong>,\u00a0<em>Wooden Mirror<\/em>\u00a0(1999),\u00a0<em>Peg Mirror<\/em>\u00a0and\u00a0<em>Weave Mirror<\/em> (2007) [<a href=\"https:\/\/www.youtube.com\/watch?v=i-G54kVrhbE\">YouTube<\/a>, <a href=\"https:\/\/vimeo.com\/12055021\">Vimeo<\/a>]<br \/>\nLiteral mirrors\u2013but with sculpturally-expanded concepts of pixel-based screens.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/daniel_rozin_wooden_mirror.gif\" width=\"480\" height=\"276\" \/><\/p>\n<p><strong><a href=\"http:\/\/www.snibbe.com\/projects\/\" target=\"_blank\" rel=\"noopener\">Scott Snibbe<\/a><\/strong>,\u00a0<em>Boundary 
Functions<\/em>\u00a0(1998)<br \/>\nThe\u00a0<a href=\"http:\/\/www.raymondhill.net\/voronoi\/rhill-voronoi.html\" target=\"_blank\" rel=\"noopener\">Voronoi plane\u00a0partitioning algorithm<\/a> is used to illustrate personal space. [<strong><a href=\"https:\/\/www.youtube.com\/watch?v=_Ax4pgtHQDg\">YouTube<\/a><\/strong>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/scott_snibbe_boundary_functions.gif\" width=\"480\" height=\"368\" \/><\/p>\n<p><strong>Brian Knep<\/strong>,\u00a0<em>Healing Series<\/em>\u00a0(2003-2009)<br \/>\nWhere Snibbe explores the meanings made when people interact with the Voronoi algorithm, Knep explores the expressive potential of the body as an input to a <a href=\"https:\/\/pmneila.github.io\/jsexp\/grayscott\/\" target=\"_blank\" rel=\"noopener\">reaction-diffusion<\/a>\u00a0algorithm. [<strong><a href=\"https:\/\/www.youtube.com\/watch?v=PX5kKxH8hLk\">YouTube<\/a><\/strong>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/brian_knep_healing_pool.gif\" width=\"480\" height=\"350\" \/><\/p>\n<p><a href=\"http:\/\/www.flong.com\/projects\/messa\/\" target=\"_blank\" rel=\"noopener\"><strong>Tmema,\u00a0Blonk + La Barbara<\/strong><\/a>,\u00a0<em>Messa di Voce<\/em>\u00a0(2003)<br \/>\nBodies (and voices) interact with simulations to produce both sound and image. 
[<a href=\"https:\/\/vimeo.com\/2892576\"><strong>Vimeo<\/strong><\/a>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/messa_di_voce.gif\" width=\"480\" height=\"270\" \/><\/p>\n<p><strong>Chris O\u2019Shea<\/strong>,\u00a0<a href=\"http:\/\/www.chrisoshea.org\/hand-from-above\"><em>Hand from Above<\/em><\/a>\u00a0(2008)<br \/>\nA giant hand that plays with you on the street. [<a href=\"https:\/\/vimeo.com\/7042266\"><strong>Vimeo<\/strong><\/a>]<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/chris_oshea_hand_from_above.gif\" \/><\/p>\n<hr \/>\n<h2>Shadow Play<\/h2>\n<p><strong><a href=\"http:\/\/www.snibbe.com\/projects\/\" target=\"_blank\" rel=\"noopener\">Scott Snibbe<\/a><\/strong>,\u00a0<em>Make Like a Tree<\/em> (2005) [<a href=\"https:\/\/www.youtube.com\/watch?v=CPFF3-di2PU\"><strong>YouTube<\/strong><\/a>]. 
Check out Snibbe&#8217;s <a href=\"https:\/\/www.snibbe.com\/art\"><em>Screen Series<\/em><\/a>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/scott_snibbe_make_like_a_tree.gif\" width=\"480\" height=\"301\" \/><\/p>\n<p><strong>Rafael Lozano-Hemmer<\/strong>,\u00a0<em>Underscan<\/em> (2005) [<a href=\"https:\/\/www.youtube.com\/watch?v=Bfn14sLJmyU\"><strong>YouTube<\/strong><\/a>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/rafael_lozano_hemmer_underscan.gif\" width=\"480\" height=\"384\" \/><\/p>\n<p><strong>Philip Worthington<\/strong>,\u00a0<em>Shadow Monsters<\/em> (2005) [<a href=\"https:\/\/www.youtube.com\/watch?v=ShHQHAlZ7fA\"><strong>YouTube<\/strong><\/a>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/philip_worthington_shadow_monsters.gif\" width=\"480\" height=\"352\" \/><\/p>\n<p><strong>Golan Levin<\/strong>,\u00a0<em>Interstitial Fragment Processor<\/em> (2007) [<strong><a href=\"https:\/\/vimeo.com\/2340199\">Vimeo<\/a><\/strong>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/golan_levin_interstitial_fragment_processor.gif\" width=\"481\" height=\"273\" \/><\/p>\n<hr \/>\n<h2>Influence of the Kinect<\/h2>\n<p>The Kinect depth sensor obviated many of the hardest problems in vision-based body understanding. 
Within days of its release, it was seized upon by new-media artists eager to explore its possibilities.<\/p>\n<p><strong>Robert Hodgin<\/strong>,\u00a0<em>Body Dysmorphic Disorder<\/em> (2010) [<a href=\"https:\/\/vimeo.com\/17073934\"><strong>Vimeo<\/strong><\/a>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/robert_hodgin_body_dysmorphic_disorder.gif\" width=\"480\" height=\"308\" \/><\/p>\n<p><strong>Karolina Sobecka<\/strong>,\u00a0<em>Sniff<\/em> (2010) [<a href=\"https:\/\/vimeo.com\/13791894\"><strong>Vimeo<\/strong><\/a>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/karolina_sobecka_sniff.gif\" width=\"480\" height=\"276\" \/><\/p>\n<p><strong>Chris Milk<\/strong>\u00a0et al.,\u00a0<em>The Treachery of Sanctuary<\/em> (2012) [<a href=\"https:\/\/www.youtube.com\/watch?v=ElLytAZK-Lc&amp;t=8s\"><strong>YouTube<\/strong><\/a>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/chris_milk_treachery_of_sanctuary.gif\" width=\"480\" height=\"304\" \/><\/p>\n<p><strong>Design-IO<\/strong>,\u00a0<em>Puppet Prototype<\/em> (2010).<br \/>\nJust days after the release of the Kinect, Theo Watson and Emily Gobeille created this quick prototype. 
[<a href=\"https:\/\/vimeo.com\/16985224\"><strong>Vimeo<\/strong><\/a>]<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/designio_puppet_prototype.gif\" width=\"481\" height=\"309\" \/><\/p>\n<p>This led to a commission to produce a larger work,<em> <a href=\"http:\/\/design-io.com\/projects\/PuppetParadeCinekid\/\" target=\"_blank\" rel=\"noopener\">Puppet Parade<\/a><\/em> (2011) [<a href=\"https:\/\/vimeo.com\/34824490\"><strong>Vimeo<\/strong><\/a>]<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/designio_puppet_parade.gif\" \/><\/p>\n<p><strong>Design-IO<\/strong>,\u00a0<em><a href=\"http:\/\/design-io.com\/projects\/NightBright\/\" target=\"_blank\" rel=\"noopener\">Night Bright<\/a><\/em>\u00a0(2011)<br \/>\n<em>Night Bright is an interactive installation of nocturnal discovery where children use their bodies to light up the nighttime forest and discover the creatures that inhabit it. Listening to the creatures\u2019 sounds children can locate them in the forest, as they play a nighttime game of hide and seek. 
<\/em>[<a href=\"https:\/\/vimeo.com\/29193895\"><strong>Vimeo<\/strong><\/a>]<iframe loading=\"lazy\" src=\"https:\/\/player.vimeo.com\/video\/29193895\" width=\"620\" height=\"349\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><strong>Design-IO<\/strong>,\u00a0<em><a href=\"http:\/\/design-io.com\/projects\/WeatherWorlds\/\" target=\"_blank\" rel=\"noopener\">Weather Worlds<\/a><\/em> (2013) [<a href=\"https:\/\/vimeo.com\/69084390\"><strong>Vimeo<\/strong><\/a>]<br \/>\n<em>Weather Worlds is an interactive installation that grants children weather controlling superpowers.<br \/>\nUtilizing a camera and real-time greenscreening, the installation allows children to see themselves immersed in an interactive and dynamic environment. The custom computer vision system tracks the heads, hands, feet and movement of children on the platform and also recognizes gestures. Using their bodies children can conjure a storm, release a twisting tornado or rain down bolts of lightning from their fingertips. There are mighty wind fields to move through, stomping earthquakes, light bending sunshine and blizzards that will make you shiver!<\/em><\/p>\n<p><img decoding=\"async\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/designio_weather_worlds.gif\" \/><\/p>\n<p>One of the core advantages of the Kinect is that it provides a\u00a0<em>skeleton<\/em>\u00a0for the body. This labels the parts of the body so that one knows, for example, the location of the head, the location of the arms, etc. Once you have this, it\u2019s easy to make conceptual transformations based on such identities. 
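<\/p>\n<p>As a minimal illustration (not from the lecture) of why labeled skeletons matter: once joints have names, relationships between body parts can be computed directly. The joint names below follow the common PoseNet convention; the coordinates and the <code>jointAngle<\/code> helper are hypothetical.<\/p>

```javascript
// Hedged sketch: computing an elbow angle from three *named* skeleton
// joints. Joint names follow the common PoseNet convention; the
// coordinates are made-up illustration data, not real sensor output.

// Angle (in radians) at joint b, formed by the segments b->a and b->c.
function jointAngle(a, b, c) {
  const v1 = { x: a.x - b.x, y: a.y - b.y };
  const v2 = { x: c.x - b.x, y: c.y - b.y };
  const dot = v1.x * v2.x + v1.y * v2.y;
  const mag = Math.hypot(v1.x, v1.y) * Math.hypot(v2.x, v2.y);
  return Math.acos(dot / mag);
}

// A hypothetical skeleton: a straight right arm held out horizontally.
const skeleton = {
  rightShoulder: { x: 100, y: 200 },
  rightElbow:    { x: 150, y: 200 },
  rightWrist:    { x: 200, y: 200 },
};

const elbow = jointAngle(
  skeleton.rightShoulder, skeleton.rightElbow, skeleton.rightWrist
);
console.log(elbow.toFixed(2)); // a straight arm gives PI radians: "3.14"
```
<p>An angle like this could drive, say, the bend of a fractal branch: exactly the kind of conceptual transformation that named joints make easy.<\/p>\n<p>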
Here, for example, is\u00a0<a href=\"https:\/\/vimeo.com\/68637847\" target=\"_blank\" rel=\"noopener\"><em>MoMath: Human Tree<\/em><\/a>, a fractal body experience created by design studio, Blue Telescope: [<a href=\"https:\/\/vimeo.com\/68637847\"><strong>Vimeo<\/strong><\/a>]<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/museum_of_math_human_tree.gif\" \/><\/p>\n<hr \/>\n<h2>Some Dance\/Performance and Technology<\/h2>\n<p><strong><a href=\"http:\/\/www.exile.at\/ko\/\" target=\"_blank\" rel=\"noopener\">Klaus Obermaier<\/a><\/strong>,\u00a0<em>Apparition<\/em> (2004) <strong>[<a href=\"https:\/\/www.youtube.com\/watch?v=-wVq41Bi2yE\">YouTube<\/a>]<\/strong><br \/>\nOne of the first uses of augmented projection on a\u00a0computationally-tracked\u00a0body.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/klaus_obermaier_apparition.gif\" width=\"480\" height=\"480\" \/><\/p>\n<p>Obermaier also has a sense of humor, as in his <em>Ego<\/em> installation [<strong><a href=\"https:\/\/www.youtube.com\/watch?v=KzDifurF9wQ\">YouTube<\/a><\/strong>].<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"\" src=\"https:\/\/raw.githubusercontent.com\/golanlevin\/lectures\/master\/lecture_full_body_interactive_art\/images\/klaus_obermaier_ego1.gif\" width=\"480\" height=\"270\" \/><\/p>\n<p><strong><a href=\"http:\/\/chunkymove.com.au\/\" target=\"_blank\" rel=\"noopener\">Chunky Move<\/a><\/strong>,\u00a0<em>Mortal Engine<\/em> (2008) [<strong>YouTube<\/strong>]<br \/>\n<iframe loading=\"lazy\" src=\"https:\/\/www.youtube.com\/embed\/CHKLr_pvj2I\" width=\"620\" height=\"465\" frameborder=\"0\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><strong><a href=\"http:\/\/www.am-cb.net\/projets\/\" target=\"_blank\" rel=\"noopener\">Adrien M \/ Claire 
B<\/a><\/strong>,\u00a0<em>AMCB-introduction<\/em>\u00a0(2013) &amp;\u00a0<em>Pixel <\/em>\u00a0(2014) [<strong>YouTube<\/strong>]<br \/>\n<iframe loading=\"lazy\" src=\"https:\/\/player.vimeo.com\/video\/70849698?color=ffffff\" width=\"620\" height=\"349\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\"><\/iframe><br \/>\n<iframe loading=\"lazy\" src=\"https:\/\/player.vimeo.com\/video\/114767889?badge=0\" width=\"620\" height=\"349\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p>Bill T. Jones &amp; Google Creative Lab: <a href=\"https:\/\/experiments.withgoogle.com\/billtjonesai\"><em>Body, Movement, Language: AI Sketches<\/em><\/a> (2019). <strong>[<a href=\"https:\/\/www.youtube.com\/watch?v=RVyh1ewep84\">YouTube<\/a>]<\/strong><\/p>\n<p><iframe loading=\"lazy\" title=\"Body, Movement, Language: AI Sketches with Bill T. Jones\" width=\"840\" height=\"473\" src=\"https:\/\/www.youtube.com\/embed\/RVyh1ewep84?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" allowfullscreen><\/iframe><\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"https:\/\/www.nytimes.com\/interactive\/2018\/us\/addiction-heroin-opioids.html\" rel=\"nofollow\">&#8220;A Visual Journey Through Addiction&#8221;<\/a>. 
Shreeya Sinha with Zach Lieberman and Leslye Davis.<em> New York Times<\/em>, 12\/18\/2018.\u00a0<a href=\"https:\/\/www.nytimes.com\/2018\/12\/20\/reader-center\/opioid-addiction-graphic-video.html\" rel=\"nofollow\"><em>More Info<\/em><\/a><\/p>\n<div style=\"width: 840px;\" class=\"wp-video\"><video class=\"wp-video-shortcode\" id=\"video-423-1\" width=\"840\" height=\"560\" preload=\"metadata\" controls=\"controls\"><source type=\"video\/mp4\" src=\"https:\/\/golancourses.net\/fall23\/wp-content\/uploads\/2023\/10\/ops-1-fade-1254w.mp4?_=1\" \/><a href=\"https:\/\/golancourses.net\/fall23\/wp-content\/uploads\/2023\/10\/ops-1-fade-1254w.mp4\">https:\/\/golancourses.net\/fall23\/wp-content\/uploads\/2023\/10\/ops-1-fade-1254w.mp4<\/a><\/video><\/div>\n<hr \/>\n<h3>Opportunities, and Previous Student Work<\/h3>\n<p><iframe loading=\"lazy\" title=\"Face-Powered Shooter\" src=\"https:\/\/player.vimeo.com\/video\/187285292?h=10969d63ff&amp;dnt=1&amp;app_id=122963\" width=\"840\" height=\"525\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-ready=\"true\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><span style=\"text-decoration: underline;\"><strong>You could make a body-controlled game.<\/strong><\/span>\u00a0An example is shown above; Lingdong Huang made this \u201c<em><a href=\"https:\/\/vimeo.com\/187285292\">Face Powered Shooter<\/a><\/em>\u201d in 60-212 in 2016, when he was a sophomore. 
Another example,\u00a0<a href=\"https:\/\/twitter.com\/nexusstories\/status\/1023984741001965571\"><em>Face Pinball<\/em><\/a>, is shown below.<\/p>\n<div style=\"width: 640px;\" class=\"wp-video\"><video class=\"wp-video-shortcode\" id=\"video-423-2\" width=\"640\" height=\"360\" preload=\"metadata\" controls=\"controls\"><source type=\"video\/mp4\" src=\"https:\/\/golancourses.net\/fall23\/wp-content\/uploads\/2023\/10\/facepinball.m4v?_=2\" \/><a href=\"https:\/\/golancourses.net\/fall23\/wp-content\/uploads\/2023\/10\/facepinball.m4v\">https:\/\/golancourses.net\/fall23\/wp-content\/uploads\/2023\/10\/facepinball.m4v<\/a><\/video><\/div>\n<p><span style=\"text-decoration: underline;\"><strong>You could make a sound-responsive costume.<\/strong><\/span>\u00a0You can develop a piece of interactive real-time audiovisual performance software (perhaps similar to\u00a0<em><a href=\"https:\/\/www.instagram.com\/p\/BMEyU8ZBtmb\/\">Setsuyakurotaki<\/a><\/em>, 2016, by Zach Lieberman + Rhizomatiks).<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-1566\" src=\"https:\/\/courses.ideate.cmu.edu\/60-212\/s2022\/wp-content\/uploads\/2022\/03\/ezgif.com-video-to-gif-11.gif\" alt=\"\" width=\"600\" height=\"337\" \/><\/p>\n<p><span style=\"text-decoration: underline;\"><strong>You could make a creativity tool, like a drawing program.<\/strong><\/span>\u00a0In 2019, Design junior Eliza Pratt\u00a0<a href=\"http:\/\/cmuems.com\/2019\/60212\/zapra\/10\/30\/zapra-situated-eye\/\">built this eye-tracking drawing program<\/a>\u00a0in 60-212.<\/p>\n<p><iframe loading=\"lazy\" title=\"TYPEFACE 2\" src=\"https:\/\/player.vimeo.com\/video\/9587564?h=6768b1e734&amp;dnt=1&amp;app_id=122963\" width=\"840\" height=\"473\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-ready=\"true\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><a href=\"https:\/\/vimeo.com\/9587564\">Mary Huang<\/a>\u00a0made a project to control the parameters of a typeface 
with her face.<\/p>\n<p><iframe loading=\"lazy\" title=\"Boundary Functions (1998) by Scott Snibbe\" src=\"https:\/\/www.youtube.com\/embed\/_Ax4pgtHQDg?feature=oembed\" width=\"840\" height=\"630\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><strong><span style=\"text-decoration: underline;\">You may capture more than one person<\/span>.<\/strong>\u00a0Your software doesn\u2019t have to be limited to\u00a0<em>just one body.<\/em>\u00a0Instead, it could visualize the relationship (or\u00a0<em>create<\/em>\u00a0a relationship) between two or more bodies (as in Scott Snibbe\u2019s\u00a0<em><a href=\"http:\/\/www.snibbe.com\/projects\/interactive\/boundaryfunctions\/\">Boundary Functions<\/a>\u00a0<\/em>or\u00a0<em><a href=\"https:\/\/www.instagram.com\/p\/BHH3rwhANKg\/\">this sketch by Zach Lieberman<\/a><\/em>). It could visualize or respond to a duet. It could visualize the interactions of multiple people\u2019s bodies, even across the network (for example,\u00a0<a href=\"https:\/\/glitch.com\/~cmuems-skeleton-networked\">one of Char\u2019s templates<\/a>\u00a0transmits shared skeletons, using PoseNet in a networked Glitch application.)<\/p>\n<p><iframe loading=\"lazy\" title=\"Interactive Puppet Prototype with Xbox Kinect\" src=\"https:\/\/player.vimeo.com\/video\/16985224?h=e833a692c6&amp;dnt=1&amp;app_id=122963\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-ready=\"true\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><span style=\"text-decoration: underline;\"><strong>You may\u00a0focus on just\u00a0<em>part<\/em>\u00a0of the body.<\/strong><\/span>\u00a0Your software doesn\u2019t need to respond to the entire body; it could focus on interpreting the movements of a single part of the body (as in Emily Gobeille &amp; Theo Watson\u2019s prototype for\u00a0<em><a href=\"https:\/\/vimeo.com\/16985224\">Puppet Parade<\/a>,\u00a0<\/em>which responds to a single 
arm).<\/p>\n<p><iframe loading=\"lazy\" title=\"Weather Worlds\" src=\"https:\/\/player.vimeo.com\/video\/69084390?h=feefa38e8d&amp;dnt=1&amp;app_id=122963\" width=\"840\" height=\"473\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-ready=\"true\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><span style=\"text-decoration: underline;\"><strong>You may\u00a0focus on how an environment is affected by a body.<\/strong><\/span>\u00a0Your software doesn\u2019t have to re-skin or visualize the body. Instead, you can develop an environment that is affected by the movements of the body (as in Theo &amp; Emily\u2019s\u00a0<em><a href=\"https:\/\/vimeo.com\/69084390\">Weather Worlds<\/a><\/em>).<\/p>\n<p><iframe loading=\"lazy\" title=\"Animating Non-Humanoid Characters with Human Motion Data\" src=\"https:\/\/www.youtube.com\/embed\/h_HAVnDCfkw?feature=oembed\" width=\"840\" height=\"473\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><iframe loading=\"lazy\" title=\"MoCap Head 2\" src=\"https:\/\/www.youtube.com\/embed\/LVUQ33xXdxc?feature=oembed\" width=\"840\" height=\"473\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-1556\" src=\"https:\/\/courses.ideate.cmu.edu\/60-212\/s2022\/wp-content\/uploads\/2022\/03\/animo.gif\" alt=\"\" width=\"691\" height=\"467\" \/><\/p>\n<p><span style=\"text-decoration: underline;\"><strong>You may control\u00a0the behavior\u00a0of something\u00a0non-human.<\/strong><\/span>\u00a0Just because your data was captured from a human, doesn\u2019t mean you must control a human. Just because your data is from a hand, doesn\u2019t mean it has to control a representation of a hand. 
Consider using your data to puppeteer an animal, monster, plant, or even a non-living object (<em>as in this research on \u201c<a href=\"https:\/\/www.youtube.com\/watch?v=h_HAVnDCfkw\">animating non-humanoid characters with human motion data<\/a>\u201d from Disney Research, and in this \u201c<a href=\"https:\/\/www.youtube.com\/watch?v=LVUQ33xXdxc\">Body-Controlled Head<\/a>\u201d (2018) by 60-212 student Nik Diamant<\/em>). Here\u2019s a simple sketch for a quadruped which is puppeteered by your hand (<a href=\"https:\/\/editor.p5js.org\/khanniie\/sketches\/q9O-7zDFx\">here<\/a>).<\/p>\n<p><span style=\"text-decoration: underline;\"><strong>You could make software which is\u00a0<em>analytic<\/em>.<\/strong><\/span> You might instead elect to create an\u00a0\u201cinformation visualization\u201d that presents an ergonomic analysis of the body\u2019s movements over time. Your software could present comparisons of different people making similar movements, or could track the accelerations of movements by a violinist.<\/p>\n<p><iframe loading=\"lazy\" title=\"What You Missed\" src=\"https:\/\/player.vimeo.com\/video\/1609109?h=92aa71cfa7&amp;dnt=1&amp;app_id=122963\" width=\"504\" height=\"336\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-ready=\"true\" data-mce-fragment=\"1\"><\/iframe><\/p>\n<p><span style=\"text-decoration: underline;\"><strong>You could make something altogether unexpected.<\/strong><\/span>\u00a0Above is a project,\u00a0<a href=\"https:\/\/vimeo.com\/1609109\"><em>What You Missed<\/em><\/a>\u00a0(2006), by CMU student Michael Kontopoulos. 
Michael built a custom blink detector, and then used it to take photos of the world that he otherwise missed when blinking.<\/p>\n<div><iframe src=\"https:\/\/player.vimeo.com\/video\/42719594?h=6a114400db&amp;title=0&amp;byline=0&amp;portrait=0\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-ready=\"true\" data-mce-fragment=\"1\"><\/iframe><\/div>\n<p><a href=\"https:\/\/vimeo.com\/42719594\">Cheese<\/a>\u00a0by\u00a0<a href=\"https:\/\/vimeo.com\/cmo\">Christian Moeller<\/a>\u00a0is an experiment in the \u201carchitecture of sincerity\u201d. On camera, six actresses each tried to hold a smile for as long as they could, up to one and a half hours. Each ongoing smile was scrutinized by an emotion-recognition system, and whenever the display of happiness fell below a certain threshold, an alarm alerted the actress to show more sincerity. The performance of sincerity is hard work.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Erkki Kurenniemi,\u00a0DIMI-O\u00a0(1971) Used the body, as tracked by the camera, as the basis of a\u00a0musical instrument. Dimi-O (1971) is based on an optical interface, the purpose of which was to read sheet music graphically. The instrument was played by means of a video camera. 
DIMI-O was also performed by a dancer, whose movements were transformed &hellip; <a href=\"https:\/\/golancourses.net\/fall23\/daily-notes\/october\/10-23\/full-body-interactive-art\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Full-Body Interactive Art&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"parent":421,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-423","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/pages\/423","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/comments?post=423"}],"version-history":[{"count":7,"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/pages\/423\/revisions"}],"predecessor-version":[{"id":493,"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/pages\/423\/revisions\/493"}],"up":[{"embeddable":true,"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/pages\/421"}],"wp:attachment":[{"href":"https:\/\/golancourses.net\/fall23\/wp-json\/wp\/v2\/media?parent=423"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}