Templates for Gestural Input
Code templates for mouse, microphone, and webcam (hands, face, body) input.
Demonstration of mouse interaction techniques.
This demonstrates how various aspects of mouse movement (such as speed, click frequency, proximity to a coordinate, or angle with respect to a coordinate) can be used as elements of an interaction vocabulary:
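For reference, here is a minimal sketch of these ideas (an independent illustration, not the linked template itself), in which mouse speed, click frequency, and the mouse's distance and angle relative to the canvas center drive a single shape:

```javascript
// Minimal p5.js sketch: mouse speed, click frequency, proximity, and angle
// relative to the canvas center are mapped onto a shape's properties.
let clickTimes = []; // timestamps of recent clicks, for click frequency

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(245);
  const cx = width / 2;
  const cy = height / 2;

  // Speed: how far the mouse moved since the previous frame.
  const speed = dist(mouseX, mouseY, pmouseX, pmouseY);

  // Proximity: distance from the mouse to a reference coordinate.
  const proximity = dist(mouseX, mouseY, cx, cy);

  // Angle of the mouse with respect to that coordinate.
  const angle = atan2(mouseY - cy, mouseX - cx);

  // Click frequency: number of clicks during the last second.
  clickTimes = clickTimes.filter((t) => millis() - t < 1000);
  const clicksPerSecond = clickTimes.length;

  // Map these quantities onto visual properties.
  push();
  translate(cx, cy);
  rotate(angle); // the shape points toward the mouse
  fill(map(proximity, 0, 200, 255, 0, true), 100, 200);
  ellipse(0, 0, 30 + speed * 2, 30 + clicksPerSecond * 10);
  line(0, 0, 60, 0);
  pop();
}

function mousePressed() {
  clickTimes.push(millis());
}
```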
Demonstration of microphone interaction techniques (and another clean simple demo).
This demonstrates how the microphone level can be used to control various visual properties of graphical elements, such as their size, color, rotation, strokeWeight, etc.:
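As a point of reference, here is a minimal sketch along those lines; it assumes the p5.sound library is loaded so that p5.AudioIn is available (the linked template may structure things differently):

```javascript
// Minimal p5.js + p5.sound sketch: the microphone level controls the size,
// color, rotation, and strokeWeight of a single square.
let mic;

function setup() {
  createCanvas(400, 400);
  rectMode(CENTER);
  mic = new p5.AudioIn();
  mic.start();
}

function draw() {
  background(250);
  const level = mic.getLevel(); // 0.0 ... 1.0, but usually quite small

  // Map the level onto several visual properties at once.
  const sideLength = map(level, 0, 0.3, 20, 300, true);
  const redness = map(level, 0, 0.3, 0, 255, true);
  const angle = map(level, 0, 0.3, 0, PI, true);

  push();
  translate(width / 2, height / 2);
  rotate(angle);
  fill(redness, 100, 200);
  strokeWeight(1 + level * 40);
  square(0, 0, sideLength);
  pop();
}

function mousePressed() {
  userStartAudio(); // browsers require a user gesture before audio can start
}
```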
Template p5.js code for hand pose tracking.
This project can track multiple hands in a webcam video. The tracker provides an array of hand objects, each of which contains an array of keypoints (x-y points) which indicate specific landmarks on the hand. The tracker is also able to distinguish left hands from right hands.
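The sketch below is a minimal illustration of that structure; it assumes the ml5.js handPose model (the linked template may use a different loader or options):

```javascript
// Minimal hand-tracking sketch, assuming ml5.js handPose: each detected hand
// provides a keypoints array and a handedness label ("Left" or "Right").
let handPose;
let video;
let hands = [];

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, gotHands);
}

function gotHands(results) {
  hands = results; // an array of hand objects
}

function draw() {
  image(video, 0, 0, width, height);
  for (let hand of hands) {
    // Color-code by handedness (property name per ml5 docs; an assumption
    // if your template differs).
    fill(hand.handedness === "Left" ? "lime" : "magenta");
    noStroke();
    for (let kp of hand.keypoints) {
      circle(kp.x, kp.y, 8);
    }
  }
}
```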
p5.js Simple Hand Puppet Example (2024). Graphics are affixed to the tracked hand points.
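Here is a sketch of the puppet idea, building on the hands[] array from the example above: a graphic is affixed to the index fingertip and oriented along the wrist-to-fingertip direction. The keypoint names used here ("index_finger_tip", "wrist") follow the MediaPipe Hands convention and are an assumption; your template may expose keypoints by index instead.

```javascript
// Affix a simple graphic to the tracked index fingertip of each hand.
function drawPuppet() {
  for (let hand of hands) {
    const tip = hand.keypoints.find((kp) => kp.name === "index_finger_tip");
    const wrist = hand.keypoints.find((kp) => kp.name === "wrist");
    if (tip && wrist) {
      // Orient the graphic along the wrist-to-fingertip direction.
      const angle = atan2(tip.y - wrist.y, tip.x - wrist.x);
      push();
      translate(tip.x, tip.y);
      rotate(angle);
      textSize(48);
      textAlign(CENTER, CENTER);
      text("🐦", 0, 0);
      pop();
    }
  }
}
```

Calling drawPuppet() at the end of the previous sketch's draw() overlays the graphic on the live video.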
Template p5.js code for face tracking.
This project tracks a single face in webcam video. In addition to calculating an array of 468 keypoints, the project also provides 13 “named” landmarks (e.g. “nosePt”, “chinPt”, etc.):
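Here is a minimal sketch of that structure, assuming the ml5.js faceMesh model. The keypoint indices used to reconstruct two of the named landmarks (1 for the nose tip, 152 for the chin) follow the MediaPipe FaceMesh topology and are assumptions about how the template's "nosePt" and "chinPt" are derived:

```javascript
// Minimal face-tracking sketch, assuming ml5.js faceMesh.
let faceMesh;
let video;
let faces = [];

function preload() {
  faceMesh = ml5.faceMesh({ maxFaces: 1 }); // option name per ml5 docs
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  faceMesh.detectStart(video, gotFaces);
}

function gotFaces(results) {
  faces = results;
}

function draw() {
  image(video, 0, 0, width, height);
  if (faces.length > 0) {
    const face = faces[0];
    // Draw all of the keypoints.
    noStroke();
    fill(0, 255, 0);
    for (let kp of face.keypoints) {
      circle(kp.x, kp.y, 2);
    }
    // Two "named" landmarks, reconstructed from FaceMesh indices (assumption).
    const nosePt = face.keypoints[1];
    const chinPt = face.keypoints[152];
    fill(255, 0, 0);
    circle(nosePt.x, nosePt.y, 10);
    circle(chinPt.x, chinPt.y, 10);
  }
}
```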
p5.js Simple Mask Example.
Demonstrates how to mount graphics on your face. Based on the face tracker above.
The project is also sensitive to the microphone volume.
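Here is a sketch of the mask idea, building on the faces[] array from the face-tracking sketch above, with a microphone level passed in (for example from mic.getLevel(), as in the earlier microphone sketch): a clown nose is mounted on the nose-tip keypoint and inflated by the microphone level.

```javascript
// Mount a graphic on a face keypoint and scale it with the microphone level.
// The FaceMesh indices (1 = nose tip, 152 = chin) are assumptions, as above.
function drawMask(micLevel) {
  if (faces.length === 0) return;
  const face = faces[0];
  const nosePt = face.keypoints[1];
  const chinPt = face.keypoints[152];

  // Scale the graphic with the apparent size of the face,
  // then exaggerate it with the microphone level.
  const faceSize = dist(nosePt.x, nosePt.y, chinPt.x, chinPt.y);
  const noseDiameter = faceSize * 0.5 + map(micLevel, 0, 0.3, 0, faceSize, true);

  noStroke();
  fill(255, 0, 0);
  circle(nosePt.x, nosePt.y, noseDiameter);
}
```

Calling drawMask(mic.getLevel()) at the end of draw() makes the nose puff up when you speak.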
Template p5.js code for body tracking.
This project tracks bodies in webcam video. It provides 33 landmarks (nose, right hip, left elbow, etc.):
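The following minimal sketch assumes the ml5.js bodyPose model configured to use BlazePose, which reports 33 landmarks; the linked template may load the model differently:

```javascript
// Minimal body-tracking sketch, assuming ml5.js bodyPose with BlazePose.
let bodyPose;
let video;
let poses = [];

function preload() {
  bodyPose = ml5.bodyPose("BlazePose");
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  bodyPose.detectStart(video, gotPoses);
}

function gotPoses(results) {
  poses = results; // one entry per tracked body
}

function draw() {
  image(video, 0, 0, width, height);
  for (let pose of poses) {
    // Each keypoint has x and y coordinates and a name such as "nose" or
    // "left_elbow" (the naming is an assumption; check your template).
    fill(0, 255, 255);
    noStroke();
    for (let kp of pose.keypoints) {
      circle(kp.x, kp.y, 8);
    }
  }
}
```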
p5.js Simple Body Costume Example.
Demonstrates how to mount graphics on your tracked body landmarks:
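Here is a sketch of the costume idea, building on the poses[] array above. The landmark names used here ("nose", "left_shoulder", "right_shoulder") are assumptions; check how your template names its keypoints.

```javascript
// Mount simple "costume" graphics on named body landmarks.
function drawCostume() {
  for (let pose of poses) {
    const nose = pose.keypoints.find((kp) => kp.name === "nose");
    const lSh = pose.keypoints.find((kp) => kp.name === "left_shoulder");
    const rSh = pose.keypoints.find((kp) => kp.name === "right_shoulder");
    if (nose && lSh && rSh) {
      // A "bow tie" stretched between the shoulders...
      stroke(200, 0, 100);
      strokeWeight(12);
      line(lSh.x, lSh.y, rSh.x, rSh.y);
      // ...and a party hat scaled to the shoulder width, above the nose.
      const shoulderW = dist(lSh.x, lSh.y, rSh.x, rSh.y);
      noStroke();
      fill(255, 200, 0);
      triangle(
        nose.x - shoulderW * 0.2, nose.y - shoulderW * 0.5,
        nose.x + shoulderW * 0.2, nose.y - shoulderW * 0.5,
        nose.x, nose.y - shoulderW
      );
    }
  }
}
```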
Advanced/Older Templates (Note: Possibly Obsolete)
Here is an ADVANCED code template for Hand, Face, and Body Capture with Handsfree.js:
(ALERT: this is an advanced template that may present some pain points for beginners.) Note that this demo uses an older tracking library which takes a very long time to load, and may be obsolete.
Improved & Simplified Hands, Body & Face Tracker Template. This is a simplified demo showing how you can access a variety of points by name (“nosePt”, “chinPt”, etc.). Note that this demo uses an older tracking library which takes a very long time to load, and may be obsolete.
This “Frog” demo illustrates how you can use math to position shapes in interesting ways. The size of the frog’s eyes is controlled by raising your eyebrows; the width of the face is controlled by smiling; and the eyes look at your index finger if it’s visible. Note that this demo uses an older tracking library which takes a very long time to load, and may be obsolete.
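For reference, here is a small standalone sketch of the eye-gaze math alone (the pupil is offset toward a target but clamped inside the eye); in this version the target is simply the mouse rather than a tracked fingertip:

```javascript
// A pupil that looks toward a target point while staying inside the eye.
function drawLookingEye(eyeX, eyeY, eyeR, targetX, targetY) {
  const angle = atan2(targetY - eyeY, targetX - eyeX);
  const offset = min(eyeR * 0.4, dist(eyeX, eyeY, targetX, targetY));
  fill(255);
  circle(eyeX, eyeY, eyeR * 2);
  fill(0);
  circle(eyeX + cos(angle) * offset, eyeY + sin(angle) * offset, eyeR * 0.8);
}

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(170, 220, 140);
  // Two frog eyes that follow the mouse; in the real demo the target would
  // be the tracked index fingertip, and the eye radius would be driven by
  // an eyebrow-raise measurement.
  drawLookingEye(150, 150, 40, mouseX, mouseY);
  drawLookingEye(250, 150, 40, mouseX, mouseY);
}
```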
Connie’s quadruped template: using hands for puppeteering (sketch):
Arm (compound rotations; see the combined sketch after this list):
- Zeno’s Interpolation Demo (Position)
- Zeno’s Interpolation (Color)
- Arm • p5 Arm with Sinusoid
- Reach 1
- Follow 2
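Below is a combined sketch of the ideas named in this list (an interpretation, not the linked demos themselves): a two-segment arm drawn with compound rotations, a sinusoidal elbow, and Zeno-style interpolation easing the shoulder toward the mouse.

```javascript
// A two-segment arm: compound rotations, a sinusoidal elbow, and
// Zeno-style easing of the shoulder toward the mouse direction.
let shoulderAngle = -Math.PI / 2; // current shoulder angle, eased each frame

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(245);
  const baseX = width / 2;
  const baseY = height;

  // Zeno-style interpolation: move a fraction of the way toward the target
  // each frame (the same idea as x += (targetX - x) * 0.1 for position).
  const targetAngle = atan2(mouseY - baseY, mouseX - baseX);
  shoulderAngle += (targetAngle - shoulderAngle) * 0.1;

  // A sinusoidal wave for the elbow joint.
  const elbowAngle = sin(millis() / 500.0) * 0.6;

  // Compound rotations: each joint's transform is applied on top of the
  // previous one, so the forearm inherits the shoulder's rotation.
  push();
  translate(baseX, baseY);
  rotate(shoulderAngle);
  strokeWeight(8);
  stroke(60);
  line(0, 0, 120, 0); // upper arm
  translate(120, 0);
  rotate(elbowAngle);
  line(0, 0, 90, 0); // forearm
  fill(200, 0, 100);
  noStroke();
  circle(90, 0, 20); // "hand"
  pop();
}
```

The key line is `shoulderAngle += (targetAngle - shoulderAngle) * 0.1;` — moving a fixed fraction toward the target each frame is the same “Zeno” easing used in the position and color demos.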