dechoes – 2D physics

This gargle-activated sketch can be found online here.


I cannot say that this warmup had any particularly important content driving it; I was mainly interested in getting used to Box2D and merging it with other interactive qualities. I first tried a few approaches — face tracking (since I wanted the water to be generated through a gargling gesture), mouseIsPressed, and mouseX/mouseY — and finally settled on having it be voice-activated. The size and quantity of the water droplets spurting out of the faucet are driven by the level of the audio, generated by a gargling spectator (although there is no sound in the documentation, one could imagine me frantically gargling alone in my bedroom at 3am).
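In sketch form, the audio-to-droplet mapping might look something like this (function name, threshold, and ranges are illustrative, not the actual sketch's values):

```javascript
// Map a microphone amplitude (0..1) to droplet spawn parameters.
// Below `threshold`, nothing spawns; louder gargling yields more,
// larger drops. All names and ranges here are illustrative guesses.
function dropletsForLevel(level, threshold = 0.05) {
  if (level < threshold) return { count: 0, radius: 0 };
  const excess = Math.min((level - threshold) / (1 - threshold), 1);
  return {
    count: 1 + Math.floor(excess * 9), // 1..10 droplets per frame
    radius: 2 + excess * 10            // 2..12 px
  };
}
```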

The creation of this assignment was heavily supported by Daniel Shiffman’s Box2D documentation.



Inspired by the effect of pendulum waves, I made an iPhone app with Unity that simulates this effect using the phone’s accelerometer.

My goals were to recreate the visual effect of diverging/converging patterns of pendulums swaying at different frequencies, and to introduce interaction via the phone’s accelerometer. The final product does not recreate the diverging/converging patterns, yet it presents a mesmerizing wave pattern with the threads of the pendulums and produces sound that corresponds to the pattern.
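For reference, the classic pendulum-wave trick is to give pendulum i a length such that it completes a fixed whole number of swings per cycle, so the pattern diverges and re-converges every T seconds. A sketch of that calculation (the cycle time and swing counts below are the standard demo values, not necessarily this app’s):

```javascript
// Pendulum-wave lengths: pendulum i completes (base + i) full swings
// in one cycle of T seconds. From T = n * 2π√(l/g) ⇒ l = g(T/(2πn))².
// T = 60 s and base = 51 are the values from the classic physical demo.
function pendulumLengths(count, T = 60, base = 51, g = 9.81) {
  const lengths = [];
  for (let i = 0; i < count; i++) {
    const n = base + i; // swings per cycle for this pendulum
    lengths.push(g * Math.pow(T / (2 * Math.PI * n), 2));
  }
  return lengths; // meters, strictly decreasing
}
```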

My first iteration:

First iteration

My second iteration:

Second iteration

Although not originally planned, I connected the pendulums and the anchor with gradient threads, which created a very mesmerizing wave-like effect. For a more interesting interactive experience, I added sound, such that each pendulum makes a click when it returns to the center axis.
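The click-on-center-axis trigger can be sketched as a sign-change test on the pendulum angle between frames (JavaScript here for illustration only; the project itself is in Unity):

```javascript
// Returns a per-pendulum detector: fires once whenever the angle's sign
// flips between frames, i.e. the pendulum crosses the center axis.
// An exact zero is treated as a reset rather than a crossing.
function makeCrossingDetector() {
  let prev = 0;
  return function crossedCenter(angle) {
    const crossed =
      prev !== 0 && angle !== 0 && Math.sign(angle) !== Math.sign(prev);
    prev = angle;
    return crossed; // play the click sound when this is true
  };
}
```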

The final version:

Final version

Jacqui Fashimpaur and Alexander Woskob helped me with Unity.


Alphabet Blocks & Balloons

These alphabet blocks and balloons reveal when they’ve been arranged into a word. The user types on their keyboard to create blocks and balloons for each letter.

With this project, I wanted to develop a playful application that would be appealing to children and that had the potential to be broadened to encompass several languages at once. This 2-D physics game, where blocks can be brought into the world with the keyboard and floated away on balloons, does that, to some extent. The list of English words that it pulls from could be broadened to include other languages that use Roman alphabet letters as well, allowing the discovery not only of words you didn’t know were there, but of words you didn’t know. Unfortunately, one of the major obstacles to adding these other dictionaries right now is optimization: my code scans the English dictionary very frequently, which slows the app considerably.
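One common fix for that kind of bottleneck is to load each word list into a hash set once at startup, so each membership check is O(1) instead of a linear scan over the dictionary. A sketch, with illustrative word lists:

```javascript
// Build a single Set from any number of word lists (one per language),
// then check candidate block arrangements against it in constant time.
// The word lists below are tiny placeholders, not real dictionaries.
function makeWordChecker(...wordLists) {
  const words = new Set();
  for (const list of wordLists) {
    for (const w of list) words.add(w.toLowerCase());
  }
  // `letters` is an array of single characters, e.g. from adjacent blocks.
  return (letters) => words.has(letters.join("").toLowerCase());
}
```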


Here are some pages from my sketchbook:



Matter.js + Webcam Hand Tracking: Real Hand Puppeteering

This project allows users to puppeteer a rag doll with their real hands in the browser, by gesturing in front of a webcam.

^ an animated GIF showing a play session

I came up with the idea because it seems to me that human body tracking and physics engines combine well. Maybe it’s because we have always wanted to touch and feel objects.

I never learned to puppeteer in real life. I thought rag doll physics might simulate it well, so it’s also a chance to try out puppeteering virtually.

Hand tracking has been around for a long time, but I wanted to make a new tracker in JS for the browser so everyone can play without acquiring hardware or software.

I used matter.js for the physics engine, mainly because I wanted to try something new. In retrospect, maybe box2d.js would have worked better, although matter.js seems to have an easier API than box2d.js’s automatically ported C-style code.

^ screenshot showing debug screen and camera view.

Custom Software for Hand Tracking in the Browser

Demo of finger tracking available on Glitch:

^ Screenshot showing detection and tracking of multiple hands and fingers

OpenCV.js is used to write all the computer vision algorithms. I used HSV ranges, convex hulls, and similar techniques to find fingers in the webcam image. The approach makes a few assumptions:


  • The face and hands of the same person are at least somewhat similar in color
  • The background is not exactly the same color as skin
  • The person is relatively near the camera, and their face is visible
  • The person is not totally naked
  • The person is not wearing gloves or covering their face
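Under assumptions like these, the first stage — a skin-color mask in HSV space — can be sketched as follows. The threshold band here is a rough guess, not the project’s actual values; the real pipeline uses OpenCV.js (`cv.cvtColor` etc.) and would calibrate per scene:

```javascript
// Convert an RGB pixel (0..255 per channel) to HSV (h in degrees,
// s and v in 0..1) — the standard hexcone conversion.
function rgbToHsv(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b), min = Math.min(r, g, b), d = max - min;
  let h = 0;
  if (d !== 0) {
    if (max === r) h = ((g - b) / d) % 6;
    else if (max === g) h = (b - r) / d + 2;
    else h = (r - g) / d + 4;
    h *= 60;
    if (h < 0) h += 360;
  }
  return { h, s: max === 0 ? 0 : d / max, v: max };
}

// Illustrative skin band: reddish-yellow hue, moderate saturation,
// decent brightness. Real thresholds depend on lighting and skin tone.
function isSkin(r, g, b) {
  const { h, s, v } = rgbToHsv(r, g, b);
  return h <= 50 && s >= 0.15 && s <= 0.7 && v >= 0.3;
}
```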

I’ve worked with OpenCV in C++ and Python, but hadn’t used the JavaScript port, so this time I gave it a try. I think the speed is actually OK, and the API is almost the same as in C++.

^ Testing the algorithm against different backgrounds and lighting situations to make sure it is robust.


I think the result is quite fun, but I’m most bothered by matter.js’s rag doll simulation (which I based on their official demo). Sometimes the rag dolls fly away for no apparent reason. One possibility is that I’m missing a parameter that I need to tweak.

Another problem is that hand tracking lowers the FPS a lot. With only physics or only hand tracking it’s pretty smooth, but with both, things start to get slow.
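One standard mitigation (not something the project necessarily does) is to run the expensive tracker only every Nth frame and reuse the last result in between, while the physics engine keeps stepping every frame. A sketch:

```javascript
// Wrap an expensive per-frame function so it actually runs only every
// `every` frames; intermediate frames reuse the most recent result.
function makeThrottled(fn, every = 3) {
  let frame = 0, last = null;
  return (...args) => {
    if (frame++ % every === 0) last = fn(...args);
    return last; // stale on in-between frames, which is fine for tracking
  };
}
```

In a draw loop, `const hands = throttledTrack(frameImage);` would then feed slightly stale hand positions into the rag doll constraints, trading a little latency for frame rate.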

In terms of puppeteering, I was only able to make the puppet jerk its arm or move a leg — no complex movements such as walking or punching. But I think it’s still interesting to experiment with.

In the future I can also add other types of interactions to the system, for example shooting stuff from fingers, grabbing/pushing objects, etc.

Video Documentation



For my 2D physics interaction, I made a squishy frog head that drops through platforms controlled by the mouse. A collision with each platform causes the frog to play a different note. It was created with Processing, Daniel Shiffman’s Box2D-for-Processing, and the SoundCipher library. 

I struggled with coming up with an idea for this assignment because I wanted to make good use of the physics library, but wasn’t sure how to make something interesting and fun to interact with. At first the frog was just a rigidbody, but I took inspiration from this video to make a circle out of distance joints instead. I think the squishiness really improved the assignment, and I’m glad that it allowed me to play with the physics a bit more.
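The circle-of-distance-joints construction amounts to placing n nodes evenly on a circle and joining each to its neighbor with a joint whose rest length is the initial spacing. A geometric sketch of that setup (not the actual Processing/Box2D code):

```javascript
// Lay out n soft-body nodes on a circle and describe the neighbor
// distance joints. In the real sketch each node would be a Box2D body
// and each entry in `joints` a DistanceJoint with the given rest length.
function blobJoints(n, radius) {
  const nodes = [];
  for (let i = 0; i < n; i++) {
    const a = (2 * Math.PI * i) / n;
    nodes.push({ x: radius * Math.cos(a), y: radius * Math.sin(a) });
  }
  // Rest length = chord between adjacent nodes on the circle.
  const rest = 2 * radius * Math.sin(Math.PI / n);
  const joints = nodes.map((_, i) => ({ a: i, b: (i + 1) % n, length: rest }));
  return { nodes, joints };
}
```

Stiffer blobs usually also add joints across the circle (node i to i + n/2) so the shape can’t collapse; the squishiness comes from tuning the joints’ frequency and damping.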


I made a bread-eating race game with face tracking. I think I’ve mostly accomplished what I set out to do: the game works as desired, and the face tracking works better than I’d hoped. Some details could still be added — for example, while the dogs hang and gnaw on the baguettes, the baguettes should swing with the motion, or rotate instead of just translating with the rope — but with time constraints, this is it. Ideally I would upload this to OpenProcessing or somewhere else so that people could play it, but for OpenProcessing at least, I’m not sure how to combine all my files or how to deal with my image files.




This project is a slow tag game that uses proximate greyscale colors.
Instructions: you move the big cube slowly around the environment, trying to locate and bump into the other big cube to shift the colors.
It aims to induce illusions and test the limits of patience. The physics library and the tag game work together to create a borderline organic impression. The minimal color difference, the slow frame rate, and the limit on how fast you can move your avatar encourage a slow pace of play and observation. The shifting color successfully makes the experience more disorienting and straining. The project is made with Dan Shiffman’s Box2D for Processing library.


Bouncy creatures are always fun — that’s where my brain went first with this assignment. I had previously made a springy pigeon that was incredibly satisfying to play with, so I wondered: could I build on that with Box2D?

(background SVG art not made for this assignment; pigeon remade with Matter.js soft bodies, then redone with constraints)


Thoughts on physical-device-driven interactions — applying web events (device orientation and device motion) to the forces in the Matter.js engine.

Variation on springs and bouncy birds

But then I figured I had a week, so I should take more time to think of more interesting inputs and outputs rather than committing to something I already knew would have a satisfying, albeit rather expected, interaction —

  • General ideas that would make good use of physics
  • Thoughts on input methods / when and why some would be more appropriate than others
  • Trying to integrate input with interesting output

And what about an idea that I could build on later? Maybe the beginning of a series of things?

Or I should just think of it as which simulations do I want to do? Because the fluid simulation library looks so fun, but I haven’t figured out a narrative/interaction/input/output that is compelling and justifies it. But again, this is an exercise so maybe I should just do it because I want it.



…..5 days later….



Ok. So I turned the ideas around in my head, and something just made me really happy: thinking about a future where I make a collection of different games for different muscle groups. I imagined a future where the opponents in the exercise games were actually other users, or friends you’ve challenged remotely, and then it’s like crowdsourcing human computation to generate new/unique emergent behavior for the opponents every time you play. There’s an app that plays zombie noises to make a regular jog feel like a chase, but that sort of gamification for motivation gets old pretty quickly. A concern I do have, though, is… well, weren’t Wii games essentially games that made you move? But they fell out of favor…..hmm……….

….ok, had an interesting conversation with Jacquar and Gray about why the Wii felt gimmicky and eventually lost favor vs. traditional minimal-movement game controls, touching on Wii physical motion in games vs. motion in VR games. There’s an interesting dichotomy to analyze here. Will circle back on these thoughts after I make an MVP that doesn’t look like this:

Pigeon with Matter.js physics gone wrong from Marisa Lu on Vimeo.

So it started off with recreating the physics in Matter.js. I figured it would be simple enough to get a chain mechanism for the pigeon leg pinned to a particular location, and then! Then it had an unexpectedly animate reaction that happens to self-sustain infinitely.

…..I wonder, does this ^ count as emergent behavior? The way a double pendulum might, as a chaotic system.


So back to an initial idea: a pigeon hanging precariously over an ocean of blood, relying on the user jerking the ‘spring’ (their device) up to keep the poor bird out of the shark’s jaws.

Bloodbath from Marisa Lu on Vimeo.

Later I tied it to device orientation, and any acceleration got converted into an applied force on the bodies in the Matter engine. (Kinda — and then I broke it while figuring out some browser issues. But yay, if y’all ever need a full-screen browser, I highly recommend BrowserMax.)
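The acceleration-to-force step is essentially F = m·a with a scale factor and a flipped y-axis (device y points up, canvas y points down). The scale and axis mapping below are guesses for illustration, since the real values depend on the scene’s units:

```javascript
// Convert a devicemotion-style acceleration reading into a force vector
// suitable for something like Matter's Body.applyForce. `scale` is an
// arbitrary tuning constant, not a value from the actual project.
function motionToForce(accel, mass, scale = 0.001) {
  return {
    x: accel.x * mass * scale,
    y: -accel.y * mass * scale // flip: device y is up, canvas y is down
  };
}
```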

Feel free to try it in a device browser; I’d recommend BrowserMax for fullscreen.


Bubble Pop – Pinch and Smash the Bubbles!

I began this project thinking about the satisfaction people get from popping and smashing squishy things. Whether it is popping pimples and blisters or exploding fruit, we love to pop things. Gallagher has made an entire career out of it.

I was pleased with how the LiquidFun physics engine models the behavior of popping. I created bubbles out of a ring of rectangles connected by revolute joints, encapsulating loose particles. When pierced or pinched, these bubbles build internal pressure until the skin collapses catastrophically and the contents explode outward. The effect is mesmerizing, and there are endless types of bursts the user can discover through experimentation.
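Geometrically, the ring of rectangles is the set of sides of a regular polygon, with each side joined to its neighbors at the shared corners. A sketch of that layout (pure geometry, not LiquidFun code — in the real project each entry would become a Box2D-style body plus a joint):

```javascript
// Describe n rectangle segments forming the "skin" of a bubble of the
// given radius: each segment is one side of a regular n-gon, centered
// at the apothem and rotated so its long axis is tangent to the circle.
function bubbleSegments(n, radius) {
  const len = 2 * radius * Math.sin(Math.PI / n);  // side length (chord)
  const apothem = radius * Math.cos(Math.PI / n);  // center-to-side distance
  const segs = [];
  for (let i = 0; i < n; i++) {
    const theta = (2 * Math.PI * (i + 0.5)) / n;   // angle of side midpoint
    segs.push({
      x: apothem * Math.cos(theta),
      y: apothem * Math.sin(theta),
      angle: theta + Math.PI / 2, // side is perpendicular to the radius
      length: len
    });
  }
  return segs;
}
```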

Live Version: