The inspiration for this project was my love for the rain and the snow. I find them to be rather calming and I just enjoy experiencing that weather. Thus, my goal for this project was to create a window to that experience.
Using Daniel Shiffman’s The Nature of Code as a guide, I built this with the Box2D for Processing library. It was a bit technically challenging, since it’s been a while since I used Processing and the trigonometry of Box2D’s world was hard to get right. Nevertheless, I feel that my goals for this project were met, though I think it would be much stronger if I created or found some ambient music to pair with it.
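The tricky trigonometry mentioned above often comes down to Box2D's coordinate conventions. A minimal sketch of that bookkeeping, written in JavaScript for illustration (the library itself runs in Java/Processing) and assuming an arbitrary pixels-per-meter scale and window height:

```javascript
// Box2D simulates in meters with the y-axis pointing up, while Processing
// draws in pixels with the y-axis pointing down. SCALE and HEIGHT are
// illustrative assumptions, not values from the project.
const SCALE = 10;    // pixels per Box2D meter
const HEIGHT = 400;  // sketch window height in pixels

function worldToPixels(wx, wy) {
  return { x: wx * SCALE, y: HEIGHT - wy * SCALE };
}

function pixelsToWorld(px, py) {
  return { x: px / SCALE, y: (HEIGHT - py) / SCALE };
}

// Flipping the y-axis also flips the sense of rotation: a positive
// (counter-clockwise) Box2D angle is drawn as a negative screen rotation.
function worldAngleToScreen(angle) {
  return -angle;
}
```

The Box2D for Processing library wraps conversions like these in helper functions, which is part of what makes it convenient.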
I wasn’t sure if I wanted this project to be horizontal or vertical. I like both orientations, but if I were to display this on a wall I would use a very long horizontal orientation.
A dog hanging from a chain of hands, where the in-game gravity is driven by the gyroscope data of the mobile device. On impact with a wall, small creatures pop out!
Although I had originally wanted to build a dog-builder game (depicted below), I ended up with something rather different. I’m still happy with how it turned out, because I learned a lot about Unity’s physics engine and still made something that I find novel to interact with. At first, I wasn’t fond of how the body parts of the dog could shift out of place relative to one another and distort the dog’s body, but now I’m happy that it adds an element of surprise and makes the dog more dynamic.
My rough draft of the project didn’t include the fluffy, springy walls and instead had rigid, unmoving walls. That iteration of the app looked very violent because it was essentially just a dog smashing against the walls. However, I received a lot of good feedback during crit (such as fluffy walls, springs on the wall colliders, and a blinking animation for the dog’s face) that made the game a lot more dog-friendly.
Thank you to Lukas for help with the dog’s torso shader, to Tat for her help on the visual look, and to Grey and Aman for their advice on the swinging dog!
I had originally hoped to make a dog-builder application, as depicted below. Because it was physics-based, I thought that using a real mobile device’s motion to influence gravity would be a neat interaction, so I built most of my program around the idea of a blob of flesh swinging around on a rope. Users would then be able to add dog parts to the blob, creating many permutations of a dog.
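The tilt-to-gravity interaction reduces to a small mapping. A hedged sketch in JavaScript rather than Unity's C# (in Unity this would roughly correspond to assigning the device's `Input.acceleration` reading to `Physics2D.gravity`); the normalized tilt inputs and the constant are illustrative assumptions:

```javascript
// Map normalized accelerometer readings (-1..1 per axis) to a gravity
// vector, so tilting the device "pours" the physics world sideways.
const G = 9.81; // standard gravity magnitude, m/s^2

function tiltToGravity(tiltX, tiltY) {
  return { x: tiltX * G, y: tiltY * G };
}
```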
Early prototype of the app; I built the dog’s body from colliders in Unity.
“Interpellation is a process, a process in which we encounter our culture’s values and internalize them. Interpellation expresses the idea that an idea is not simply yours alone (such as ‘I like blue, I always have’) but rather an idea that has been presented to you for you to accept.”
What I ended up making is a face-tracking interaction (using clmtrackr.js) where every time the viewer opens their mouth, the word ‘interpellation’ spills out and bounces around until the mouth closes. I thought this would be a humorous way to engage with a relatively simple physics interaction. The words have a floor and ceiling they interact with, and I got moderate control over the manner, speed, and animation style in which they bounced around and interacted with the other word-rectangles. Overall, I wish I had been able to get further into physics systems, but I am happy with the progress I made in this (to me) new creative coding environment. I am interested in future possibilities of exploring face-tracking, too.
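The floor-and-ceiling bounce described above can be sketched in a few lines. This is a minimal illustration, not the author's code; the gravity, restitution, and boundary values are assumptions:

```javascript
// Each word-rectangle falls under gravity and reflects off a floor and
// ceiling, losing a little energy on each bounce.
const GRAVITY = 0.5;  // per-frame downward acceleration (pixels/frame^2)
const BOUNCE = 0.8;   // restitution: fraction of speed kept after a bounce
const CEILING = 0;    // top boundary (y grows downward, as on a canvas)
const FLOOR = 300;    // bottom boundary

function step(word) {
  word.vy += GRAVITY;
  word.y += word.vy;
  if (word.y > FLOOR) { word.y = FLOOR; word.vy *= -BOUNCE; }
  if (word.y < CEILING) { word.y = CEILING; word.vy *= -BOUNCE; }
  return word;
}
```

Tuning `GRAVITY` and `BOUNCE` is what gives "moderate control over the manner/speed" of the motion.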
My goal for this project was to play with the concept of what 2D really is by showing that our eyes use certain cues, such as size and shade, to infer depth and 3D space, even though everything is ultimately reduced to a 2D image on our retinas. This fallacy of depth intrigued me, and I wanted to see how I could use certain visual cues to my advantage in a way that made the viewer interact with a falsehood.
After a few iterations, I felt that the way the balls grew and shrank had a biological feel to it and seemed sinusoidal in nature, so I ended up spending hours finding the right pattern that would give a feeling of depth without seeming too rigid.
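A sinusoidal size pattern like the one described can be sketched as follows. This is an illustrative sketch, not the project's code; the parameter names and values are assumptions:

```javascript
// Each ball's radius oscillates smoothly around a base size. Giving every
// ball its own phase offset keeps the ensemble from pulsing in lockstep,
// which would feel rigid rather than biological.
function radiusAt(t, baseRadius, amplitude, phase) {
  return baseRadius + amplitude * Math.sin(t + phase);
}
```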
I tried to add another component where, on a mouse click, the underlying sinusoidal equation would be convolved with a Gaussian to add another level of interaction, but constantly applying the convolution was too computationally heavy for my machine.
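For context, a discrete Gaussian convolution of the kind described might look like the sketch below (an illustration of the general technique, not the discarded code; kernel size and sigma are assumptions). Running a pass like this over every ball on every frame is O(n·k), which is consistent with it being too heavy to apply constantly:

```javascript
// Build a normalized 1D Gaussian kernel.
function gaussianKernel(size, sigma) {
  const half = Math.floor(size / 2);
  const k = [];
  let sum = 0;
  for (let i = -half; i <= half; i++) {
    const v = Math.exp(-(i * i) / (2 * sigma * sigma));
    k.push(v);
    sum += v;
  }
  return k.map(v => v / sum); // normalize so the signal's mean is preserved
}

// Convolve a signal with the kernel, clamping indices at the edges.
function convolve(signal, kernel) {
  const half = Math.floor(kernel.length / 2);
  return signal.map((_, i) =>
    kernel.reduce((acc, kv, j) => {
      const idx = Math.min(signal.length - 1, Math.max(0, i + j - half));
      return acc + kv * signal[idx];
    }, 0)
  );
}
```

Precomputing the kernel once (rather than per frame) is the usual first optimization for this kind of load.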
This project was possible because of Daniel Shiffman’s Box2D documentation.
My goals for this assignment were to create a simple but pleasing fidget and to familiarize myself with open-source libraries in the process. I particularly wanted to explore shape creation and contact listening in JBox2D, rendering in Processing, and sound generation through Java APIs.
I was not able to explore the libraries to the extent I wanted to. I attempted to use masking and filtering while rendering with Processing but discarded that option due to bugs. I was also unable to add sounds, which would have added to the fidget-like quality of this work. However, the interactions I implemented between the colliding, color-shifting shapes were pleasing, so I believe I attained my goal of creating a satisfying fidget.
I used JBox2D, Daniel Shiffman’s Box2D for Processing library and Processing for this project.
My goal for this project was to create a visualization of period cramps. I’ve experienced first-hand many cramps so intense that I’ve broken into a cold sweat, vomited, and tried to chase the extreme pain with liquid-gel Advils.
In this visualization, clicking spawns blue pills. The blood particles disappear when they make contact with the capsules, but the pills themselves also disappear over time.
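The two lifecycle rules described (particles removed on contact, pills decaying on a timer) can be sketched as one update step. This is an illustrative sketch, not the project's code; the data shapes and the circle-contact test are assumptions:

```javascript
// One frame of the lifecycle logic: remove any particle touching a pill,
// then age every pill and drop the ones whose life has run out.
function stepWorld(particles, pills) {
  const alive = particles.filter(p =>
    !pills.some(pill => Math.hypot(p.x - pill.x, p.y - pill.y) < pill.r));
  const remaining = pills
    .map(pill => ({ ...pill, life: pill.life - 1 }))
    .filter(pill => pill.life > 0);
  return { particles: alive, pills: remaining };
}
```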
This project turned out differently than I initially imagined. I’d hoped to create more “blood clots” by linking the red polygons together, but ran into issues with the frame rate dropping too low. Though this sketch is missing many nuances of the experience it’s based on, I’m relatively happy with how it turned out.
For this project I wanted to develop something closely aligned with a portion of my senior project, which deals with the experience of dreaming: specifically, the sensation of needing or wanting to do something but being frustrated by your inability to do it; the harder you focus, the harder the thing you want to do becomes.
I took one of many dreams that I recorded during a 3-month bout of insomnia and decided to re-create it using matter.js as my physics engine of choice.
Initially, I had plans for many interactions based in matter.js, but familiarizing myself with the library took a bit longer than expected.
This is PuppetBot. He’s a 2D marionette controlled by LeapMotion.
I had this idea while brainstorming ways to make interactions with the physics engine more engaging than just clicking. Puppets like this are familiar to people, so I thought the interface would be intuitive. The physics engine I used is Box2D, and the code is all pretty simple: I have a set of limbs attached to each other by joints, and some ropes (really just lots of little rectangles) connecting those limbs to points whose coordinates are determined by the coordinates of your actual fingertips. [I will put my code on GitHub soon, but I need to go back through it and make sure everyone is credited appropriately first.]
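A rope built from a chain of small segments, as described, can be sketched outside Box2D with a simple position-based distance constraint (this is a stand-in for the joint setup, not the author's code; the segment count and length are assumptions):

```javascript
// A rope as a chain of points with a fixed segment length. Each frame, the
// top point is pinned to an anchor (e.g. a tracked fingertip) and every
// following point is pulled back to exactly one segment-length from its
// neighbor, so motion at the anchor "waves" down the chain.
const SEGMENTS = 10; // number of points in the chain
const LENGTH = 15;   // rest length of each segment, in pixels

function makeRope(x, y) {
  return Array.from({ length: SEGMENTS }, (_, i) => ({ x, y: y + i * LENGTH }));
}

function satisfy(rope, anchorX, anchorY) {
  rope[0].x = anchorX;
  rope[0].y = anchorY;
  for (let i = 1; i < rope.length; i++) {
    const dx = rope[i].x - rope[i - 1].x;
    const dy = rope[i].y - rope[i - 1].y;
    const d = Math.hypot(dx, dy) || 1;
    const excess = (d - LENGTH) / d;
    rope[i].x -= dx * excess; // move the point back to rest length
    rope[i].y -= dy * excess;
  }
  return rope;
}
```

Longer segments mean fewer constraint links between the hand and the puppet, which is why lengthening them (as described below) makes the puppet respond faster.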
A lot of the decisions I made for this project were in the interest of making the puppet easier to use. For example, I lengthened the segments in the ropes so that movements didn’t take as long to “wave” down the rope before they affected the puppet. This is also why I made the ropes fairly short, instead of having them permanently reside above the top of the window, as I had originally planned. I made the body parts as heavy as I could so they wouldn’t bounce all over the place, but if I made them too heavy they strained and broke the ropes. I played around with friction, density, and restitution settings a lot, trying to find the perfect values, but to no avail. I did a few other things for purely aesthetic reasons, like making it so that the knees never bend backwards (this happened a lot before, and I didn’t think it looked good). I went with the robot design because I thought it would fit best with the jerky, unbalanced movement I was bound to get. I think it looks like he’s short-circuiting whenever the physics gets broken:
Ultimately, PuppetBot is not as easy to control as I would have liked, but he’s still pretty fun to play with. And all things considered, I’m not much better with real marionettes, either…
Arc uses Unity’s Visual Effect Graph, a compute-shader particle simulator.
I’ve been giving thought recently to the usefulness and necessity of visually connecting structures to communicate the interrelation of material mechanisms. I was interested in exploring how connective visual structures might emerge out of disparate particles, with an eye toward eventually applying the work in 3D within a VR interface.
I’m quite happy with the results: mapping the strength of the gravitational pull to the distance between the points produces emergent, lively structures that are continually surprising.
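One plausible reading of that mapping, sketched as plain math rather than a VFX Graph (the linear scaling and the constant `k` are assumptions, not the project's actual formula):

```javascript
// Attraction whose magnitude grows linearly with separation (unlike
// inverse-square gravity): distant particles are pulled together strongly,
// while nearby ones barely interact, which tends to draw stray particles
// into connective strands.
function attraction(a, b, k) {
  const dx = b.x - a.x;
  const dy = b.y - a.y;
  // force = k * distance, directed from a toward b; since (dx, dy) already
  // has length equal to the distance, scaling it by k gives that force.
  return { fx: k * dx, fy: k * dy };
}
```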
flow is an interactive experience where you can control a fluid simulation with your mouse or webcam. As you move around, the fluid particles get pushed around by your movement, revealing your own image as they pass by.
I’ve personally been a fan of fluid simulations, so I first wanted to explore Google’s LiquidFun library. One thing I noticed in the LiquidFun demos, however, was that the particles always seemed to be controlled by means other than themselves: they were shot out of a spawner, or pushed around by rigid bodies. Because I wanted a more direct interaction with the particles themselves, I started by moving them around with the mouse cursor.
The colors were chosen in an attempt to make the particles feel organic and natural: Instead of making the particles look like water, I wanted them to feel like fireflies in a night sky, or a flock of creatures moving around.
Although the mouse interaction felt fluid, I wanted an even closer connection between the player’s movement and the experience. I then investigated optical flow libraries: algorithms that take color video as input and analyze movement to produce a velocity map. I found a public library called oflow by GitHub user anvaka and decided to integrate a webcam stream into the experience.
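The step from a velocity map to the fluid motion can be sketched as follows. This is an illustration of the general coupling, not oflow's actual API or the project's code; the grid layout and `push` factor are assumptions:

```javascript
// An optical flow pass yields a coarse grid of velocity vectors per video
// frame. Each particle samples the grid cell under it and is nudged by
// that cell's velocity, so on-camera motion pushes the fluid around.
function applyFlow(particles, flow, cols, rows, width, height, push) {
  for (const p of particles) {
    // Map the particle's position to a grid cell, clamped to the grid.
    const cx = Math.min(cols - 1, Math.max(0, Math.floor(p.x / width * cols)));
    const cy = Math.min(rows - 1, Math.max(0, Math.floor(p.y / height * rows)));
    const v = flow[cy * cols + cx]; // { u, v }: horizontal/vertical motion
    p.vx += v.u * push;
    p.vy += v.v * push;
  }
  return particles;
}
```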
With the webcam stream, the experience feels very different from the single-color particles moved around by the mouse. As particles are pushed around, they occlude and reveal different areas of the screen, creating a constantly evolving mirror.