greecus-2DPhysics

My goal for this project was to play with the concept of what 2D really is by showing that our eyes rely on cues such as size and shade to imply depth and 3D space, even though everything we see is ultimately reduced to a 2D image on our retinas. This fallacy of depth intrigued me, and I wanted to see how I could use those cues to my advantage in a way that made the viewer interact with a falsehood.

After a few iterations, I felt that the way the balls grew and shrank had a biological, almost sinusoidal feel to it, so I ended up spending hours finding the right pattern that would give a feeling of depth without seeming too rigid.
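To make the mapping concrete, here is a minimal Processing sketch of the general idea, with size and shade driven by the same sine wave (my own illustration, not the project's code):

```java
// Minimal Processing sketch of the idea: each ball's drawn radius follows a
// sine wave, and its shade darkens as it "approaches", so size and shade
// together imply motion in depth. (Illustrative only; not the project's code.)
float phase = 0;

void setup() {
  size(600, 400);
  noStroke();
}

void draw() {
  background(255);
  phase += 0.02;
  for (int i = 0; i < 5; i++) {
    // Offset each ball's phase so they breathe out of step with one another
    float depth = 0.5 + 0.5 * sin(phase + i * TWO_PI / 5);  // 0 = far, 1 = near
    float r = lerp(10, 60, depth);          // nearer balls are drawn larger...
    float shade = lerp(230, 60, depth);     // ...and darker, so far ones fade into the background
    fill(shade);
    ellipse(100 + i * 100, height / 2, r, r);
  }
}
```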

 

I tried to add another component where, on a mouse click, the underlying sinusoidal equation would be convolved with a Gaussian to add another level of interaction, but constantly re-applying the convolution was too computationally heavy for my machine.
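To give a sense of why that was heavy: a discrete convolution of an N-sample signal with a K-tap Gaussian kernel costs on the order of N×K multiply-adds, repeated every frame. The sketch below is my own rough reconstruction of that per-frame loop, not the original code:

```java
// Rough reconstruction of a per-frame Gaussian convolution over a sampled
// size signal. With N samples and a K-tap kernel this is on the order of
// N*K multiply-adds every frame, which is the cost that made the
// interaction too heavy. (Illustrative only; not the original code.)
int N = 512;                 // samples of the underlying sinusoid
int K = 101;                 // Gaussian kernel taps
float[] signal = new float[N];
float[] smoothed = new float[N];
float[] kernel = new float[K];

void setup() {
  size(600, 200);
  float sigma = K / 6.0;
  for (int k = 0; k < K; k++) {
    float x = k - K / 2;
    kernel[k] = exp(-x * x / (2 * sigma * sigma));
  }
}

void draw() {
  background(255);
  for (int i = 0; i < N; i++) {
    signal[i] = sin(TWO_PI * i / N + frameCount * 0.02);
  }
  // The expensive part: N*K multiply-adds, repeated every single frame
  for (int i = 0; i < N; i++) {
    float sum = 0, wsum = 0;
    for (int k = 0; k < K; k++) {
      int j = constrain(i + k - K / 2, 0, N - 1);
      sum += signal[j] * kernel[k];
      wsum += kernel[k];
    }
    smoothed[i] = sum / wsum;
  }
  // Draw the smoothed signal so the sketch does something visible
  stroke(0);
  for (int i = 1; i < N; i++) {
    line(map(i - 1, 0, N - 1, 0, width), height / 2 - smoothed[i - 1] * 60,
         map(i,     0, N - 1, 0, width), height / 2 - smoothed[i] * 60);
  }
}
```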

 

This project was possible because of Daniel Shiffman’s Box2D documentation.

tli-2Dphysics

My goals for this assignment were to create a simple but pleasing fidget and to familiarize myself with using open-source libraries in the process. I particularly wanted to explore shape creation and contact listening in JBox2D, rendering in Processing, and sound generation through Java APIs.

I was not able to explore the libraries to the extent I wanted to. I attempted to use masking and filtering while rendering with Processing, but discarded that option due to bugs. I was also unable to add sounds, which would have added to the fidget-like quality of this work. However, the interactions I implemented between the colliding and color-shifting shapes were pleasing, so I believe I attained my goal of creating a satisfying fidget.
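For reference, this is roughly what the contact-listening pattern looks like with Daniel Shiffman’s Box2D for Processing library; the Ball class and the color-shift behavior below are my own stand-ins, not the project’s actual code:

```java
// Minimal contact-listening pattern with Shiffman's Box2D for Processing:
// each physics body stores its owning sketch object as user data, so the
// collision callback can tell the shapes to shift color.
// (The Ball class and color shift are stand-ins, not the project's code.)
import shiffman.box2d.*;
import org.jbox2d.dynamics.*;
import org.jbox2d.dynamics.contacts.Contact;
import org.jbox2d.collision.shapes.CircleShape;
import org.jbox2d.collision.shapes.PolygonShape;
import org.jbox2d.common.Vec2;

Box2DProcessing box2d;
ArrayList<Ball> balls = new ArrayList<Ball>();

void setup() {
  size(600, 400);
  box2d = new Box2DProcessing(this);
  box2d.createWorld();
  box2d.setGravity(0, -10);
  box2d.listenForCollisions();   // routes contacts to beginContact()/endContact()

  // A static floor so the falling balls pile up and keep colliding
  BodyDef fbd = new BodyDef();
  fbd.type = BodyType.STATIC;
  fbd.position.set(box2d.coordPixelsToWorld(width / 2, height - 10));
  Body floor = box2d.createBody(fbd);
  PolygonShape fs = new PolygonShape();
  fs.setAsBox(box2d.scalarPixelsToWorld(width / 2), box2d.scalarPixelsToWorld(10));
  floor.createFixture(fs, 0);

  for (int i = 0; i < 10; i++) balls.add(new Ball(random(50, width - 50), random(100)));
}

void draw() {
  background(255);
  box2d.step();
  for (Ball b : balls) b.display();
}

// Called by the library whenever two fixtures start touching
void beginContact(Contact cp) {
  Object a = cp.getFixtureA().getBody().getUserData();
  Object b = cp.getFixtureB().getBody().getUserData();
  if (a instanceof Ball) ((Ball) a).shiftColor();
  if (b instanceof Ball) ((Ball) b).shiftColor();
}

void endContact(Contact cp) { }

class Ball {
  Body body;
  float hue = 0;
  Ball(float x, float y) {
    BodyDef bd = new BodyDef();
    bd.type = BodyType.DYNAMIC;
    bd.position.set(box2d.coordPixelsToWorld(x, y));
    body = box2d.createBody(bd);
    CircleShape cs = new CircleShape();
    cs.m_radius = box2d.scalarPixelsToWorld(12);
    body.createFixture(cs, 1.0);
    body.setUserData(this);
  }
  void shiftColor() { hue = (hue + 30) % 255; }
  void display() {
    Vec2 pos = box2d.getBodyPixelCoord(body);
    colorMode(HSB);
    fill(hue, 200, 200);
    ellipse(pos.x, pos.y, 24, 24);
    colorMode(RGB);
  }
}
```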

I used JBox2D, Daniel Shiffman’s Box2D for Processing library, and Processing for this project.

arialy-2DPhysics

The sketch is live here!


My goal for this project was to create a visualization of period cramps. I’ve experienced first-hand many cramps so intense that I’ve broken into a cold sweat, vomited, and tried to chase the extreme pain with liquid-gel Advils.

In this visualization, clicking spawns blue pills. The blood particles disappear when they make contact with the capsules, but the pills themselves also disappear over time.
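The “disappear over time” part is essentially a per-pill lifespan countdown; a minimal Processing-style sketch of that mechanic (my own illustration, not the actual sketch’s code) might look like this:

```java
// Minimal sketch of the spawn-on-click / fade-out-over-time mechanic.
// Each pill carries a lifespan that ticks down every frame; when it hits
// zero the pill is removed. (Illustrative only; not the project's code.)
ArrayList<Pill> pills = new ArrayList<Pill>();

void setup() {
  size(600, 400);
}

void mousePressed() {
  pills.add(new Pill(mouseX, mouseY));
}

void draw() {
  background(30);
  for (int i = pills.size() - 1; i >= 0; i--) {
    Pill p = pills.get(i);
    p.update();
    p.display();
    if (p.isDead()) pills.remove(i);   // iterate backwards so removal is safe
  }
}

class Pill {
  float x, y;
  float lifespan = 255;   // doubles as the alpha value while fading out
  Pill(float x_, float y_) { x = x_; y = y_; }
  void update() { lifespan -= 1.5; }
  boolean isDead() { return lifespan <= 0; }
  void display() {
    noStroke();
    fill(80, 120, 255, lifespan);
    rect(x - 15, y - 6, 30, 12, 6);   // rounded rect as a crude capsule
  }
}
```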

This project turned out differently than I initially imagined. I’d hoped to create more “blood clots” by linking together the red polygons, but had issues with the frame rate dropping too low. Though this sketch is missing many nuances of the experience it’s based on, I’m relatively happy with how it turned out.


Dorsek-2DPhysics

For this project I wanted to develop something that closely aligned with a portion of my senior project, which deals with the experience of dreaming: specifically, the sensation of needing or wanting to do something but being frustrated by your inability to do it; the harder you focus, the harder whatever it is you want to do becomes.

Struggling to get the teeth into the basket
Original sketch of idea…


I took one of many dreams that I recorded during a 3-month bout of insomnia and decided to re-create it using matter.js as my physics engine of choice.

Initially, I had plans for many interactions based in matter.js, but it took me longer than expected to familiarize myself with the library.

jaqaur- 2D Physics

This is PuppetBot. He’s a 2D marionette controlled by LeapMotion.

I had this idea when I was brainstorming ways to make interactions with the physics engine more engaging than just clicking. Puppets like this are familiar to people, so I thought the interaction would be nice and intuitive. The physics engine I used is Box2D, and the code is all pretty simple; I have a set of limbs that are attached to each other by joints. I also have some ropes (really just lots of little rectangles) connecting those limbs to points whose coordinates are determined by your actual fingertips. [I will put my code on GitHub soon, but I need to go back through it and make sure everyone is credited appropriately first]
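The rope construction described here (lots of little rectangles chained by joints, with the top of the chain steered toward a tracked fingertip) can be sketched with Box2D for Processing roughly as follows; this is my approximation of the structure, with the mouse standing in for a LeapMotion fingertip, not the actual PuppetBot code:

```java
// Rough sketch of the "rope of little rectangles" idea in Box2D for
// Processing: a chain of small dynamic boxes connected by revolute joints,
// hanging from a kinematic anchor that we move to a target point (the mouse
// here, standing in for a LeapMotion fingertip).
// (My approximation of the structure, not the actual PuppetBot code.)
import shiffman.box2d.*;
import org.jbox2d.dynamics.*;
import org.jbox2d.dynamics.joints.RevoluteJointDef;
import org.jbox2d.collision.shapes.PolygonShape;
import org.jbox2d.common.Vec2;

Box2DProcessing box2d;
Body anchor;
ArrayList<Body> links = new ArrayList<Body>();

void setup() {
  size(600, 400);
  box2d = new Box2DProcessing(this);
  box2d.createWorld();

  // Kinematic anchor: unaffected by forces, but we can reposition it each frame
  BodyDef ad = new BodyDef();
  ad.type = BodyType.KINEMATIC;
  ad.position.set(box2d.coordPixelsToWorld(width / 2, 50));
  anchor = box2d.createBody(ad);

  Body prev = anchor;
  for (int i = 0; i < 8; i++) {
    BodyDef bd = new BodyDef();
    bd.type = BodyType.DYNAMIC;
    bd.position.set(box2d.coordPixelsToWorld(width / 2, 60 + i * 20));
    Body link = box2d.createBody(bd);
    PolygonShape ps = new PolygonShape();
    ps.setAsBox(box2d.scalarPixelsToWorld(3), box2d.scalarPixelsToWorld(10));
    link.createFixture(ps, 1.0);

    // Pin each link to the previous one at the point between them
    RevoluteJointDef rjd = new RevoluteJointDef();
    rjd.initialize(prev, link, box2d.coordPixelsToWorld(width / 2, 50 + i * 20));
    box2d.world.createJoint(rjd);

    links.add(link);
    prev = link;
  }
}

void draw() {
  background(255);
  // Move the anchor to the target point (mouse here; a fingertip in PuppetBot)
  anchor.setTransform(box2d.coordPixelsToWorld(mouseX, mouseY), 0);
  box2d.step();
  stroke(0);
  fill(200);
  rectMode(CENTER);
  for (Body link : links) {
    Vec2 pos = box2d.getBodyPixelCoord(link);
    pushMatrix();
    translate(pos.x, pos.y);
    rotate(-link.getAngle());
    rect(0, 0, 6, 20);
    popMatrix();
  }
}
```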

A lot of the decisions I made for this project were in the interest of making the puppet easier to use. For example, I lengthened the segments in the ropes so that movements didn’t take as long to “wave” down the rope before they affected the puppet. This is also why I made the ropes fairly short, instead of having them permanently reside above the top of the window, as I had originally planned. I made the body parts as heavy as I could so they wouldn’t bounce all over the place, but if I made them too heavy they strained and broke the ropes. I played around with friction/density/restitution settings a lot to try to find the perfect ones, but to no avail. I did a few other things just for personal aesthetic reasons, like making it so that the knees never bend backwards (this happened a lot before, and I didn’t think it looked good). I went with the robot design because I thought it would fit best with the jerky, unbalanced movement I was bound to get. I think it looks like he’s short-circuiting whenever the physics gets broken.
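For the knee constraint specifically, Box2D’s revolute joints support angle limits, which is the standard way to stop a joint from bending past a point. The fragment below is my guess at the approach (not jaqaur’s code); it assumes the same imports and box2d setup as the sketch above, with thigh, shin, kneePixel, and shinShape as hypothetical placeholders:

```java
// Fragment (assumes the same imports and box2d setup as the sketch above):
// a revolute "knee" joint with angle limits so it can never hyperextend.
// thigh, shin, kneePixel, and shinShape are hypothetical placeholders, not
// PuppetBot's actual variable names.
RevoluteJointDef knee = new RevoluteJointDef();
knee.initialize(thigh, shin, box2d.coordPixelsToWorld(kneePixel.x, kneePixel.y));
knee.enableLimit = true;          // clamp the joint angle...
knee.lowerAngle = 0;              // ...so the shin can never swing backwards
knee.upperAngle = radians(140);
box2d.world.createJoint(knee);

// Heavier limbs resist flailing but strain the rope joints; these fixture
// parameters are the kind that end up being tuned by hand.
FixtureDef fd = new FixtureDef();
fd.shape = shinShape;             // hypothetical PolygonShape for the shin
fd.density = 4.0;                 // heavier body parts
fd.friction = 0.3;
fd.restitution = 0.1;             // low bounciness
shin.createFixture(fd);
```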

Ultimately, PuppetBot is not as easy to control as I would have liked, but he’s still pretty fun to play with. And all things considered, I’m not much better with real marionettes, either…

gray-2Dphysics

 

Arc uses Unity’s Visual Effect Graph, a compute-shader particle simulator.

I’ve been giving thought recently to the usefulness/necessity of visually connecting structures to communicate the interrelation of material mechanisms. I was interested in exploring how connective visual structures might emerge out of disparate particles, with an eye toward eventually applying the work in 3D within a VR interface.

I’m quite happy with the results: mapping the strength of the gravitational pull to the distance between the points produces emergent, lively structures that are continually surprising.
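The core of the mapping is a force law in which attraction grows with separation, the opposite of inverse-square gravity; below is a conceptual Processing sketch of that law (Arc itself is built with Unity’s VFX Graph, so this is only an illustration of the idea):

```java
// Conceptual sketch of the force law only (Arc itself is built with Unity's
// VFX Graph, not Processing): every pair of particles attracts with a pull
// whose strength grows with the distance between them, the opposite of
// inverse-square gravity, so far-flung particles keep getting yanked back
// into connected, oscillating structures.
int N = 150;
PVector[] pos = new PVector[N];
PVector[] vel = new PVector[N];

void setup() {
  size(600, 600);
  for (int i = 0; i < N; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = PVector.random2D().mult(3);
  }
}

void draw() {
  background(0);
  stroke(255);
  for (int i = 0; i < N; i++) {
    for (int j = i + 1; j < N; j++) {
      // Scaling the raw displacement makes the pull proportional to distance
      PVector pull = PVector.sub(pos[j], pos[i]).mult(0.00005);
      vel[i].add(pull);
      vel[j].sub(pull);
    }
  }
  for (int i = 0; i < N; i++) {
    vel[i].mult(0.995);   // mild damping keeps the oscillation bounded
    pos[i].add(vel[i]);
    point(pos[i].x, pos[i].y);
  }
}
```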

graycrawford.itch.io/arc

ya-2Dphysics

Try this experience here.

flow is an interactive experience where you can control a fluid simulation with your mouse or webcam. As you move around, the fluid particles get pushed around by your movement, revealing your own image as they pass by.

I’ve been a fan of fluid simulations for a while, so I wanted to explore Google’s LiquidFun library first. One thing I noticed in the LiquidFun demos, however, was that the particles always seemed to be controlled by means other than themselves: they were shot out of a spawner, or pushed around by rigid bodies. Because I wanted a more direct interaction with the particles themselves, I first chose to move them around with the mouse cursor.

The colors were chosen in an attempt to make the particles feel organic and natural: Instead of making the particles look like water, I wanted them to feel like fireflies in a night sky, or a flock of creatures moving around.

Although the mouse interaction felt fluid, I wanted an even closer connection between the player and the experience. I then investigated optical flow libraries – algorithms that take color video as input and analyze movement to produce a velocity map. I found a public library called oflow by GitHub user anvaka, and decided to integrate a webcam stream into the experience.

With the webcam stream, the experience takes on a much different character than the single-color particles moved around with the mouse. When particles are pushed around, they now occlude and reveal different areas of the screen, creating a constantly evolving mirror.
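However the velocity map is produced (oflow derives it from webcam motion), the coupling to the particles is conceptually simple: sample the map at each particle’s position and nudge the particle’s velocity by it. Below is a conceptual Processing sketch of that coupling, with a synthetic noise field standing in for the webcam-derived flow; it is not the project’s actual JavaScript code:

```java
// Conceptual sketch of "velocity map pushes particles": sample a coarse grid
// of flow vectors at each particle's position and add it to the particle's
// velocity. Here the flow field is synthetic noise standing in for the
// webcam-derived optical flow. (Not the project's actual JavaScript code.)
int COLS = 32, ROWS = 24;
PVector[][] flow = new PVector[COLS][ROWS];
int N = 2000;
PVector[] pos = new PVector[N];
PVector[] vel = new PVector[N];

void setup() {
  size(640, 480);
  for (int i = 0; i < N; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = new PVector();
  }
}

void draw() {
  background(0);
  // Stand-in for the optical-flow step: a slowly evolving noise-based field
  for (int c = 0; c < COLS; c++) {
    for (int r = 0; r < ROWS; r++) {
      float a = noise(c * 0.1, r * 0.1, frameCount * 0.01) * TWO_PI * 2;
      flow[c][r] = PVector.fromAngle(a).mult(0.3);
    }
  }
  stroke(255, 180);
  for (int i = 0; i < N; i++) {
    // Sample the flow cell under the particle and nudge its velocity
    int c = constrain(int(pos[i].x / width * COLS), 0, COLS - 1);
    int r = constrain(int(pos[i].y / height * ROWS), 0, ROWS - 1);
    vel[i].add(flow[c][r]);
    vel[i].mult(0.95);                         // damping keeps speeds bounded
    pos[i].add(vel[i]);
    pos[i].x = (pos[i].x + width) % width;     // wrap around the edges
    pos[i].y = (pos[i].y + height) % height;
    point(pos[i].x, pos[i].y);
  }
}
```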

Resources used:

Google liquidfun

oflow by anvaka

sketch.js by soulwire

2D Physics – sheep

Chicken Physics Control in Unity 2D

Description: In this interactable I wanted to experiment with motion, specifically how one might make a convincing chicken using Unity’s 2D physics. For me, it was fun to think about the action of pulling worms out of the ground and play with the tools, rather than do anything too complicated. The result is a funny little experience in which you play as a giant-headed chicken.

My goal was to make a game, but this doesn’t have an explicit goal, UI, or instructions, so I would instead call it a prototype. The controls are the left and right arrow keys (or A and D) to move the chicken around, the mouse to reposition its head, and the left mouse button to bite down. I initially had plans for a larger story about a chicken whose eating gets too big for its surroundings, but the systems are in place to tell a story like that in the future.



Link:

http://perebite.itch.io/chicken-physics-prototype

geebo-2Dphysics

I wanted to create a project that used some kind of gaze tracker after I had seen a simple webcam-based experiment published on the Chrome Experiments page. I also loved the idea of a project that purposely avoids being seen!

My initial idea was to create a simple 2D platformer in which the player’s ability to look at their own character would be hindered by a massive physics repulsion away from their gaze, making jumping, running, etc. much more difficult. I created simple flying rectangles that were planned to be enemies/obstacles, also disrupted by gaze.

However, after some initial experimentation, I became fascinated by the behavior and visual nature of the simple system. I found that the small beads took on a life of their own and seemed to squirm away whenever they were put under the microscope. Rather than force them into the platformer, I gave them a small voice to portray their anger at being observed.
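The repulsion itself is just a force pointing away from the gaze point, strongest for objects near where you are looking. Here is a minimal Processing illustration of that force, with the mouse standing in for the gaze tracker (not geebo’s actual code):

```java
// Minimal illustration of gaze repulsion: each rectangle is pushed away from
// a "gaze point" with a force that falls off with distance, so things squirm
// away from wherever you look. The mouse stands in for the gaze tracker here.
// (Illustrative only; not the project's code.)
int N = 30;
PVector[] pos = new PVector[N];
PVector[] vel = new PVector[N];

void setup() {
  size(640, 480);
  for (int i = 0; i < N; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = new PVector();
  }
}

void draw() {
  background(240);
  PVector gaze = new PVector(mouseX, mouseY);   // would come from the gaze tracker
  fill(60);
  for (int i = 0; i < N; i++) {
    PVector away = PVector.sub(pos[i], gaze);
    float d = max(away.mag(), 20);              // avoid a blow-up right at the gaze point
    away.normalize().mult(4000 / (d * d));      // repulsion falls off with distance squared
    vel[i].add(away);
    vel[i].mult(0.9);
    pos[i].add(vel[i]);
    pos[i].x = constrain(pos[i].x, 0, width);
    pos[i].y = constrain(pos[i].y, 0, height);
    rect(pos[i].x - 8, pos[i].y - 4, 16, 8);
  }
}
```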

 

a-2DPhysics

I was interested in using a fluid simulation (I used this one), but I couldn’t think of ways to satisfactorily visualize and interact with it that hadn’t been extensively explored before.

I thought to use boids to drive the interaction with the liquid, and to have the liquid then apply forces back to the boids, to create interesting emergent interaction. This made the system hard to balance, since the two-way coupling is a positive feedback loop.
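Here is a hedged sketch of that coupling loop: the “fluid” is reduced to a coarse velocity grid and the boid flocking rules are omitted, so it is only a conceptual illustration (the actual project is a p5.js sketch with a real fluid solver), but it shows the two-way exchange and why damping matters in a positive feedback loop:

```java
// Hedged sketch of two-way boid/fluid coupling: each frame the boids inject
// their velocity into a coarse fluid grid, and the grid pushes back on the
// boids. Without the decay/damping factors, each side amplifies the other
// (a positive feedback loop) and the system blows up.
// (Conceptual illustration only; flocking rules omitted, and the actual
// project is a p5.js sketch with a real fluid solver.)
int COLS = 64, ROWS = 64;
PVector[][] fluid = new PVector[COLS][ROWS];
int N = 200;
PVector[] pos = new PVector[N];
PVector[] vel = new PVector[N];

void setup() {
  size(640, 640);
  for (int c = 0; c < COLS; c++)
    for (int r = 0; r < ROWS; r++) fluid[c][r] = new PVector();
  for (int i = 0; i < N; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = PVector.random2D().mult(2);
  }
}

void draw() {
  background(255);
  // Fluid decays each frame; this is one of the knobs that keeps the loop stable
  for (int c = 0; c < COLS; c++)
    for (int r = 0; r < ROWS; r++) fluid[c][r].mult(0.98);

  stroke(0);
  for (int i = 0; i < N; i++) {
    int c = constrain(int(pos[i].x / width * COLS), 0, COLS - 1);
    int r = constrain(int(pos[i].y / height * ROWS), 0, ROWS - 1);
    fluid[c][r].add(PVector.mult(vel[i], 0.05));  // boid stirs the fluid...
    vel[i].add(PVector.mult(fluid[c][r], 0.1));   // ...and the fluid pushes the boid
    vel[i].mult(0.97);                            // boid damping keeps the feedback from running away
    vel[i].limit(4);
    pos[i].add(vel[i]);
    pos[i].x = (pos[i].x + width) % width;
    pos[i].y = (pos[i].y + height) % height;
    point(pos[i].x, pos[i].y);
  }
}
```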

I experimented with a number of ways to visualise the boids and the liquid, with the boids being above at first, then moving behind unless they were under a specific size (to give the impression of them “jumping out”).

However, neither of these gave the liquid “depth” and “murkiness,” which were my emergent goals for this piece as it went on. Eventually, I realised I could just make the liquid black, and so I did.

I am satisfied by the result of exploring a fluid simulation system in a new (to me) way, with “live” creatures (boids). I am especially happy with the murky, inky effect that sometimes happens. Looking back, I wish I had added more damping to the boids. In addition, it would be interesting to explore how they interact with a GPU fluid system, letting one simulate the fluid at a much finer resolution (here it is just 64×64).

You can see the sketch (and the source code) at https://editor.p5js.org/aman/full/HysZkJ6fV.

tadpoles from aman tiwari on Vimeo.