gray – FinalProject

Bow Shock

Bow Shock is a musical VR experience where the motion of your hand reveals a hidden song in the action of millions of fiery particles.

In Bow Shock, your hands control the granulation of a song embedded in space, revealing the invisible music with your movement. You and the environment are composed of fiery particles, flung by your motions and modulated by the frequencies you illuminate, instilling a peculiar agency and identification as a shimmering form.

Bow Shock is a spatially interactive experience of Optia's new song of the same name. Part song, part instrument, Bow Shock rewards exploration by becoming performable as an audiovisual instrument.

Bow Shock is an experiment in embodying novel materials and in how nonlocal agency is perceived.

Two particle systems make up the hand, responding to the frequencies created by your hand position. Pinching mutes the sounds created.


I wanted to create another performable musical VR experience after having made Strata. The idea initially stemmed from a previous assignment for a drawing tool; I wanted to give the user direct engagement with the creation of spatial, musical structures.

I envisioned a space where the user’s hand would instantiate sound depending on its location and motion, and I knew that granulation was a method that could allow a spatial “addressing” of a position within a sound file.

I created an ambient song, Bow Shock, to be the source sound file. I imported the AIFF into Manuel Eisl's Unity granulation script and wrote a script that took my hand's fractional distance between two points on the x-axis and mapped it to the position within the AIFF that the granulator played from. Thus, when I moved my hand left and right, it was as if my hand were a vinyl record needle "playing" that position in the song.

However, the sound played constantly, with few dynamics. I wrote a script mapping the distance between my index fingertip and thumb tip to the volume of the granulator, such that a full pinch cut the sound off entirely. This immediately added much nuance, making it possible to "perform" the "instrument" of the song and to choose when to have sound and with what harmonic qualities.
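A minimal sketch of those two mappings, assuming hypothetical granulator setters and Leap-tracked fingertip transforms (Eisl's actual granulator API and my script differ):

```csharp
using UnityEngine;

// Sketch of the two hand mappings described above. The granulator setters
// and the pinch threshold are illustrative stand-ins, not the actual APIs.
public class GranulationHandControl : MonoBehaviour
{
    public Transform indexTip;      // Leap-tracked index fingertip
    public Transform thumbTip;      // Leap-tracked thumb tip
    public Transform leftBound;     // left edge of the granulation span
    public Transform rightBound;    // right edge of the granulation span
    public float maxPinchDistance = 0.08f; // meters; full volume at this finger spread

    void Update()
    {
        // Map the hand's fractional x-position between the two bounds
        // to a normalized playhead position within the sound file.
        float t = Mathf.InverseLerp(leftBound.position.x,
                                    rightBound.position.x,
                                    indexTip.position.x);
        SetGrainPosition(t); // 0 = start of file, 1 = end

        // Map index-thumb distance to volume: a full pinch mutes the sound.
        float pinch = Vector3.Distance(indexTip.position, thumbTip.position);
        SetGrainVolume(Mathf.Clamp01(pinch / maxPinchDistance));
    }

    void SetGrainPosition(float normalized) { /* forward to granulator */ }
    void SetGrainVolume(float volume)       { /* forward to granulator */ }
}
```

Driving both playhead and volume from the same hand keeps the mapping legible: left-right scrubs the song, and the pinch gates it.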

Strata was made using cloth physics, and in an attempt to move away from polygonal structures, I turned to the particles in Unity's VFX Graph, which my prior work had shown to be a great candidate. Much of the work centered on the representation of the hand in space. Rather than using the Leap-Motion-tracked hand as a physical collider, I used Aman Tiwari's MeshToSDF Converter to have my hand be a field influencing the motion of the particles.

I have two particle systems associated with the hand. The soft white particle system constantly attempts to conform to the hand SDF, with parameters set such that the particles can easily be flung off and take time to be dragged out of their orbit back onto the surface of my hand. The other, more chaotic particle system uses the hand SDF as a boolean confinement, trapped inside the spatial volume. Earlier in the particle physics stack, I apply an intense, high-drag turbulence to the particles, such that floating-point imprecision at those forces glitches the particle motions in a hovering way that mirrors the granulated sound. These structures are slight and occupy only a fraction of the space, so when confined within the hand alone they predominantly fail to visually suggest its shape. In combination, however, the flowy, flung particles and the erratic, confined particles reinforce each other, meshing sufficiently with proprioception to be quite a robust representation of the hand that doesn't occlude the environmental effects.
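The conforming behavior amounts to a pull along the SDF gradient. Below is a plain-C# sketch of that idea with a stand-in sphere SDF; the project itself uses VFX Graph's GPU-side Conform to SDF block sampling the baked hand SDF texture, and the attraction constant here is illustrative:

```csharp
using UnityEngine;

// Sketch of a conform-to-SDF velocity update for the soft white hand
// particles. SampleHandSDF is a hypothetical stand-in; the real field is a
// 3D texture of the Leap-tracked hand produced by MeshToSDF Converter.
public static class ConformToHandSDF
{
    const float AttractSpeed = 0.5f; // weak enough that flung particles return slowly

    public static Vector3 ApplyConform(Vector3 position, Vector3 velocity, float dt)
    {
        // Signed distance tells us how far the particle is from the hand
        // surface; the gradient points away from the surface.
        float dist = SampleHandSDF(position, out Vector3 gradient);

        // Nudge velocity back toward the surface, proportional to distance,
        // so particles drift onto the hand over time rather than snapping.
        velocity += -gradient * dist * AttractSpeed * dt;
        return velocity;
    }

    // Stand-in SDF: a sphere at the origin, radius 10 cm.
    static float SampleHandSDF(Vector3 p, out Vector3 gradient)
    {
        const float radius = 0.1f;
        gradient = p.sqrMagnitude > 1e-10f ? p.normalized : Vector3.up;
        return p.magnitude - radius;
    }
}
```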

What initially was a plan to have the user draw out the shape of the particle instantiation/emission in space evolved into a static spawning volume whose width matches the x-extent of the granulated song. In the VFX Graph I worked to have the particles inherit the velocity of the finger, which I accomplished by lerping their velocities toward my fingertip's velocity along an exponential curve mapped to their distance from my fingertip, such that only the nearest particles inherited the fingertip velocity, with a sharp falloff to zero beyond a radius of roughly 5 cm.
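Expressed as plain C# rather than VFX Graph nodes, the inheritance rule looks roughly like this; the radius and falloff exponent are illustrative:

```csharp
using UnityEngine;

// Sketch of the per-particle velocity-inheritance rule. In the project this
// logic runs on the GPU inside VFX Graph; the curve shape here approximates
// the sharp exponential-style falloff described above.
public static class FingertipVelocityInheritance
{
    const float InfluenceRadius = 0.05f; // ~5 cm falloff radius
    const float FalloffExponent = 4f;    // sharpness of the falloff curve

    public static Vector3 UpdateVelocity(Vector3 particleVelocity,
                                         Vector3 particlePosition,
                                         Vector3 fingertipPosition,
                                         Vector3 fingertipVelocity)
    {
        float distance = Vector3.Distance(particlePosition, fingertipPosition);
        if (distance >= InfluenceRadius)
            return particleVelocity; // outside the radius: no inheritance

        // Weight is 1 at the fingertip and drops sharply toward the edge,
        // so only the nearest particles strongly inherit its velocity.
        float normalized = distance / InfluenceRadius;
        float weight = Mathf.Pow(1f - normalized, FalloffExponent);
        return Vector3.Lerp(particleVelocity, fingertipVelocity, weight);
    }
}
```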

The particles initially had infinite lifetime, and with no drag on their velocity, after being flicked away from their spawning position they often quickly went out of reach. So I introduced subtle drag and gave the particles a random lifetime between 30 and 60 seconds, such that the flung, filamentary particle sheets remain in the user-created shape for a while before randomly dying off and respawning in the spawn volume. The temporal continuity of the spawn volume provides a consistent visual structure correlated with the granulation bounds.

Critically, I didn't want the audio to remain exclusively sonic. I wanted some way for the audio to affect the visuals, so I used Keijiro's LASP loopback Unity plugin to extract separate amplitudes for low, medium, and high chunks of the realtime frequency range, and mapped their values to parameters exposed in the VFX Graph. From there, I mapped the frequency-chunk amplitudes to particle length, the turbulence on the particles, and the secondary, smaller turbulence on my hand's particles. Thus, at different parts of the song, for example, predominating low frequencies cause the environmental particles to additively superimpose more, versus areas in space where high frequencies dominate and the particle turbulence is increased.
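A sketch of that plumbing, assuming three LASP level trackers filtered to low, band, and high passes, and illustratively named exposed properties on the graph (the actual parameter names and tracker setup differ):

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch of pushing per-band audio levels into exposed VFX Graph properties.
// The trackers stand in for LASP AudioLevelTracker components configured
// with low-pass, band-pass, and high-pass filters in the inspector.
public class AudioBandsToVFX : MonoBehaviour
{
    public VisualEffect vfx;
    public Lasp.AudioLevelTracker lowBand;   // low-pass filtered tracker
    public Lasp.AudioLevelTracker midBand;   // band-pass filtered tracker
    public Lasp.AudioLevelTracker highBand;  // high-pass filtered tracker

    void Update()
    {
        // Exposed float properties on the graph drive particle length
        // and the two turbulence fields.
        vfx.SetFloat("LowAmplitude",  lowBand.normalizedLevel);
        vfx.SetFloat("MidAmplitude",  midBand.normalizedLevel);
        vfx.SetFloat("HighAmplitude", highBand.normalizedLevel);
    }
}
```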

My aspiration for this effect is that the particle behaviors, driven by audio, visually and qualitatively communicate aspects of the song, as if seeing some hyper-dimensional spectrogram of sorts, where the motion of your hands through space draws out or illuminates spots within it. The memory of a certain sound becomes attached to the proprioceptive sensation of the hand's location at that moment; combined with the visual structure at that location being correlated with the past (or present) sound associated with it, this reinforces the unity of the sound and the structure, making the system more coherent and useful as a performance tool.

To make it more apparent that movement along the x-axis controls the audio state, I multiply up the color of the particles sharing the hand's x-position, such that a slice of brighter particles tracks the hand. This simultaneously shows that something special is associated with x-movement, and also provides a secondary visual curiosity, allowing a sort of CAT-scan-like appraisal of the 3D structures produced.
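The slice highlight amounts to a distance test on x, sketched here in C# with illustrative width and gain values; in the project it is a color multiplication inside the VFX Graph:

```csharp
using UnityEngine;

// Sketch of the x-slice highlight applied per particle. Slice width and
// brightness gain are illustrative, not the project's tuned values.
public static class HandSliceHighlight
{
    const float SliceHalfWidth = 0.02f; // particles within ~2 cm of the hand's x
    const float BrightnessGain = 4f;    // color multiplier inside the slice

    public static Color Highlight(Color baseColor, float particleX, float handX)
    {
        // 1 at the hand's x-position, fading to 0 at the slice edges.
        float slice = 1f - Mathf.Clamp01(Mathf.Abs(particleX - handX) / SliceHalfWidth);
        return baseColor * Mathf.Lerp(1f, BrightnessGain, slice);
    }
}
```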

Ideally this experience would be a bimanual affair, but I ran into difficulties on multiple fronts. Firstly, if each hand granulates simultaneously, the result is likely to be too sonically muddy. Secondly, the math required to use two points (like the fingertips of both hands simultaneously) to influence the velocity of the particles, each in its own locality, proved insurmountable in the timeframe; this is likely a solvable problem. Less solvable, however, is my desire to have two hand SDFs simultaneously affect the same particle system, which may be an inherent limitation of VFX Graph's Conform to SDF operation.

Functional prototypes were built to send haptic moments from Unity over OSC to an Apple Watch's Taptic Engine; these interactions were eventually shifted to an alternate project. Further, functional prototypes of a bezier-curve-based wire-and-node system made of particles were built for future functionality but not incorporated into the delivered project.

future directions:

  • multiple hands
  • multiple songs (modulated by the aforementioned bezier-wire node system)
  • multiple stems of each song
  • different particle effects per song section
  • distinct particle movements and persistences based on audio
  • a beginning and an end
  • granulation volume surging only when close to particles (requires future VFX Graph features for the CPU to access live GPU particle locations)

Special thanks to
Aman Tiwari for creating MeshToSDF Converter
Manuel Eisl for creating a Unity granulator
Keijiro Takahashi for creating the Low-latency Audio Signal Processing (LASP) plugin for Unity

points along the process


gray – telematic crit reflection

Perhaps the overall thrust of the piece was lost or unapparent when I showed three as-yet-disconnected interactional elements, without the unification that would make the overall experience legible on first viewing in crit. Many of the critiques noted that the completed piece would likely be more understandable once all of its elements are unified.

The categories in the DAIE sequence are almost more useful to me, reading their collation, than perhaps to the writers themselves. Simple sections such as Description make apparent whether I communicated the actual content fully, whereas the act of writing those uneditorialized observations might be less of an opportunity for the writers' own syntheses.

Sometimes certain fields were left blank, as if they were sacrificed as too difficult to complete amidst the conversation and the time constraint.

Some of the comments questioned whether the piece was more an instrument for creating melodies or a tool for exploring a prewritten melody. I built it for both, though I think the granular-synthesis method allows a prewritten melody to be played as an "instrument," as if the spatial structure and arrangement of the available notes afford the same ability to create melodies, even though their spatial pattern isn't chromatic.

I received some good feedback that the discrete structure-per-note arrangement of pianos and the like might easily apply to this piece: multiple separate volumes, each containing the full continuous mapping of its own granulator, but with the volumes themselves discrete.

gray – check-in

Much time spent getting Apple Watch haptics functioning. Weirdness with OSC transmission; works with a phone hotspot.

Found a good visual communication of distance: a circle closing in on the finger, with radius equal to the finger distance.


The overall goal is to allow the participant to draw out paths corresponding to ambient-song sound files, such that passing the hand through the now-floating, independently static paths replays that section of the sound file via a granulation script. This allows multiple limbs to act as maneuverable "ears".

gray-2Dphysics


Arc uses Unity's Visual Effect Graph, a compute-shader particle simulator.

I've been giving thought recently to the usefulness, even the necessity, of visually connecting structures to communicate the interrelation of material mechanisms. I was interested in exploring how connective visual structures might emerge out of disparate particles, with an eye toward eventually applying the work in 3D within a VR interface.

I'm quite happy with the results, having discovered how mapping the strength of the gravitational pull to the distance between the points produces emergent, lively structures that are continually surprising.
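A sketch of the kind of distance-mapped attraction involved, in plain C#; the actual falloff curve Arc uses may differ, and these constants are illustrative:

```csharp
using UnityEngine;

// Sketch of a distance-mapped attraction force of the sort behind Arc.
// The real simulation runs as VFX Graph compute-shader particles.
public static class ArcAttraction
{
    const float PullStrength = 2f;  // base attraction strength
    const float FalloffScale = 1f;  // distance over which the pull decays

    public static Vector2 Acceleration(Vector2 particlePos, Vector2 attractorPos)
    {
        Vector2 toAttractor = attractorPos - particlePos;
        float distance = toAttractor.magnitude;
        if (distance < 1e-5f)
            return Vector2.zero; // avoid divide-by-zero at the attractor

        // Pull strength decays smoothly with distance, which lets nearby
        // particles orbit tightly while distant ones drift slowly inward.
        float strength = PullStrength * Mathf.Exp(-distance / FalloffScale);
        return toAttractor / distance * strength;
    }
}
```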

graycrawford.itch.io/arc

gray-LookingOutwards1


L U N E by Cabbibo (Isaac Cohen) is a meditative, interactive VR piece involving cyclic envelopment within shimmering fabric enclosures and the arrangement of solid structures that support them.

Intended as a poem rather than a game, L U N E can evoke emotions from glee to melancholy as you explore the dynamics of the world he created. It is clearly influenced by experiences ranging from the comfort of childhood pillow forts to a mature recognition of cosmic insignificance.

I am drawn to L U N E’s almost-entropic forward progression, where the narrative-arrow-of-time is enforced by a series of one-way actions, while still remaining serene and preserving freedom at any step. Further, the shaders combined with the physical dynamics are innately enticing, and every action’s sonic element unifies the environment satisfyingly.

However, L U N E is quite short. With the cycle conceivably "completable" within a minute, there are few elements to provide substance beyond the inherent curiosity of the user, though the singularity is perhaps intentionally haiku-like, and the piece needn't overstay itself.

Created in Unity, with much of its code open-sourced, the experience provides material support for any works it inspires.

https://store.steampowered.com/app/485880/L_U_N_E/