Up and Running with DepthKit!
Currently I have DepthKit Capture and Visualize installed on the studio’s Veggie Dumpling PC. It worked with the Kinect V2 almost immediately (after correcting a small filepath error), so DepthKit in the studio is now a go.
Further, after exporting the video from DepthKit Visualize (Visualize exists for macOS but is really not usable, at least on 10.12.1, as the UI is invisible), I successfully dropped the clip into Unity.
I am constructing a watertight box for the Kinect to begin underwater testing. In-studio testing has shown that high-quality cast acrylic does not interfere with the sensing ability of the Kinect. I will tentatively use the CMU pool next week (meeting with the aquatics director soon). Once I have a better idea about the sensing limitations in water, I will know what types of shots I am able to achieve. Further research into different types of underwater sensing has led me to be more optimistic; time-of-flight depth cameras have shown good results in papers like this.
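Since the Kinect V2 is a time-of-flight sensor, one thing worth keeping in mind for the pool tests: ToF cameras compute distance assuming the speed of light in air, but light travels slower in water (refractive index ≈ 1.33), so a raw depth reading taken through water should overestimate the true distance by roughly that factor. A back-of-envelope sketch (the function and constant names are mine, not from DepthKit, and this ignores refraction at the acrylic window and absorption/scattering):

```python
# A ToF camera reports d = c_air * t / 2; in water light travels at c_air / n,
# so the true distance is the raw reading divided by n (~1.33 for water).

N_WATER = 1.33  # approximate refractive index of water

def correct_underwater_depth(raw_depth_m: float) -> float:
    """Scale a raw ToF depth reading taken through water back to true metres."""
    return raw_depth_m / N_WATER

# A raw reading of 1.0 m through water corresponds to roughly 0.75 m.
print(correct_underwater_depth(1.0))
```

This is only the first-order geometric effect; in practice the modulated IR signal is also heavily attenuated by water, which is the bigger open question for the pool tests.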
Once the capture has been made, the question of the final media object remains. Now that I am able to get the clips into Unity, I have the possibility of making this a VR experience. I have a feeling that with the proper lighting and scene design it could be nice in the Oculus Rift. Otherwise, I will make a video or series of GIFs.
The box is constructed, but I still want to shore up the sealing. The area around the cord was not watertight despite my best efforts with silicone. I may need to redesign that face of the box. (Don’t panic, the Kinect is dry!)
Tests so far: splashes and ripples look pretty cool!
Submerged tests: mixed feelings on the results here so far; the actual depth being sensed seems pretty tiny. Definitely noisy. Looks more low-relief than anything else. That said, there is at least a visible output!
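If the noise stays a problem, a cheap first pass would be to median-filter each depth frame and then stretch the narrow depth band so the low relief becomes visible. A sketch with NumPy/SciPy, assuming depth frames arrive as 2D arrays in millimetres with 0 for no-return pixels (the function name and parameters are my own, not part of DepthKit):

```python
import numpy as np
from scipy.ndimage import median_filter

def denoise_and_stretch(depth_mm: np.ndarray, kernel: int = 5) -> np.ndarray:
    """Median-filter a depth frame, then stretch its narrow valid range to 0..1.

    Zeros are treated as invalid (the Kinect V2 reports 0 for no-return)
    and stay 0 in the output.
    """
    valid = depth_mm > 0
    smoothed = median_filter(depth_mm, size=kernel)  # knocks out speckle noise
    vals = smoothed[valid & (smoothed > 0)]
    if vals.size == 0:
        return np.zeros_like(depth_mm, dtype=float)
    lo, hi = np.percentile(vals, [2, 98])  # ignore outliers at both ends
    stretched = np.clip((smoothed - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    stretched[~valid] = 0.0
    return stretched
```

The percentile-based stretch is what makes a "pretty tiny" sensed depth range usable: even a few centimetres of relief gets mapped across the full output range instead of a sliver of it.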
Working in the Browser!!!