Design & Art Direction
Selected Interaction Design, User Experience, UI, Design Research and Art Direction
Interaction designer, researcher & videographer, in collaboration with colleagues at Microsoft Research Cambridge.
Microsoft Research's computer vision experts developed a new hand-tracking system that accurately reconstructs complex hand poses using only a single depth camera, tracking sophisticated and nuanced hand motions in real time.
Fully articulated hand tracking promises to enable fundamentally new interactions with virtual and augmented worlds. We designed and built virtual controls and interaction experiences to test the capabilities and boundaries of the system and to enable new possibilities for human-computer interaction. We presented this work at SIGGRAPH 2016; the research would later be transferred into HoloLens 2.
Recent rapid advances in display and head-tracking technology are at last bringing virtual reality and augmented reality to the consumer mainstream. But perhaps just as important as the ability to display virtual objects to the user is the user's ability to interact with those virtual objects and environments as naturally as possible.
In the real world, we use our hands to reach out and touch objects, which react to our interactions according to the laws of physics. Can uninstrumented hand tracking allow natural interaction with virtual objects in a physically plausible manner?
We demonstrate a number of diverse user experience prototypes. These include examples of discrete gesture recognition (such as a thumb click to engage or disengage), retargeting the hand to new 3D hand avatars, painting, pointing, 3D thumb-stick control, and the exciting frontier of physics-based interaction, including prototypes of piano playing and typing, controlling a marionette via virtual strings attached to the fingers, deforming a 3D model, scratching records, and interacting with a 3D GUI through pure physics. Most of these experiences were built in Unity, hooked up to the Oculus DK2 VR headset and using a Kinect v2 camera for input.
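To give a flavour of how a discrete gesture like the thumb click might be recognized from articulated tracking output, here is a minimal sketch. The joint names, distance thresholds, and hysteresis scheme are illustrative assumptions for this example, not the actual implementation used in the prototypes.

```python
# Hedged sketch: recognizing a discrete "thumb click" gesture from
# per-frame 3D joint positions, using distance thresholds with
# hysteresis so the click does not chatter near the boundary.
# All names and thresholds here are assumptions for illustration.

class ThumbClickDetector:
    """Fires a single click event on the frame the thumb tip first
    closes onto the side of the index finger; releases only once the
    thumb moves clearly away again (hysteresis)."""

    def __init__(self, engage_dist=0.02, release_dist=0.035):
        # Distances in metres; engage_dist < release_dist gives hysteresis.
        self.engage_dist = engage_dist
        self.release_dist = release_dist
        self.engaged = False

    def update(self, thumb_tip, index_mid):
        """Feed one frame of 3D positions (x, y, z tuples).
        Returns True only on the frame the click first engages."""
        d = sum((a - b) ** 2 for a, b in zip(thumb_tip, index_mid)) ** 0.5
        if not self.engaged and d < self.engage_dist:
            self.engaged = True
            return True  # click event fires once
        if self.engaged and d > self.release_dist:
            self.engaged = False  # released; ready for the next click
        return False
```

A per-frame loop would call `update` with the tracker's thumb-tip and index-finger positions and use the returned event to engage or disengage a control, much like a mouse button.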
An ongoing concern regarding virtual object manipulation is the importance of haptic feedback. We believe there are several ways to address this. First, with the stereo disparity cues offered by VR headsets, it is easy to position your hand relative to virtual objects, and visual and auditory cues can then indicate interaction events to the user. We were genuinely surprised by how well this works and how natural it feels. Second, one can design the experience so that affordances are thin, such that the user ends up making a pinch and their own fingers provide a form of feedback. Finally, research in in-air haptics continues apace.