Designing UI for Hand Tracking
Hand tracking hardware such as the HoloLens 2, Oculus Quest, and Leap Motion offers exciting new methods of input in VR and AR, but such a dramatic paradigm shift presents new challenges and requires new metaphors for interaction.
In this free on-demand workshop, Circuit Stream instructor Eric Carter explains many of the common pitfalls of hand tracking UI and offers potential solutions for each. He also suggests design techniques for approaching this frontier in creative and productive ways, so you can identify and address problems no one has encountered yet.
Resources: Common Terms and Design Resources PDF
During the workshop, we'll share:
- An introduction to hand tracking and criteria for evaluating the success of hand-tracked UI
- The limitations of modern hand tracking hardware
- Common methods for circumventing those limitations
- Answers to your questions about designing hand-tracked UI
At the end of the workshop, you'll have:
- A better understanding of what makes a good hand tracking UI
- A wide knowledge of industry standard interaction models
- A set of techniques that improve your ability to come up with your own hand tracking solutions
- A strong understanding of what features and debugging tools you’ll need to build into your hand tracking designs
To get the most out of this workshop, you'll need:
- A familiarity with VR and AR
- An interest in building software for hand tracking devices

An AR/VR device is not required for this workshop.
About Your Instructor:
Eric Carter is an XR designer who helped Microsoft ship hand tracking on the HoloLens 2. He then worked on Facebook Horizon with the team that made Oculus Quest's First Steps. With more than a decade of prior experience in game design, Eric brings a creative, user-centric focus to the projects he develops.