Hand Physics Toolkit (HPTK) is a toolkit for building physical hand-driven interactions in a modular and scalable way. Hand physics and hover/touch/grab detection are included as modules. The toolkit can be combined with MRTK-Quest for UI interactions. Only Oculus Quest is supported at the moment.
- Data model to access hand components and lerp values, which can be composed into gestures that trigger actions (see the sketch after this list).
- State-of-the-art hand physics that can be configured in detail through configuration assets.
- Hover/Touch/Grab detection with support for interactions involving multiple objects and hands.
- Code architecture based on isolated modules, with support for custom modules (Wiki).
- Input abstraction; a RelativeSkeletonTracker is included to mimic other hands (Wiki).
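As an illustration of how per-finger lerp values can be composed into a gesture, here is a minimal sketch. The interface and member names (`IHandData`, `GetFingerLerp`) are hypothetical placeholders, not HPTK's actual API; check the Wiki for the real data model classes.

```csharp
using UnityEngine;

// Hypothetical view of the hand data model: a normalized 0..1 value
// describing how closed each finger is. Placeholder for illustration only.
public interface IHandData
{
    float GetFingerLerp(int fingerIndex);
}

public class PinchGestureExample : MonoBehaviour
{
    public MonoBehaviour handDataSource;      // any component implementing IHandData
    [Range(0f, 1f)] public float threshold = 0.8f;

    bool wasPinching;

    void Update()
    {
        var hand = handDataSource as IHandData;
        if (hand == null) return;

        // Compose a "pinch" gesture from thumb (0) and index (1) lerp values.
        bool pinching = hand.GetFingerLerp(0) > threshold &&
                        hand.GetFingerLerp(1) > threshold;

        if (pinching && !wasPinching) Debug.Log("Pinch started -> trigger action");
        if (!pinching && wasPinching) Debug.Log("Pinch ended");

        wasPinching = pinching;
    }
}
```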
- You can clone a ready-to-go project at HPTK-Sample.
- Unity 2019.4.4f1 LTS, 2019.3.15f1
- Oculus Integration 20.0
- Oculus Quest - Android
- Universal Render Pipeline (URP)
- Obtain HPTK.
- Import Oculus Integration.
- Configure Build Settings (Oculus Quest); see the editor-script sketch after this list.
- Configure Project Settings (important!).
- Set up a scene with hand tracking support (Oculus Quest); a runtime tracking check is sketched after this list.
- Set up HPTK-specific components.
- Set up platform-specific HPTK components (Oculus Quest).
- Modify or create HPTK Configuration Assets (if needed).
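The Build Settings and Project Settings steps are normally done through the Unity Editor UI. The sketch below shows how the same changes could be scripted; the physics values are illustrative placeholders, so use the values recommended in the Wiki for your project.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Sketch of the "Configure Build Settings" and "Configure Project Settings" steps.
// The physics values are placeholders; follow the HPTK Wiki for the real recommendations.
public static class QuestSetupExample
{
    [MenuItem("Tools/HPTK Sample/Switch To Quest (Android)")]
    public static void SwitchToAndroid()
    {
        // Oculus Quest builds target the Android platform.
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Android, BuildTarget.Android);
    }

    [MenuItem("Tools/HPTK Sample/Apply Example Physics Settings")]
    public static void ApplyPhysicsSettings()
    {
        // Physics-driven hands usually benefit from a smaller fixed timestep
        // and more solver iterations (placeholder values).
        Time.fixedDeltaTime = 1f / 90f;
        Physics.defaultSolverIterations = 12;
        Physics.defaultSolverVelocityIterations = 4;
    }
}
#endif
```

For the "scene with hand tracking support" step, a quick runtime check can confirm that Oculus hand tracking is working before layering HPTK on top. This sketch assumes the `OVRHand` component from Oculus Integration is present on the hand objects (e.g. the OVRHandPrefab instances under the OVRCameraRig anchors).

```csharp
using UnityEngine;

// Logs tracking state and index pinch strength for both hands.
// Assumes OVRHand (Oculus Integration) is assigned in the Inspector.
public class HandTrackingCheck : MonoBehaviour
{
    public OVRHand leftHand;
    public OVRHand rightHand;

    void Update()
    {
        Report("Left", leftHand);
        Report("Right", rightHand);
    }

    void Report(string label, OVRHand hand)
    {
        if (hand == null) return;

        if (hand.IsTracked)
        {
            float pinch = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"{label} hand tracked, index pinch strength: {pinch:0.00}");
        }
        else
        {
            Debug.Log($"{label} hand not tracked");
        }
    }
}
```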
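HPTK Configuration Assets are edited through the Inspector; no scripting is required for the last step.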
Check out the Wiki for a detailed step-by-step guide.
The Wiki also includes more details about:
- Modules overview.
- Getting started with HPTK.
- How to build new HPTK modules.
Jorge Juan González - HCI Researcher at I3A (University of Castilla-La Mancha)
Oxters Wyzgowski - GitHub - Twitter
Michael Stevenson - GitHub
Nasim, K. and Kim, Y. J. (2018). Physics-based assistive grasping for robust object manipulation in virtual reality. Computer Animation and Virtual Worlds, 29:e1820. https://doi.org/10.1002/cav.1820
Linn, Allison (2016). Talking with your hands: How Microsoft researchers are moving beyond keyboard and mouse. The AI Blog, Microsoft. https://blogs.microsoft.com/