Since the inception of virtual reality, gaming controllers have dominated as the primary means of interaction in the virtual world. This paradigm, however, has always felt somewhat alien—a barrier rather than an enabler. Users manipulate virtual hands using buttons and joysticks, creating an experience that feels anything but natural. Despite the promise of hand tracking, the industry has been slow to fully embrace it, often retrofitting traditional controller inputs rather than pioneering hands-first interaction models.
Unity's introduction of the XR Interaction Toolkit (XRI) marked a significant step forward, offering a framework to support both controllers and hand tracking. However, this dual approach often results in compromises that diminish the potential of true hand interaction.
As the product lead on developer tooling at Ultraleap, my mission was clear: make it as intuitive as possible for developers to harness the power of Ultraleap hand tracking. Unity's dominance in XR application development is undeniable, so we invested heavily in a Unity plugin designed to integrate seamlessly with the engine.
Over the three years I spent immersed in hand tracking development, it became evident that developers needed greater freedom to innovate—tools that empower, not constrain. To address these challenges, we developed Physical Hands: a tool offering exceptional flexibility in user interactions and virtual hand representation through three distinct 'Contact Modes': Hard Contact, Soft Contact, and No Contact.
Hard Contact: Redefining Natural Interaction
Hard Contact negates the need for instructions. Users can flick, push, pull, grab, and throw objects just as they would in the real world. This mode's intuitive nature minimises the learning curve, making it ideal for enterprise training scenarios. The visual alignment of the virtual hand with physical interactions maintains VR immersion, ensuring that every movement feels authentic and seamless.
Yet, while Hard Contact looks simple and realistic, it does present challenges when the virtual hand does not perfectly map to the user's real hand movements. This slight disconnect can sometimes feel contrived, but the benefits often outweigh the drawbacks, especially in environments where users need to acclimatise quickly.
Soft Contact: Balancing Precision and Freedom
Soft Contact introduces a nuanced interaction model. Users can still push, pull, and flick objects, but picking them up requires a specific pinch or grab pose. This balance prevents unintended manipulations while offering freedom in interaction, a perfect blend for training applications. Soft Contact's 1:1 mirroring of real-world hand poses allows for a natural visual experience, even if the hand penetrates virtual objects.
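To make the pinch-gated grab concrete, here is a minimal sketch of one common way such a gate can be implemented: measuring the distance between the thumb and index fingertips, with hysteresis so the grab state does not flicker at the boundary. The threshold values and function names here are illustrative assumptions, not Ultraleap's actual implementation.

```python
import math

# Hypothetical thresholds in metres; a real system would tune these per device.
PINCH_START = 0.025  # fingertips closer than this begin a pinch
PINCH_END = 0.045    # fingertips further apart than this end a pinch

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def update_pinch(thumb_tip, index_tip, was_pinching):
    """Return whether the hand counts as pinching this frame.

    Using two thresholds (hysteresis) means a hand hovering right at the
    pinch distance doesn't rapidly grab and release an object.
    """
    d = distance(thumb_tip, index_tip)
    if was_pinching:
        return d < PINCH_END   # already pinching: only release past the wider bound
    return d < PINCH_START     # not pinching: must close past the tighter bound
```

The asymmetry between the two thresholds is what gives the interaction its stability: deliberately closing the fingers starts a grab, and only a deliberate opening ends it.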
This mode enhances user satisfaction by reflecting real-world movements accurately, providing a sense of full agency. For developers and users alike, Soft Contact's balance between control and freedom can be transformative, fostering more engaging and effective VR experiences.
Soft Contact is, in fact, an evolution of Leap Motion's earlier tool, the 'Interaction Engine'. We reworked that code to fit the contact-mode architecture, resulting in a more streamlined workflow.
No Contact: Simplifying Interaction for Specific Use Cases
No Contact streamlines interactions to their most basic form—grabbing and dropping objects. This mode excels in scenarios where users only need to pick up or drop items, such as procedural training. By allowing only intended interactions, No Contact offers simplicity and predictability. Note that this option can frustrate users who expect to do more than grab and drop; with proper onboarding, however, this is rarely an issue.
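The three modes above differ mainly in which interactions they permit. The following sketch models that difference as a simple capability table—a conceptual illustration only; the enum and function names are assumptions and do not mirror the plugin's actual API.

```python
from enum import Enum, auto

class ContactMode(Enum):
    HARD = auto()        # physics-driven hand: pushes, flicks and grabs via collisions
    SOFT = auto()        # hand mirrors the user 1:1; grabbing needs a pinch/grab pose
    NO_CONTACT = auto()  # no incidental physics; only deliberate grab/drop is recognised

def allows_push(mode):
    """Can the hand nudge objects just by touching them?"""
    return mode in (ContactMode.HARD, ContactMode.SOFT)

def allows_grab(mode, is_grab_pose):
    """Can the hand pick an object up right now?

    Hard Contact resolves grabs through physics, so any enclosing grip
    works; the other two modes gate grabbing on an explicit pose.
    """
    if mode is ContactMode.HARD:
        return True
    return is_grab_pose
```

Framed this way, choosing a contact mode is choosing which rows of the capability table an application exposes—Hard Contact everything, Soft Contact everything but pose-gated grabs, No Contact grabs alone.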
Conclusion
For hand tracking to achieve mainstream adoption, it must emulate real-world hand movements with accuracy and reliability. Physical Hands can be a catalyst for this transition, empowering developers to create hands-first interactions that transcend the limitations of controllers. This shift towards natural interaction lowers barriers to entry and elevates user expectations, particularly in kinematic learning and training applications.
I am promoting this tool because I am confident that Physical Hands can be the foundation for VR and AR hand tracking development: not only is it flexible, it is OpenXR compliant, which means anyone developing for OpenXR can use it. Whilst Ultraleap is no longer actively developing this feature, it is open source on GitHub (as is the entire Unity plugin), so I am hopeful that others will pick it up, because it genuinely has the potential to change how developers build hand tracking applications.