Nreal, a developer of ready-to-wear augmented reality (AR) smart glasses, has recently announced an important update to its Hand Tracking API. The update features improved gesture recognition and virtual object manipulation, including tracking of fast, self-occluded hand movements. The company also stated that the update makes app development easier, with improved stability and accuracy, lower latency, and new interaction functionalities.
The update has been made possible thanks to what Nreal states are “significant advancements in hand tracking development by The Nreal Team through the use of recent, deep-learning based AI technologies.” The company noted that it has trained a specifically designed deep neural network using millions of images of hands and their corresponding labels, with the joints and tips as keypoints. The resulting network model first detects hands seen by the cameras on the company’s Nreal Light glasses, and then calculates the position of the hand keypoints in 3D space.
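Nreal has not published the internals of this pipeline, but the second stage it describes, turning a 2D keypoint detection into a 3D position, can be sketched with a standard pinhole camera model. In this hypothetical example, `unproject`, the intrinsics (`fx`, `fy`, `cx`, `cy`), and the 21-keypoint convention are assumptions drawn from common hand-tracking practice, not confirmed NRSDK details.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HandKeypoints:
    # (x, y, z) keypoint positions in metres, in the camera coordinate frame.
    # 21 keypoints per hand (wrist + 4 joints per finger) is a common research
    # convention, assumed here for illustration.
    points: List[Tuple[float, float, float]]

def unproject(u: float, v: float, depth: float,
              fx: float, fy: float, cx: float, cy: float) -> Tuple[float, float, float]:
    """Lift a 2D pixel detection (u, v) with an estimated depth (metres)
    to a 3D point, using pinhole camera intrinsics."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

A detection at the principal point (`u == cx`, `v == cy`) unprojects straight down the optical axis, e.g. `unproject(320, 240, 1.0, 500, 500, 320, 240)` gives `(0.0, 0.0, 1.0)`.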
From these calculations, Nreal’s algorithm is not only able to calculate the position and orientation of a user’s hands (known as pose information) in the virtual world, but is also able to recognize specific hand gestures. These include point, pinch, grab, open hand, victory/peace sign, and thumbs up and down. This is all calculated in real time thanks to Nreal’s tailored deep-learning model and optimized hardware computations.
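Nreal’s gesture recognizer is a learned model, but the idea of deriving a gesture from 3D keypoint positions can be illustrated with a simple rule-based sketch: a “pinch”, for instance, can be flagged when the thumb tip and index fingertip come close together. The threshold value and function names here are assumptions for illustration only.

```python
import math
from typing import Tuple

Point3D = Tuple[float, float, float]

# Assumed threshold: fingertips closer than 2.5 cm count as a pinch.
PINCH_THRESHOLD_M = 0.025

def is_pinching(thumb_tip: Point3D, index_tip: Point3D) -> bool:
    """Rule-based stand-in for a learned gesture classifier: returns True
    when the Euclidean distance between the two fingertips is below the
    pinch threshold."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD_M
```

A production classifier would consider all keypoints (and their motion over time) rather than a single distance, which is part of why a trained network outperforms hand-written rules on fast or self-occluded movements.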
In addition to these improvements, Nreal’s Hand Tracking API now integrates with Microsoft’s Mixed Reality Toolkit (MRTK), granting application-level hand control functions. By using Nreal’s NRSDK and MRTK in combination, developers can create intuitive virtual object manipulation and content control experiences for their applications. Nreal stated that, as a result, apps developed for HoloLens devices with hand tracking can now run on Nreal devices, making it easier to transfer apps between platforms.
Hand tracking falls under the umbrella of gesture recognition, and provides a sense of presence, control and world manipulation when using XR applications. It reproduces gestures that people perform in real life for interacting with holographic objects in a virtual space. One of the benefits of using hand tracking over controllers in virtual spaces is that these interactions become more intuitive and efficient. The popularity of virtual reality (VR) headsets and the industry’s shift from controllers to hand tracking for some games have helped to demonstrate the appeal of hand tracking as an interaction tool for immersive experiences. Outside of entertainment, sectors such as healthcare have also adopted hand tracking technology to provide more intuitive interactions for medical staff, for example.
In its announcement, Nreal stated: “We are excited to share this step in our journey with you, as it opens up even greater possibilities for the future of AR development and the capabilities of AI. In comparison to the previous NRSDK, our new Hand Tracking API reduces interaction development cost alongside improved accuracy and stability.”
Nreal has developed two source applications that are now available for developers to download. For more information on Nreal and its augmented reality solutions, please visit the company’s website.
Image / video credit: Nreal / YouTube