The immersive healthcare solutions provider joins the next wave of input tech
This week, Osso VR, which provides immersive training for healthcare professionals, introduced hand-tracking input options for its leading VR learning platform.
“Hand Control” for Osso VR’s healthcare training platform introduces new tracking options that remove the need for a controller when navigating and interacting with immersive training scenarios.
The “ground-breaking” feature lets healthcare professionals take part in training scenarios with heightened immersion, thanks to hand-tracking inputs. Osso VR currently helps healthcare firms scale and deliver XR with a platform for creating custom VR learning content.
More on Osso VR
Osso VR’s solution trains surgical professionals before they conduct real-life surgeries in operating rooms, removing obstacles that traditional educational models face, such as travel, scheduling, and collaboration hurdles.
Surgeons can train repeatedly with bespoke modules to boost their efficacy, skills, and knowledge retention using vital instructional tools, while monitoring performance and receiving feedback.
Osso VR has numerous medical partnerships with major firms such as Johnson & Johnson, Zimmer Biomet, Stryker, and Smith + Nephew across surgical specialities, including cardiology, spinal, and general surgery.
Hand-Tracking to Rule XR Input?
The news of Osso VR’s inclusion of hand-tracking input comes as the immersive industry shifts gears to support body-tracking input over controllers.
2024 is the year of body-tracking. In the wake of Apple’s spatial computing showcase, which unveiled the Vision Pro, a device that doesn’t include a controller and opts for hand-tracking input, many firms are doubling down on support for the next generation of XR input.
Recently, PICO released its OS version 5.7 update, which enhanced the tracking abilities of the firm’s PICO 4 and PICO Neo 3 Link MR headsets.
OS 5.7 has “comprehensively upgraded” hand-tracking-based interactions using “improved smart algorithms” that reduce detection and tracking errors, such as virtual hand jitter and flicker. Moreover, PICO improved controller-less input with more accurate and stable motion tracking.
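PICO has not published its algorithms, but jitter of this kind is typically tamed by filtering raw tracked positions before rendering. The sketch below is purely illustrative (an exponential moving average, not PICO's actual method): each incoming joint position is blended with the previous smoothed one, trading a little latency for stability.

```python
def smooth_positions(raw, alpha=0.3):
    """Exponentially smooth a stream of tracked joint positions.

    raw: iterable of (x, y, z) tuples from a hand-tracking source.
    alpha in (0, 1]: lower values smooth more (less jitter, more lag).
    Illustrative only; real runtimes use more sophisticated filters.
    """
    smoothed = []
    prev = None
    for pos in raw:
        if prev is None:
            prev = pos  # first sample passes through unchanged
        else:
            # Blend the new sample with the previous smoothed value.
            prev = tuple(alpha * c + (1 - alpha) * p
                         for c, p in zip(pos, prev))
        smoothed.append(prev)
    return smoothed

# A noisy, nearly static fingertip: smoothing pulls samples together.
noisy = [(0.0, 0.0, 0.0), (0.02, -0.01, 0.01), (-0.01, 0.02, 0.0)]
print(smooth_positions(noisy))
```

Tuning `alpha` is the whole trade-off: too low and the virtual hand lags the real one, too high and the jitter returns.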
Additionally, Unity – a core development platform for the Vision Pro – updated its XR Interaction Toolkit (XRI) with deeper eye- and hand-tracking development features. The news comes as Unity supplies SDKs to help port metaverse platforms, such as Rec Room, to the Vision Pro with hand-tracking input.
XRI 2.3 includes features such as:
- XRI Aim Assistance: This improves the usability of eye-based aiming and selection by extending the functionality across controller devices.
- XR Gaze: A feature that allows application users to aim cursors with their eyes and select points of interest with a controller.
- XR Gaze Assistance: An additional component that auto-adjusts trajectories when a user throws in-application objects, improving accuracy.
- XR Input Modality Manager: A service that manages, at runtime, which hand and controller visualisations are shown when a user switches between the two.
- Reactive Hand Visuals: Providing interaction-reactive visuals that improve digital representations of a user’s fingers.
- XR Origin Hands: A feature that highlights individual fingers during an in-application interaction.
- Hand-Tracking features for its XR Device Simulator: A programme that imitates device-specific spatial environments, now with a series of standard, pre-defined hand poses to support XR application development and testing.
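Unity's XRI components are C# scripts, but the behaviour the XR Input Modality Manager describes – showing only the visuals for whichever input is active – can be sketched language-agnostically. The Python below is an illustrative model under assumed semantics (all names are hypothetical; this is not Unity's actual API): controllers take priority when held, hands take over when tracked, and visuals follow the active modality.

```python
class InputModalityManager:
    """Toy model of runtime switching between hand-tracking and
    controller input. Hypothetical names, not Unity's XRI API."""

    def __init__(self):
        self.active = None  # "hands", "controllers", or None
        self.visible = {"hands": False, "controllers": False}

    def on_device_update(self, hands_tracked, controllers_held):
        # Controllers win when picked up; otherwise fall back to
        # hand-tracking whenever the cameras can see the hands.
        if controllers_held:
            self.active = "controllers"
        elif hands_tracked:
            self.active = "hands"
        else:
            self.active = None
        # Show only the active modality's visualisation.
        for modality in self.visible:
            self.visible[modality] = (modality == self.active)

mgr = InputModalityManager()
mgr.on_device_update(hands_tracked=True, controllers_held=False)
print(mgr.active, mgr.visible)  # hands shown
mgr.on_device_update(hands_tracked=True, controllers_held=True)
print(mgr.active, mgr.visible)  # controllers win when held
```

The priority ordering here is one reasonable choice; a real manager would also debounce rapid switches so visuals do not flicker between modalities.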
Hand-tracking is gaining adoption across the immersive sector, from Meta to PICO to Apple. It is clear that all forms of body-tracking will play a role in XR.
As XR body-tracking input scales, enterprise end-users will also benefit from the next evolution of XR input, which significantly improves the usability, accessibility, and inclusiveness of XR headsets.