- Which developments in VR/AR and metaverse learning/working will play a role in 2023?
- What might the 2023 trends in VR/AR learning/working look like?
- What will happen to "the" metaverse?
- What are the potential challenges?
- Where do you see deployment scenarios and implementations?
MR
Microsoft has recently announced that it is expanding the cross-platform interoperability for its Mixed Reality Toolkit 3 (MRTK3) to now support Qualcomm’s Snapdragon Spaces platform.
According to Microsoft, cross-platform tools such as MRTK3, which work across the full spectrum of mixed reality (MR) devices, are important to making app creation less fragmented. This reduced fragmentation saves time for mixed reality app and game developers, who often use a broad range of tools and hardware to support their solutions and creative vision, including for augmented and virtual reality (AR/VR) projects.
Microsoft stated that the growing adoption of OpenXR has been key to moving towards a more builder-friendly MR ecosystem. OpenXR is an open royalty-free API standard from The Khronos Group that provides native access to a wide range of devices from many vendors across the mixed reality spectrum. Because MRTK3 is built natively on OpenXR, it is highly portable across OpenXR-based devices including HoloLens 2, Meta Quest 2, Magic Leap 2, and Ultraleap, among others, according to Microsoft.
This latest expansion of MRTK3’s interoperability means that developers using MRTK now have more places to land their applications with little to no platform-specific code, according to Microsoft. Snapdragon Spaces enables developers to build immersive applications for AR glasses from scratch or add head-worn AR features to existing Android smartphone applications. Because it is conformant to the Khronos OpenXR runtime specification, Microsoft stated that developers will find that many MRTK3 features work out of the box.

Developers targeting Snapdragon Spaces can now use MRTK3 Public Preview to build rich and expressive volumetric UI, like touchable sliders, buttons, toggles, and more. In addition, MRTK3 makes it easier to build performant applications with highly optimized shaders and rendering tools specifically tuned for mobile devices, according to Microsoft.
The company added that in the near future, when MRTK3’s full range of capabilities is implemented on Snapdragon Spaces, developers will be able to do even more with the toolkit. This will include data binding, theming and a more straightforward way to implement object manipulations, like grabbing and resizing 3D objects.
In a blog post on the announcement, Microsoft stated: “We are thrilled to see Qualcomm and so many other platform and device makers coalesce around open standards. This will ensure that developers’ investments in OpenXR-based tools will remain valuable even as new devices and platforms emerge. More importantly, reducing platform fragmentation will allow developers more time to innovate, solve problems, and delight users.”
For more information on Microsoft’s MRTK3 solution for cross-platform mixed reality development, please visit the company’s website.
Source:
Photo: Qualcomm’s Snapdragon Spaces platform includes many features for building various types of AR experiences.
Image / video credit: Microsoft / Qualcomm / YouTube
Here’s the definitive guide to XR technology and its multitude of use cases
XR Today is tackling the confusing subject of extended reality (XR), an up-and-coming technology that seamlessly blends virtual and augmented reality (VR/AR), primarily for enterprise use.
Additionally, immersive firms leverage XR to build solutions for engineers, soldiers, medical professionals, and even pilots. These professionals use new devices to merge the physical and virtual worlds with creative, engaging immersive content.
According to Precedence Research, the global XR market reached $35.14 billion this year. It is expected to top $345.9 billion by 2030, a compound annual growth rate (CAGR) of 33.09 percent.
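Those figures are internally consistent, which is easy to verify with a quick compound-growth calculation. The sketch below assumes the $35.14 billion base refers to 2022 (inferred from the 2030 horizon; the article only says "this year"):

```python
# Sanity-check the Precedence Research forecast: compound the base-year
# market size at the stated CAGR out to 2030.
base = 35.14    # global XR market in USD billions (assumed base year: 2022)
cagr = 0.3309   # 33.09 percent compound annual growth rate
years = 8       # 2022 -> 2030

forecast = base * (1 + cagr) ** years
print(round(forecast, 1))  # → 345.9, matching the reported 2030 figure
```

Eight years of 33 percent compounding multiplies the market nearly tenfold, which is what makes the $345.9 billion headline number plausible rather than a typo.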
During the COVID-19 pandemic, XR saw a tremendous acceleration in adoption and new solutions across industry verticals. Sectors such as retail, healthcare, remote guidance, and even the military leveraged the technology to circumvent the pandemic's challenges.
Here are some reasons businesses worldwide turn to XR to digitally transform their operations.
Defining XR
XR blends the physical and virtual worlds with digital overlays that interact in real time with real-life objects. Combined with cutting-edge hand and eye tracking, this merges real-time 3D (RT3D) content with a person’s field of view (FoV).
Most recently, companies have begun unlocking XR’s potential with new headsets, including Varjo’s XR-3, Microsoft’s HoloLens 2, Pico Interactive’s Pico 4 Enterprise, and Meta Platforms’ Quest Pro. Apple’s long-awaited headset is also expected to release in 2023, following ongoing revisions.
Along with hardware, these companies are creating comprehensive software solutions for their respective headset lineups. This allows for simultaneous mobile device management (MDM), rapid deployments, and increased versatility for independent software vendors (ISVs).
The Three Types of Extended Reality
There are currently three forms of extended reality, each defined by the level of interaction between the virtual and real worlds. Here’s a closer look at each:
Augmented Reality (AR)
Augmented Reality is the most widely-used XR format and expands on the real world with digital overlays. Devices can use them to project documents, images, text and video streams on a user’s FoV.
Comparatively, AR is the most accessible of the three, as people can readily use it on their smartphones. This creates a broader adoption rate than VR and MR, which require headsets to function.
Global audiences experienced AR first-hand with Niantic’s debut of Pokémon Go. Meanwhile, at the enterprise level, AR has evolved drastically, with applications including:
- Delivering real-time guidance and information to professionals for better productivity
- Offering hands-free assistance to staff by streaming information to smart glasses
- Generating stronger communicative and collaborative immersive experiences
- Boosting retail interactions with ‘try-before-you-buy’ experiences for customers
- Offering remote guidance with service, support, and walkthrough experiences.
New devices like the Magic Leap 2 are set to completely disrupt the AR market with cutting-edge technologies for enterprise use cases.
Virtual Reality (VR)
Virtual Reality immerses the user completely in unique, interactive virtual spaces. Unlike AR, people using VR devices employ fully-immersive headsets, base stations, sensors, and controllers. Some devices also tether to PCs with high-end graphical processors for optimal functionality.
VR first rose to fame in the gaming industry to offer gamers immersive experiences tethered to gaming consoles. Sony’s PlayStation VR (PSVR) and upcoming PSVR 2 are examples of such designs.
With today’s modern VR devices, users can:
- Safely train employees to enter dangerous work environments
- Create virtual meeting rooms for collaboration between teams
- Improve efficiency and productivity in product design processes
- Enable the quick prototyping and discovery of new product ideas
- Create unique consumer/brand experiences for major companies
Mixed Reality (MR)
Mixed Reality is where things start to get a little confusing for most in the XR landscape. With MR, users can join holographic meetings with colleagues or interact with digital twins of a product. They can also design products collaboratively, in the same immersive space and in real-time.
One of the best examples of this is Varjo’s XR-3 headsets with Autodesk VRED and Reality Cloud solutions. Despite MR remaining in its early stages, firms are developing a growing ecosystem of solutions for current headsets.
This specific format can:
- Align teams across geographies using holograms
- Create immersive product-building and design experiences
- Enhance training and development strategies
- Offer real-time guidance and support to staff in dangerous situations
- Change the way users interact with machinery in the modern world
What Can Extended Reality Do?
Developers are just scratching the surface of XR. Research suggests around 60 percent of people believe XR will become a mainstream environment in the next five years. XR’s meteoric rise is also expected to empower businesses to develop metaverse solutions for enterprise users.
By combining virtual and real landscapes, XR can build use cases for:
Customer Experiences (CX)
With AR, companies can provide customers with try-on experiences for clothing, furniture, motor vehicle parts, and other products before purchasing. VR can also engage customers with immersive demos, support, and service walkthroughs.
Training and development
XR has become synonymous with education, with many companies leveraging the technology across universities, human resource (HR) departments, and even military bases.
All XR formats have become increasingly vital for upskilling, training, and building diversity, equity, and inclusion (DE&I) across workforces. Companies like Moth and Flame or Oberon Technologies offer on-demand, bespoke training for real-life skills in dangerous work environments.
Remote Work
Workers can now connect easily with other teams and collaborate in real-time with immersive platforms. With MR and AR, specialists can even guide team members remotely through complex tasks, like how to fix a problem on a piece of manufacturing machinery.
Marketing and Sales
Firms are even exploring opportunities to engage customers with XR technologies. Many use VR showrooms to debut new products and build immersive worlds with competitions and campaigns.
Entertainment and Events
VR events are quickly trending across the industry, with enterprise-ready solutions like Arthur, AltSpaceVR, rooom, Horizon Worlds, and Improbable building future event platforms.
What Will the Future of XR Look Like?
XR has gained significant attention, but there are still challenges to overcome. Companies investing in extended reality’s future will need to carefully consider how to tackle issues like:
- Expensive hardware: While affordable XR tools are emerging across the current market, the most advanced ones are still rather expensive.
- Consumer comfort: Clunky, uncomfortable, and disorienting headsets can make spending more time in XR difficult. Consumer comfort is essential for the future of XR development.
- Security and safety: A lot of the benefits of XR rely on tools to protect data and information. Developers will need to address safety and security to facilitate this.
The XR environment is ideal for offering everything from remote assistance to immersive collaboration. With emerging technologies like 5G, artificial intelligence (AI), and blockchain, XR is expected to strengthen and push the boundaries of the immersive experience.
Source:
XR Today takes a closer look at the subtle difference between the two immersive mediums
Companies exploring extended reality (XR) for productivity, creativity, and collaboration are expanding its use cases. Many firms navigating the subtleties of mixed reality vs augmented reality may wonder how the two differ.
Learning the difference between mixed reality and augmented reality (MR/AR) is crucial, as it allows companies to invest in the digital medium best suited to their objectives.
While many people have a general understanding of AR and MR, they struggle to tell the two apart. Both AR and MR experiences, for instance, involve blending the virtual and real worlds, and many people think that MR is just another term for AR.
The reality, no pun intended, is that MR and AR are two very different concepts.
Here’s a closer look at the two.
AR and the Digital Overlay
AR is one of the more well-known forms of XR. With AR, users can overlay digital content on the real world.
In consumer markets, AR Snapchat filters can add puppy ears to a person’s face with smartphone cameras and face-tracking software.
Augmented reality has become extremely valuable today due to its accessibility. Tools like Apple’s ARKit and Google’s ARCore allow users to develop their own immersive experiences for smartphones and immersive devices such as smart glasses.
Smart glasses overlay digital content in real time via waveguides, allowing people to see content overlaid in their field of view (FoV) at all times.
Companies such as Qualcomm are facilitating this even more with their ground-breaking Snapdragon AR2 platform, which provides essential processing kit for smart glass developers.
AR is quickly making its way into a variety of settings. Retailers use it to help customers visualize a product before they buy it. Engineers also employ AR to access valuable information about a product without fumbling with physical manuals.
Companies such as Taqtile and Kognitiv Spark are saving immense amounts of time, money, and manpower with remote guidance systems capable of resolving problems or walking employees through critical fixes.
This achieves reduced equipment downtimes with rapid repair turnaround times, even if staff are thousands of miles away from headquarters.
As smart glasses become more consumerised and affordable, the demand for AR is growing.
MR: When Digital Worlds Collide
So, what makes MR different from AR? Firstly, it goes further than AR when it comes to immersion.
In MR, immersive content is not just superimposed on the real world; it is designed so that users can interact with it seamlessly.
This form of MR deepens immersion by boosting interaction with the real world, using numerous forward-facing cameras.
In another form of MR, digital environments replace the real world by fully immersing users with a headset, similar to VR.
Examples include the full-colour passthrough capabilities of the Meta Quest Pro and Pico 4 Enterprise, which feature cutting-edge technologies for professional use.
MR can leverage both VR and AR, blending digital overlays and content seamlessly. This creates a deeply interactive virtual experience for tech enthusiasts.
Apple is also set to release its MR headset, tentatively in early 2023, though the launch may be delayed as late as 2026 due to design flaws.
Despite this, the device may impress Apple fans worldwide with immense M1 processing capabilities and next-generation passthrough tech.
People are now capable of creating an entirely new reality by combining the physical and digital environments.
Exploring AR and MR
For many companies, AR will be one of the easiest forms of immersive technology to grow across the industry.
It is accessible because people can create applications and tools that work on smartphones, smart glasses, and headsets. However, as hardware solutions continue to evolve, so will MR.
Many leading companies are already exploring MR. In the future, MR could create brand new tools for collaboration, productivity, and interaction.
VR and AR can significantly improve humanity’s quality of life and work. Tech companies will release future MR solutions merging the two for the benefit of all.
Source:
Vrgineers, a provider of next-generation virtual and mixed reality (VR/MR) pilot training systems, and DigiFlight, a provider of cybersecurity, IT, aerospace, and training solutions with extensive worldwide pilot training experience, have this week announced that they have partnered to produce a modern and comprehensive training solution for the Apache multi-role attack helicopter.
With over 1,200 Apache aircraft currently in service, the Advanced Mixed Reality Apache Trainer (AMRAT) will accelerate pilot training with high-fidelity MR and, according to the companies, significantly reduce the number of aircraft hours normally required to support individual and crew training tasks.
The AMRAT is built on a proprietary portable platform designed by Vrgineers, which was originally created for the US Air Force (USAF). The platform provides an affordable, immersive, and realistic experience without requiring expensive visual display systems, complicated support systems, or unique facility design requirements.
The trainer includes two separate but interconnected crew stations (the pilot and co-pilot gunner stations), with replicated switches, panels, flight instruments, and displays connected to a computer and an integrated VR image generator. Currently, the platform runs on immersive flight simulator software provided by ED Mission Systems.

“The possibility to conduct a seamless mixed reality environment with a portable haptic flight seat run by a true-to-life simulation engine ensures the pilot’s proficiency, but most importantly delivers the sensation of a unique and genuine experience encompassing complex operational scenarios. This enables our users to save time and effort along with minimizing the cost of training infrastructure and related running hours,” said Matthias Techmanski, Director at ED Mission Systems.

The novel training solution is the result of the cooperation between Vrgineers, and DigiFlight. Commenting on the partnership, Marek Polcak, CEO of Vrgineers, said: “We strongly feel that by combining our skills, we are able to deliver a training device capable of supporting Apache training tasks, including individual and crew skills, preparing trainees to operate in any environment, and fulfilling critical mission requirements.”
Polcak added, “This type of modern trainer is expected to play a crucial role in pilot training in the future. Not only for Apache, but for Defiant (LMCO & Boeing) and Valor (Bell), which are currently being developed and competing in future vertical lift programs in the US.”
Vrgineers and DigiFlight will be jointly presenting their conceptual MR Apache Trainer at I/ITSEC in Orlando, Florida next week. For more information on Vrgineers and its VR and MR pilot training systems, please visit the company’s website.
Source:
Image credit: Vrgineers
Arthur, a virtual reality (VR) office provider that enables enterprises to create large-scale virtual offices with fully immersive and collaborative environments, has this week announced the arrival of ‘New Realities’ to its virtual office solution.
The New Realities update unlocks the first generation of mixed reality (MR) features in Arthur’s virtual offices. These features allow users to bring a physical desk and keyboard into VR and access the global passthrough feature showcased by Meta at its Connect 2022 conference. The update from Arthur comes only two weeks after the company’s platform was also featured during the Meta event, which highlighted how Arthur’s VR office solutions make MR features available to users.

Arthur states that it has been building a strong product, team, and understanding of what enterprises need from a VR office solution and today provides a range of Fortune 500 businesses and large-scale organizations with a VR office solution focused on enabling the best remote productivity. Arthur’s clients include Societe Generale and the United Nations, who leverage the company’s VR solution to reimagine work to suit the changing world and employee needs. Arthur added that the New Realities update will unlock a step change by opening an entirely new suite of possibilities.
“As a company with a mission of transforming the way we work, mixed reality is an essential next step in our product’s evolution,” said Arthur’s Founder and CEO, Christoph Fleischmann. “With mixed reality, I have more control over the level of immersion I want for my meeting, and I can easily bring physical tools, such as my keyboard or desk surface into a VR meeting, in order to be more productive and connected. The Next Realities update will help us unlock the next frontier in enabling a future of work that is better and apt for our current world and needs.”
Alongside the New Realities update, Arthur has also announced an upgrade to ‘Arthur Consumer’, a lite version of its enterprise solution. With the new Consumer experience, Arthur wants to enable small teams, entrepreneurs, and individuals to get a taste of its VR office solution.
The upgraded experience will offer three rooms built on three of Arthur’s latest office environments to every registered user for free. Key updates announced for the Arthur platform include:
- New Realities: In addition to virtual reality, Arthur’s solution will include mixed reality setups;
- 1st generation of MR features: Users can use their physical desks and keyboards in Arthur. The global passthrough feature will also be available on supported devices;
- Quest Pro: Arthur’s virtual office application is now available on the newly launched Meta Quest Pro device;
- Upgraded consumer experience: The Arthur Consumer solution will offer three rooms, built on three different environments, for free to all its registered users.
Arthur states that its platform helps to address the needs of a wide variety of global industries, including energy, pharmaceuticals, consulting, insurance, and finance. The company also noted that it is continuing to gain traction, with usage continuously increasing. In 2021 alone, enterprise users spent a total of more than 1.6 million minutes in Arthur. The company is now anticipating that this latest ‘New Realities’ update, as well as upcoming productivity updates, will offer even more promise for the future.
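To put that usage figure in perspective, a quick back-of-the-envelope conversion helps. Only the 1.6 million minutes comes from Arthur's announcement; the rest is simple unit arithmetic:

```python
# Convert Arthur's reported 2021 enterprise usage into more intuitive units.
total_minutes = 1_600_000        # figure reported by Arthur for 2021

hours = total_minutes / 60       # minutes -> hours
days = hours / 24                # hours -> equivalent days of continuous use

print(round(hours), round(days))  # → 26667 1111
```

That works out to roughly 26,700 hours, or about three years of nonstop, round-the-clock use packed into a single calendar year across Arthur's user base.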
For more information on Arthur and its VR and MR solutions for collaborative environments, please visit the company’s website.
Source:
https://www.auganix.org/arthur-announces-new-realities-update-with-mixed-reality-features/
Image credit: Arthur / YouTube
These Meta Quest 2 passthrough experiences blend the real world with the digital.
Passthrough uses the Quest’s external cameras to provide real-time visualization of the physical world. Last year, Meta made their Passthrough API available to developers who have used it to create a range of different apps that blend virtual objects with your real-world surroundings.
There are a few games in the official Quest Store, like Cubism, Gadgeteer, and Blaston, that feature a passthrough mode. Whilst it can be fun to experience these games using the passthrough option, it’s included as an added extra that’s non-essential to the game.
In this article, we will only be looking at those games and apps where passthrough is essential to the core gameplay. The experiences in this list have all been designed specifically to be used with the Quest’s passthrough feature.
Because dedicated passthrough experiences are quite new and experimental, the games and apps featured on this list are available from SideQuest and App Lab.
The World Beyond
Price: Free
Whilst this experience is only a short, five-minute tech demo developed by Meta to showcase the potential of their Presence Platform, it offers the perfect introduction to passthrough and mixed reality on the Quest.
The demo starts with a brief setup process that requires you to virtually map the walls and furniture in your surroundings. You’re then given a mechanical device to help you locate a little alien creature called Oppy and the energy orbs that she eats, which you’ll find hidden around your play space.
Oppy responds to some voice commands and will react to being petted with your hands. She will also respond in clever ways to the environment by navigating around any mapped furniture or obstacles in her way.
However, the real magic begins when you use the device to transform your walls into portals to another dimension. Seeing Oppy happily jump between worlds as she chases the energy orbs you launch through the portals provides a wonderfully trippy experience that will give you a taste of mixed reality’s potential.
Dungeon Maker
Price: Free
Create a dungeon in your very own living room, complete with devious traps, monsters, and basic quest objectives. Dungeon creation works through a simple drag-and-drop system, with an added feature that allows you to link multiple dungeons together to create a series of stages.
Once your dungeon is complete, challenge yourself or a friend to navigate the treacherous gauntlet. It’s a uniquely fun experience, walking around your house and ducking to avoid the razor-sharp blade swinging from your ceiling or edging past the deadly spike pit which was once the living room rug.
Whilst the number of items, enemies, and quests to choose from is currently very limited, there’s still plenty here to whet your appetite.
PianoVision
Price: Free
Why pay hundreds in piano tuition when you can learn for free with PianoVision? This app combines passthrough and hand tracking to deliver a fun, flexible learning experience.
PianoVision works best with a MIDI keyboard but there is also an Air Piano mode that allows you to play on any flat surface, with no piano required. Start by either connecting your Quest directly to your MIDI keyboard via a cable or through WiFi using a PC and the Desktop App.
Once you select a song, the app overlays colored bars onto your keyboard to highlight the keys you need to play, exactly when you need to play them, to belt out the selected tune. It even has a hand-tracking feature that identifies which finger you should use to press each key.
If you make a mistake, miss a note, or stop playing, the app allows you to catch up by pausing the program until you play the next correct note. PianoVision also includes sheet music, lessons, virtual environments, and the option to upload your own custom songs.
Custom Home Mapper
Price: $7.99
Turn your home into one giant mixed reality playground with Custom Home Mapper. Begin by virtually mapping the walls and furniture of your playspace, which can range in size from a single room to an entire floor.
Then select from one of twelve mini-games. Each game automatically transforms your walls and furniture into virtual obstacles, texturing them in a way that matches the game being played. For example, there’s a golf game that textures the floor grass green and converts your furniture into barriers like brick walls and plants.
Other mini-games include a shooter similar to SUPERHOT and a passthrough version of Snake. The games on offer are good for a quick bit of fun and do a great job of showcasing the variety of innovative ways passthrough can be used.
Hauntify Mixed Reality
Price: $4.99
Bring your nightmares to life by getting evil spirits to chase you around your own home! Hauntify allows you to turn a space of up to 500 by 500 meters into a mixed reality horror experience. It will even let you mark out and play over multiple floors, but caution should be exercised if using passthrough whilst taking stairs.
Once set up, the passthrough visuals dim, making it hard to see your real-life surroundings and forcing you to navigate your house using the virtual torch provided. It’s then your job to collect the relics that randomly spawn around your home in an effort to expel the roaming spirits.
The spirits themselves are equipped with fully dynamic AI to help them navigate your house and hunt you down. Whilst some spirits are more dangerous than others, they are all equally terrifying. The grainy black and white visuals of the Quest passthrough are typically a drawback, but it actually works well with this game by enhancing the grim and forbidding atmosphere.
Saber City
Price: Free
If you have a friend with a Quest and are looking for something fun to play together, why not invite them over for a passthrough duel with Saber City? Duelers need to be located in the same space before they can battle it out using a selection of ranged and melee weapons.
Players can see each other using passthrough, and are equipped with a virtual weapon, shield, and helmet. Dueling is a simple affair of using ranged and melee attacks to take out the physical opponent opposite. Just make sure you don’t get too close to each other in the heat of battle otherwise you’ll end up taking real-life damage.
The Future of Passthrough
It’s still early days for passthrough and mixed reality, with a lot more to come. For example, it was only recently announced that Schell Games will be releasing the short mixed reality experience I Expect You To Die: Home Sweet Home this October 25th for the Meta Quest 2 and Quest Pro platforms.
Resolution Games have also been working on an update for their fantasy VR board game, Demeo. The update will use passthrough to allow players to place the virtual board on a real-world surface. According to the trailer, items like the virtual dice will interact with your home environment, falling off furniture and bouncing off surfaces.
The number and quality of these passthrough experiences are only set to grow, especially with the coming release of the Meta Quest Pro, which features full-color mixed reality capabilities.
Source:
Image Credit: Field Of Vision
National Highways traffic officers play a critical role in keeping the nation’s road network moving. Tasked with delivering all three of the agency’s imperatives (safety, customer service, and delivery), they are on the frontline of the road network, attending to incidents ranging from everyday occurrences such as debris on the road and vehicle breakdowns to major road traffic collisions and significant disruptions.
Needless to say, the average National Highways traffic officer does a lot of driving. The agency patrols over 3,500 miles of our roads day and night, and officers need to be able to respond to anything that might come their way. For that reason, all officers need to know and be comfortable executing an array of professional driving manoeuvres and techniques that are way beyond the requirements of most drivers. From driving safely along the hard shoulder of a motorway to navigating their way through gridlocked traffic and even performing moving roadblocks, the job requires a significant level of individual skill.
This presents a problem. How do National Highways provide opportunities to keep their officers’ skills sharp without causing unnecessary expense to taxpayers or too much disruption to other road users? That is exactly the issue that MXT is working on with National Highways, developing a state-of-the-art mixed reality driver training simulator that can help prepare officers for what they might encounter on patrol.
“The challenge that National Highways have is that most of their most important scenarios are hard to replicate in the real world for various reasons,” explains Josh Thompson, MXT’s Programme Manager who is overseeing the development of the simulator. “You can set up a conventional “roleplay” situation, where you close a road or practice on private land, but this can be costly and limited. To reduce or eliminate real danger, this role play is commonly done in a “sterile” environment without, for example, traffic. Of course, you can teach the theory in a classroom, but that can’t really replicate the stresses of being out on the road. And then you can undertake a combination of the two which National Highways currently does during their traffic officer training.”
TOOLS FOR THE JOB: VIRTUAL REALITY VS MIXED REALITY

Virtual reality has a lot going for it as a technological solution for training. A marriage between the practical realism of on-the-job training and the theoretical foundations of in-person classroom or roleplay learning, VR driving sims can be detailed enough to build skills and flexible enough to provide users with valuable experience in thousands of real-world situations. For an organisation like National Highways, which needs its traffic officers to respond to various challenges in almost any set of atmospheric and environmental conditions, VR offers a compelling mix of authenticity and flexibility.
“The efficacy of Virtual Reality training is undeniable,” explains Josh. “There’s a decent body of research showing how immersive learners are 4X faster to train than classroom-only learners. The beauty of using Virtual Reality and motion platforms (simulator rigs) for driver training like this is that it allows the user to learn by doing, but it also allows the trainer to pass on the benefits of their expertise and experience.”
For MXT, though, the challenge is how to develop a simulation that mixes the attributes of the available technology in a way that can deliver National Highways’ needs most effectively and appropriately. “There are essentially two options for setting up a simulation,” says Josh, framing MXT’s response to the brief. “At first, we looked at a more traditional solution, using flat or curved screens surrounding a user. In that instance, we found that the ability to interact with the simulation was really good because the user could use their hands to move the controls and seamlessly interact with the vehicle. But on the other hand, this option doesn’t have the immersivity or the “presence” of the user in the simulation. In fact, even using a VR headset that completely envelops the user’s field of view has its compromises. It may help the user to feel like they’re present in a virtual world, but their ability to seamlessly interact with the physical equipment, like pedals, is now a bit limited.”
The need for users to suspend disbelief and engage fully with the simulation led to MXT putting Mixed Reality technology at the heart of their approach. Using mixed-reality VARJO XR-3 headsets (https://www.varjo.com/products/xr-3) to merge the real and virtual worlds, the solution literally puts the user in the driver’s seat, allowing them to interact with physical and digital objects in real-time. “What MR allows you to do is split the difference and take the best of both approaches,” adds Josh. “It puts the user in the car and allows them to interact with it as they would normally, but it also allows the trainer to control a simulated scenario, whilst keeping that vital sense of presence that is key to effective experiential training.”
MIXED REALITY ROLE-PLAYING

The power of this hybrid, extended reality approach is that it creates an experience that is as close to the real thing as possible. Study (https://www.tandfonline.com/doi/abs/10.1080/00220671.1986.10885679) after study (https://eric.ed.gov/?id=EJ1150290) has shown that people who learn through active participation achieve better outcomes than learners who stick to more linear forms of teaching. That’s why roleplay forms a vital part of modern workplace training: from the fairly mundane first-aid inductions most offices run for their staff to the more sophisticated simulated court cases many legal firms use to prepare their barristers for the real thing.
“It’s important to remember that Traffic Officers work in high-pressure and high-stress environments and training plays a vital part in helping officers navigate this,” Josh continues, highlighting the critical role immersion plays in MXT’s driving simulator. “We need to ensure the experience is as immersive as possible so that these experiential learning opportunities come into play. We need to recreate the pressures of the job and have the user believe, even subconsciously, that there is a sense of jeopardy in all this.”
To achieve this, MXT is creating a custom experience that combines an MR headset, a motion rig and immersive sound design to put the user right behind the wheel. Once strapped in, a trainee will find themselves in the driver’s seat of a National Highways Traffic Officer Vehicle (TOV), with access to all the controls, switches and buttons found in the real thing. From there, the user can navigate a series of different road layouts and safely practice all the manoeuvres and techniques they’ll need in the real world.
“Realism is the name of the game here, so we want to create an experience that approximates the real situation as closely as we can,” says MXT’s Lead Programmer, Cat Flynn, outlining the main technological hurdles the team is navigating as part of the build. “At the most basic level, if a driver needs to reverse and do that using only their side mirrors, we want them to be able to do that accurately. We need to get well into the weeds on this one and really understand what they need to do, how, and in what circumstances.”
WHERE THE RUBBER MEETS THE ROAD
To ensure that National Highways have a tool that can help prep their officers for almost anything, MXT has sought to make the experience as accessible and flexible as possible. This philosophy runs through the entire project. In terms of hardware, the Varjo XR-3 headsets, for example, ensure that the experience is as detailed and immersive as possible. Not only do they have a great field of view and excellent resolution, but their forward-facing cameras also allow for real-time streaming of “reality” that can be seamlessly blended with CGI. Additionally, MXT is collaborating with Motion Systems (www.motionsystems.eu) to create a state-of-the-art motion platform replicating the conditions inside a National Highways TOV.
“From a development perspective, it’s a challenge to get all elements to work in harmony,” Cat explains. “With a Mixed Reality simulator, you’re effectively both fitting a virtual world into the real one and bringing parts of the real world into the virtual.”
MXT’s in-house software has been designed with flexibility and utility in mind, in order to deliver the right service to the client quickly and effectively. This means, for example, that the team can swiftly build accurate sections of UK road as needed, while the in-built traffic management and environmental tools allow the recreation of most conditions. The result is a training solution that is extremely malleable, able to render large numbers of different scenarios and incorporate various levels of hazard.
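The article does not describe MXT’s internal data model, but the kind of scenario-building flexibility described above can be pictured as a simple configuration structure that a trainer composes: a road section, environmental conditions, traffic, and hazards. Everything in the sketch below (class names, the motorway example) is a hypothetical Python illustration, not MXT’s actual software:

```python
from dataclasses import dataclass, field
from enum import Enum

class Weather(Enum):
    CLEAR = "clear"
    RAIN = "rain"
    FOG = "fog"

class TrafficDensity(Enum):
    LIGHT = 1
    MODERATE = 2
    GRIDLOCK = 3

@dataclass
class RoadSection:
    name: str                 # e.g. a stretch of UK motorway
    lanes: int
    has_hard_shoulder: bool

@dataclass
class Hazard:
    kind: str                 # e.g. "overturned_lorry"
    lane: int                 # which lane the hazard blocks

@dataclass
class Scenario:
    road: RoadSection
    weather: Weather
    traffic: TrafficDensity
    hazards: list = field(default_factory=list)

    def add_hazard(self, kind: str, lane: int) -> None:
        """Drop an extra hazard into the running scenario."""
        self.hazards.append(Hazard(kind, lane))

# A trainer composes a drill: an overturned lorry on a foggy, gridlocked motorway.
m6 = RoadSection("M6 J10-J11", lanes=3, has_hard_shoulder=True)
drill = Scenario(m6, Weather.FOG, TrafficDensity.GRIDLOCK)
drill.add_hazard("overturned_lorry", lane=2)
```

The design point such a structure illustrates is the one Josh describes later in the piece: the trainer, not the developer, combines road, conditions and hazards into new scenarios at will.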
Right at the very heart of the project are the users themselves. While the simulation can deliver realistic scenarios that recreate most conditions, true effectiveness stems from its ability to break down the barriers between trainer teaching and trainee executing. Throughout the process, MXT is working with Traffic Officer Trainers to ensure that they can feed back on and shape the exact functionality they need. “Our ambition is to create a general-purpose tool that allows the trainer to craft their scenarios as much as possible. What this tool will be able to do is effectively replicate real-world situations, such as an overturned lorry or other common scenes”, Josh continues. “These events are not always your average Tuesday for a Traffic Officer, but neither are they completely out of the realms of possibility, and I think the power of this tool is being able to strike a balance between the everyday experience of the user and enough immersion to demonstrate the importance of a job done well.”
Further reading on scenario simulation in VR for training purposes is available in the below links. If you would like to know more about how VR simulation can help your business, please get in touch.
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0204494
https://www.frontiersin.org/articles/10.3389/frvir.2021.645153/full
Source:
The difference between virtual reality (VR), augmented reality (AR) and mixed reality (MR) can be confusing. Many people hear the letters and form their own assumptions of how they relate to each other. Some may even believe they are interchangeable terms: They’re not. AR, VR and MR have distinct differences that, when harnessed, can lead to more effective training.
Let’s take a look at the three technologies in more detail. By learning about the benefits and features of each, you’ll be able to determine the best immersive learning technology for your company.
What is VR?
When considering immersive learning technologies and tools, VR is a good place to begin. When you use VR, you isolate yourself from the physical world and enter a simulated one. The technology has been around since the 1950s, and while it was first adopted by the gaming community, we’re now seeing the myriad applications of VR in practice. VR is changing major industries, such as tourism, real estate, health care and others.
Equipment
VR requires a headset along with hand controllers. The headset closes you off from the physical space. You can now walk and experience a world that is completely simulated or move around a real location recorded in 360-degree video. The controllers simulate hands in the virtual world.
In addition to a headset and hand controllers, there is additional equipment to further immerse you in the virtual space. Some equipment is universal while other pieces are specific to the application and are not always necessary.
Applications in Training
Training opportunities in VR are plentiful. Training programs that put employees in situations mimicking their real environment make it easy for learners to practice skills safely, in a hands-on way. For instance:
- In customer service training, you can replicate sales floor conditions, complete with agitated customers and crowds eager to take advantage of a sale.
- Industries that are inherently dangerous, such as manufacturing, also use VR. Technicians working in manufacturing plants can practice high-risk procedures and processes in a simulated environment before executing them in the real one.
- Soft skills training in VR has also proven successful. You can bring a remote workforce together in one shared virtual space, or run simulations for individuals separately.
What Is AR?
Where VR immerses you in the digital world, AR is about presenting digital objects or information within the physical space. For instance, today, you’ll find AR technology in head-up displays of select cars.
Equipment
One of the most appealing aspects of using AR for training is how little equipment you need. AR primarily uses the built-in cameras in phones and tablets. Then, it’s just a matter of building (or adopting) a mobile learning app to support it. Compared to the list of equipment needed for VR, training with AR has a lower barrier to entry.
Applications in Training
AR works best for training focused on knowledge sharing. Because AR relies on certain triggers, it works like a teaching aid. In the context of manufacturing training, employees can place their screen over a part and trigger detailed information on what it is and how it is used within the manufacturing process.
Another application for AR in training involves getting around a large warehouse. If you see a product is located in a specific section, you can use your phone and see digital markers for each section of the warehouse. You can then go straight to where your product is located.
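At its core, the trigger-based behaviour described in both examples is a lookup: a recognised marker maps to the overlay content shown on the trainee’s screen. A minimal sketch of that step follows; the marker IDs and content are invented for illustration, and a real system would use a computer-vision library to recognise the markers in the camera feed:

```python
# Maps a recognised AR trigger (e.g. a fiducial marker ID detected in the
# device camera feed) to the overlay content shown to the trainee.
OVERLAYS = {
    "marker-17": {
        "label": "Hydraulic valve HV-200",
        "info": "Regulates press-line pressure; see torque procedure 4.2.",
    },
    "marker-42": {
        "label": "Warehouse section C",
        "info": "Seasonal stock; aisle entrance to your left.",
    },
}

def overlay_for(trigger_id: str) -> str:
    """Return the text a phone or headset would overlay for a trigger."""
    content = OVERLAYS.get(trigger_id)
    if content is None:
        # Unknown markers get a neutral fallback rather than an error.
        return "No training content registered for this marker."
    return f"{content['label']}: {content['info']}"

print(overlay_for("marker-42"))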
What Is MR?
Compared to VR, MR is in its infancy. And based on its behavior, some people consider it to be an advanced version of AR — “AR 2.0,” so to speak.
What many people don’t realize is that MR can also be considered “VR 2.0.” MR combines the digital and physical worlds, like VR, but makes interaction between the two possible (an ability that VR doesn’t currently support).
- MR in the physical world: In MR, you navigate the physical world while using a headset or lens. A headset allows you to remain in the physical world but also shows you simulated objects you can reach out to and interact with, as if they were actually there.
- MR in the virtual world: When you are in the simulated environment, objects in your real space (like a sofa or a chair) are interpreted by sensors and represented in your virtual space. For example, in a spy game, that chair in your dining room can become a statue for you to hide behind.
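The second bullet describes a substitution step: sensed real-world geometry is replaced by themed virtual stand-ins. A toy sketch of that idea follows; the object classes and stand-ins are invented for illustration, and real headsets report far richer scene-understanding data:

```python
def virtual_stand_in(real_object_class: str, theme: str = "spy") -> str:
    """Pick a themed virtual replacement for a sensed real-world object."""
    substitutions = {
        # In a spy game, furniture becomes cover the player can hide behind.
        "spy": {
            "chair": "stone statue",
            "sofa": "sandbag wall",
            "table": "crate stack",
        },
    }
    themed = substitutions.get(theme, {})
    # Unrecognised objects are kept as neutral obstacles, so the player
    # still cannot walk through the real furniture they represent.
    return themed.get(real_object_class, "generic obstacle")

# The dining-room chair from the example above becomes in-game cover.
print(virtual_stand_in("chair"))
```

The key design property is that every real object keeps a virtual counterpart of roughly the same footprint, so the physical and virtual worlds stay consistent for the user.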
Equipment
When it comes to equipment, MR currently uses some of the most advanced headsets available. The Microsoft HoloLens is among the leaders in this technology, but this will likely be a heavily contested market. Google and Apple are both expected to release MR headsets, and there are bound to be others hitting the market as well.
Applications in Training
MR can revolutionize in situ training (IST). Rather than take time to train before you get on the job, you can receive guidance as you work. Again, considering the manufacturing industry, MR allows learners to see and interact with 3D diagrams of the components they are building so they can test the model before assembling the real thing.
In medicine, MR has been used as a collaboration and training tool. Using MR, remote surgeons can share models and assist the surgeon in the room with the patient, helping to minimize mistakes. Experienced doctors can assist novice ones using MR, as well as those with limited access to resources.
What Is XR?
The final abbreviation to discuss is XR, which stands for extended reality. This is an umbrella term used to cover the three technologies we’ve discussed above. While we currently separate them, it’s easy to imagine a future where human beings navigate between the three along with physical reality. In the future, we may very well live in a perpetual state of extended reality.
In Conclusion
The world now consists of several realities. You can immerse yourself in a completely virtual world or remain in physical reality with a blend of digital objects. Now that you know the difference between VR, AR and MR, you are better prepared to determine which, if any, of these technologies can support training efforts in your organization.
Source:
The concept of a digital human in cyberspace, also known as an avatar, controlled by an operator wearing a virtual reality headset, was predicted a long time ago. It originated from science fiction books at the end of the 1960s and became especially popular at the dawn of personal computing during the 1980s and 1990s.
Thanks to the development of extended reality technologies, we are now approaching a new era of digital presence and immersive interactions in the metaverse. This has made the topic of human representation in cyberspace more important than ever before.
Use of virtual reality avatars is growing rapidly
Various companies, content creators, and evangelists in the extended reality industry are implementing early prototypes of avatars in their digital content with the goal of enabling better interactions between humans in the metaverse.
In a relatively short period of time, these metahuman avatars have entered different fields such as virtual production, AAA games, and VR blogging, and more industries are expected to start using them soon.
With the rise of applications such as VRChat, YouTube real-time VR blogging, or specialized multi-user design review applications, the demand for realistic avatars is greater than ever before. A great example of VR blogging using real-time metahuman avatars is the Xanadu YouTube show.
But what kind of characteristics and capabilities enable you to create a great, realistic-looking avatar?
Characteristics of a great virtual reality avatar
A great avatar is as realistic-looking as possible. The greatest benefit of realistic avatars is that they enable far deeper, more immersive VR experiences than the (currently) more widely used, cartoonish avatars.
Being able to portray small details such as the kinematics of emotions (how a person’s face moves when portraying a specific emotion), or imperfections that mimic a real person’s facial anatomy, creates a feeling of another person’s presence that is very close to real life. This is something that the simplified, cartoonish characters can’t accomplish, even when used in a 3D environment.
The realistic-looking avatars provide a more personal, intimate, and hyper-realistic experience when having a virtual conversation or other interactions in the metaverse, whether you are using social applications such as VRChat, or conducting professional design reviews with a team from around the world.
It is important to remember that the journey of the virtual reality avatar industry is just beginning. It goes without saying that there are still a lot of technical limitations that will need to be resolved for virtual avatars to feel completely realistic, but some of the early developments do look extremely promising.

Practical example: Epic Games MetaHuman Creator and Unreal Engine 5
The Epic Games MetaHuman avatars, which are created using Unreal Engine 5, are a perfect example of proper implementation of a digital human concept. The high-level visual quality they provide is quite unlike anything we have seen before. The power of this technology was shown during the launch of Unreal Engine 5 in The Matrix Awakens experience.
Epic Games has also provided a very intuitive way of creating MetaHumans via their cloud-based MetaHuman Creator. Using MetaHuman Creator via the cloud is as easy and user-friendly as creating your character in a computer game. The creation tool runs smoothly in the cloud in real-time and supports features such as ray tracing, which provides a high-fidelity visual representation of the created character. The character can then be exported into Unreal Engine 5 for further adjustments.
The MetaHuman Creator supports all the new features of Unreal Engine 5 such as Control Rig, space switching, the pose tool, the facial pose library, the IK Retargeter, and IK Rig, allowing you to use custom motion capture animations and real-time face capture via the Live Link plugin.
You can find more information about these features in Unreal Engine 5 documentation linked below.
- Control Rig in Unreal Engine | Unreal Engine 5.0 Documentation
- Re-parent Control Rig Controls in real-time in Unreal Engine. | Unreal Engine 5.0 Documentation
- Animation Editor Mode in Unreal Engine | Unreal Engine 5.0 Documentation
- MetaHumans Facial Pose Library Quick Start | MetaHuman Creator Documentation (unrealengine.com)
- IK Rig Animation Retargeting in Unreal Engine | Unreal Engine 5.0 Documentation
- IK Rig in Unreal Engine | Unreal Engine 5.0 Documentation
How Varjo enables deeper interactions in the metaverse with photorealistic avatars
Here at Varjo, we have developed our own VR/XR demo using the latest features of MetaHuman Creator. Thanks to Unreal Engine 5, it is possible to create extremely lifelike MetaHuman characters easily.
By using Varjo’s top-tier headset, Varjo XR-3, which features high-fidelity, color, low-latency mixed reality video pass-through, we are able to bring the sensation of a virtual character’s presence in an actual room to the next level. With the XR-3, it is now possible to see all the small details of a virtual character such as cloth fabric, eyes, skin, and hair, and even experience facial animations that are captured in real-time via Unreal Engine 5’s Live Link capturing system. The virtual avatars can even be streamed directly from Varjo Reality Cloud.
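Conceptually, real-time face capture of this kind streams per-frame expression coefficients that are remapped onto the avatar’s facial rig. The sketch below illustrates that retargeting step in simplified form; the channel names, scales, and mapping table are invented for illustration and are not Unreal Engine’s actual Live Link data format:

```python
# One captured frame: normalised expression coefficients in [0, 1],
# similar in spirit to blend-shape weights from a face-capture device.
captured = {"jawOpen": 0.35, "mouthSmileLeft": 0.8, "browInnerUp": 0.1}

# The avatar's rig may name and scale its targets differently, so each
# capture channel is remapped and optionally damped before being applied.
RIG_MAP = {
    "jawOpen": ("Jaw_Open", 1.0),
    "mouthSmileLeft": ("Smile_L", 0.9),     # slightly damped smile
    "browInnerUp": ("Brow_Raise_In", 1.2),  # amplified brow movement
}

def retarget(frame: dict) -> dict:
    """Map captured coefficients onto the avatar rig, clamped to [0, 1]."""
    out = {}
    for channel, value in frame.items():
        if channel in RIG_MAP:
            target, scale = RIG_MAP[channel]
            out[target] = min(1.0, max(0.0, value * scale))
    return out

print(retarget(captured))
```

Running this mapping every frame, driven by a live camera feed, is what lets the avatar’s face move in step with the performer’s.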
This is an ongoing, work-in-progress experiment to see where avatar technology may lead us in the future. We are still actively developing and adding new features to the experiment. In the future, capabilities such as supporting a full-body motion tracking system may give us a unique opportunity to virtually embody the avatar and take full control over it – while everything is rendered in real-time in mixed reality, with human eye resolution provided by the Varjo XR-3 headset.
This marks the first step into the cyberspace of the metaverse in extremely high resolution using hyper-realistic MetaHuman avatars – exactly the way it was predicted in those cyberpunk sci-fi books of the past decades.
Creating digital twins of real-life people
Speaking of recent developments, it is worth mentioning that Epic Games has released a new, updated version of the MetaHuman plugin. The new version includes the Mesh to MetaHuman feature that allows you to upload a 3D-scanned mesh of a real face into MetaHuman Creator. This enables us to create more accurate and realistic digital twins than ever before.
This very feature was used in a magnificent way by researchers to reconstruct the face of a 10,000-year-old shaman.
The future of MetaHumans and photorealistic avatars
As you can see in the examples in this post, the metahuman avatar technology is developing rapidly. As people start taking their first steps in different metaverses, realistic and constantly improving avatars will be key elements in making these environments feel more real and full of meaningful interaction.
Even though the journey is just beginning, the future already looks bright for human interaction in the metaverse.
Source: