Augmented and mixed reality (AR/MR) tools are no longer science fiction. They’re here, in our hands (or on our faces), changing how we work. In the manufacturing and product design space, I’ve seen firsthand how these technologies are creating new ways of prototyping, collaborating, and even performing quality assurance. But like any transformation, the road is far from smooth.
I’ve seen both the excitement and hesitation AR still inspires. So when someone reached out to ask how AR is really evolving in the field, I thought it was time to take a step back and reflect on where we are, what’s working, and where this might all be headed.
Here’s a breakdown of what I’ve observed—across hardware, software, adoption challenges, and the near future of spatial collaboration in design and manufacturing.
The hardware: powerful, promising, and still awkward
Right now, the most usable AR and MR devices are passthrough headsets like the Meta Quest, Apple Vision Pro, and upcoming Android-based options. These devices offer solid spatial computing capabilities, blending digital elements into real-world space with enough stability for collaborative work.
But there’s a catch.
They’re big. Bulky. Heavy. And while they offer mixed reality capabilities, they were built with VR at their core, often running full operating systems onboard. That means users get power and flexibility but at the cost of comfort and simplicity.
As a result, adoption is still limited. Designers and engineers may love the idea of using these devices, but doing so for extended periods? That’s where the hesitation starts.
It’s clear the industry is aiming for see-through, glasses-style devices, i.e., hardware that offers the same capabilities without the helmet-like form factor. But the tech isn’t quite there yet. Over the next five years, passthrough headsets will likely remain the most capable and practical devices for industrial MR applications.
The software: XR is leading, AR is catching up
While hardware inches forward, software has been moving faster, especially on the VR and MR side of the spectrum. 3D design and modeling tools such as Gravity Sketch and Shapr3D have gained traction among process specialists and engineers who want to sketch in 3D space, view their work at scale, and iterate without the friction of switching contexts.
One of the strongest use cases I’ve seen is multi-user collaboration in XR. Remote teams—sometimes spread across continents—can meet in a shared digital space, review models together, and exchange feedback in real time. The immersive environment really does change how people think and communicate spatial ideas.
What’s more, not everyone in those sessions needs a headset. With a 2D viewer or participant mode, teams can bring in product managers, engineers, and even execs—no need for everyone to suit up. It’s a hybrid workflow that mirrors how most teams work now.
AR, by contrast, still has physical limitations. It’s typically limited to same-location collaboration (everyone in the same room with spatial anchors), which makes it a bit less useful for early-stage concepting. That said, I’ve seen plenty of design managers use AR mode to view final models in context—and it’s a powerful way to validate decisions before any physical prototyping happens.
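To make “spatial anchors” a bit more concrete, here’s a minimal sketch of the placement primitive that in-context AR review builds on, using the standard WebXR Hit Test and Anchors modules. It assumes a WebXR-capable headset or browser and the usual WebXR TypeScript typings; drawModelAt is a hypothetical rendering helper, not any specific product’s API.

```typescript
// Hypothetical renderer call: draws the review model at a given pose each frame.
declare function drawModelAt(transform: XRRigidTransform): void;

async function startContextReview(): Promise<void> {
  // Request an AR session with hit testing and anchors enabled.
  const session = await navigator.xr!.requestSession("immersive-ar", {
    requiredFeatures: ["hit-test", "anchors", "local-floor"],
  });
  const refSpace = await session.requestReferenceSpace("local-floor");
  const viewerSpace = await session.requestReferenceSpace("viewer");
  const hitTestSource = await session.requestHitTestSource!({ space: viewerSpace });

  let anchor: XRAnchor | undefined;

  const onFrame: XRFrameRequestCallback = (_time, frame) => {
    if (!anchor) {
      // Find the real-world surface the user is looking at and pin the model there.
      const hits = frame.getHitTestResults(hitTestSource);
      if (hits.length > 0) {
        hits[0].createAnchor?.()?.then((a) => (anchor = a));
      }
    } else {
      // The anchor keeps the model locked to the physical room as tracking updates.
      const pose = frame.getPose(anchor.anchorSpace, refSpace);
      if (pose) drawModelAt(pose.transform);
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```

Shared, multi-user anchors add a synchronization layer on top of this, but the core idea is the same: the model is pinned to the physical room, not to the headset.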
So why do some teams still hesitate?
Despite their clear potential, AR and MR still face a few universal obstacles, especially among teams just getting started.
1. Comfort and usability
A lot of headsets are still uncomfortable to wear for more than 20–30 minutes. That alone can be a dealbreaker for folks who haven’t fully bought in.
2. Device friction
Most casual users treat headsets like gaming consoles: they’re tucked away and used occasionally. This means every session starts with a software update, a dead battery, or both. For regular use in business workflows, that friction is fatal.
The key to adoption is minimizing setup time. A headset should always be charged and up to date, because the minute there’s a blocker, you lose your team’s buy-in.
3. Social and physical isolation
Headsets are immersive by design. But that also makes them socially isolating and disorienting for some users. Motion sickness, physical disconnection, and the “weirdness factor” are all real. Adoption only sticks when the value clearly outweighs the discomfort.
Where AR delivers real value today
The most successful use cases I’ve seen involve contextual visualization—placing digital objects in physical space to better understand scale, placement, and relationships.
A great example: showing a designed object exactly where it will be used. In shipbuilding, AEC, and manufacturing, teams overlay 3D models on real environments to check for clashes, misalignment, or fit issues.
That’s a game-changer for quality control and design validation—especially when physical prototyping is expensive or impractical.
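To illustrate the kind of check that overlay enables, here’s a minimal sketch assuming both the scanned environment and the placed CAD model have been reduced to axis-aligned bounding boxes in a shared coordinate frame. Real clash detection works on full meshes; this coarse box test only shows the idea, and the pump/pipe values are made up for the example.

```typescript
// Coarse clash check between a placed CAD model and scanned room geometry,
// using axis-aligned bounding boxes (AABBs) as a first-pass filter.

interface AABB {
  min: [number, number, number];
  max: [number, number, number];
}

// Two boxes clash if their extents overlap on every axis.
function clashes(a: AABB, b: AABB): boolean {
  return [0, 1, 2].every((i) => a.min[i] <= b.max[i] && a.max[i] >= b.min[i]);
}

// Example: a pump model that overlaps a scanned pipe run by ~0.1 m on one axis.
const pumpModel: AABB = { min: [0, 0, 0], max: [1.2, 0.8, 1.5] };
const scannedPipe: AABB = { min: [1.1, 0.2, 0.3], max: [1.3, 0.4, 3.0] };

console.log(clashes(pumpModel, scannedPipe)); // true: flag for design review
```

In practice this is exactly the feedback loop AR shortens: instead of waiting for a physical mock-up to reveal the interference, the team sees it standing in the space where the part will live.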
For global teams, though, there’s a catch: getting everyone in the same room isn’t always possible. One workaround I’m seeing more often is scanning real environments using Gaussian splats or photogrammetry, and bringing remote collaborators into that space via VR. That hybrid approach—real-world context plus virtual presence—is showing real promise.
Why some AR tools shift back to VR
Here’s something I’ve noticed a lot: companies often start with AR, but shift toward VR over time.
Take AVEVA, for example. Their team built AR-based collaboration tools for design review. But over time, they found placing a model on a physical table was more limiting than helpful. People bumped into each other, lighting caused issues, and tracking wasn’t always reliable.
Eventually, they migrated the workflow to VR—where users had more freedom to interact, and the experience was more consistent. That flexibility really mattered.
What’s next: scanning, streaming, and spatial presence
I think the real breakthrough will come when devices can scan and stream environments in real time.
Imagine being on a factory floor and inviting a teammate to “beam in” from the other side of the world—walking around with you, inspecting parts, pointing out issues in real space. That’s the kind of experience we’re heading toward.
As SLAM (simultaneous localization and mapping) tech improves, and networking catches up, we’ll see a wave of new tools that make remote spatial collaboration feel as natural as a video call—but way more powerful.
Final thoughts: AR isn’t a gimmick. It just needs to fit
Here’s what I keep coming back to: AR and MR aren’t silver bullets. They’re not always the right tool for the job. But when they do fit—especially in manufacturing, prototyping, and design validation—they can be transformative.
If you’re considering AR in your workflows, ask yourself: What’s something we can’t do effectively on a screen?
If AR makes that easier, faster, or more accurate—it’s probably worth it.
The headsets may not be perfect. But the need for better spatial communication is very real. And finally, the tools are starting to catch up.