Google’s VPS (visual positioning system) walking directions went live yesterday under the name Live View. Residing in the Google Maps app for iOS and Android, it provides a 3D spatial interface for walking directions in urban areas.
Live View was already available on newer Pixel phones and to a small set of Local Guides who were essentially beta testing the product. Now it’s ready for full release, rolling out over the next few weeks to ARCore- and ARKit-compatible devices in areas where Street View is active.
For those unfamiliar, Live View scans surroundings, which are then matched against Google’s Street View image database. Once a device is localized in this manner, spatially accurate directional signals are overlaid. It’s meant to be a more intuitive way to navigate city streets.
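At a very high level, the flow described above is a matching problem: reduce the live camera view to some visual signature, then find the closest signature in a geotagged reference database. The toy sketch below illustrates that idea with nearest-neighbor matching on short feature vectors; the descriptor format, the `STREET_VIEW_DB` contents, and the `localize` helper are all hypothetical simplifications for illustration, not Google’s actual pipeline.

```python
import math

# Hypothetical: each Street View reference is a coarse visual descriptor
# (here just a short feature vector) tagged with a lat/lng position.
STREET_VIEW_DB = [
    ((40.7580, -73.9855), [0.9, 0.1, 0.4]),  # Times Square facade
    ((40.7527, -73.9772), [0.2, 0.8, 0.5]),  # Grand Central facade
    ((40.7484, -73.9857), [0.6, 0.6, 0.1]),  # Empire State facade
]

def descriptor_distance(a, b):
    """Euclidean distance between two visual descriptors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def localize(camera_descriptor):
    """Match a live camera frame's descriptor against the reference
    database and return the position of the closest visual match."""
    position, _ = min(
        STREET_VIEW_DB,
        key=lambda entry: descriptor_distance(entry[1], camera_descriptor),
    )
    return position

# A frame whose descriptor looks most like the Grand Central reference:
print(localize([0.25, 0.75, 0.5]))  # -> (40.7527, -73.9772)
```

The real system matches rich image features against billions of Street View images, but the structure is the same: the match, not the GPS signal, anchors the device in space.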
Google created Live View to address some of the pain points of that little blue dot in the overhead 2D interface of Google Maps. Those issues are most evident in urban areas, where it can be off by blocks. So Google realized it had an ace up its sleeve for a clever solution: Street View.
The reason GPS falters in urban canyons is that satellite signals bounce off buildings, degrading the computation of spatial positioning. So along with inertial measurement, VPS can make that localization a lot more precise through visual recognition against Street View imagery.
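One way to picture why the visual fix helps: if GPS and the visual match each yield a position estimate with a known uncertainty, fusing them with inverse-variance weighting pulls the combined estimate toward the more reliable (visual) fix. This is a textbook sensor-fusion sketch, not Google’s actual algorithm, and the uncertainty figures are made up for illustration.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two scalar position
    estimates (e.g. meters east of a landmark).
    Lower variance -> more weight in the combined estimate."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# GPS in an urban canyon: 50 m east, but +/- 30 m of multipath error.
# Visual fix from Street View matching: 12 m east, +/- 1 m.
position, variance = fuse(50.0, 30.0 ** 2, 12.0, 1.0 ** 2)
print(position, variance)
```

The fused position lands within centimeters of the visual fix, which is exactly the behavior a viewfinder overlay needs.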
That precise localization is necessary in Live View because overlaid directional arrows through your phone’s viewfinder need meter-level accuracy. The overhead Google Maps blue dot is often forgiven (begrudgingly) when it’s off by a half-block. But that won’t fly with a 3D visual interface.
Beyond the clever visual hack for device localization, the next challenge was interface design for a new form of mobile interaction that has no existing playbook. As Google presented at the I/O conference (video below), this required a rigorous design and prototyping process.
Moment of Truth
But the real testing will be in the wild. As a potentially high-frequency utility, VPS ticks all the boxes we’ve theorized for AR killer apps. But the moment of truth will come as mainstream consumers vote with their fingers on whether a 3D visual interface is something they want.
It will likely gain traction first with tech-forward city dwellers… which aligns with its urban use case. Google will accelerate this through incubation in Maps, just like Google Lens’ incubation in search. The good news: if adopted at scale, spatial UX comfort levels could extend to AR in general.
One notable lesson is that Live View is a close cousin of AR, but Google has been careful not to call it “AR,” a term that still sounds too nascent and techy. Notice how Pokemon Go and Snapchat lenses (the most popular forms of AR to date) have done the same. They hardly ever say “AR.”
Another wild card will be how/if Google monetizes the feature. As we’ve examined (video below), tech giants’ AR motivations usually trace back to their core businesses. Live View could be a natural fit for sponsored local discovery, but the first step is to test the waters for user behavior.
We’ll be watching closely for that and, like Google Lens, stress-testing the product in the wild.
Disclosure: AR Insider has no financial stake in the companies mentioned in this post, nor received payment for its production. Disclosure and ethics policy can be seen here.