The LiDAR scanner built into some mobile devices was a peculiar kind of innovation. In 2020, Apple presented it as a crowning feature of iPads and iPhones, only to have it ignored by most mainstream users, even as the pro-consumer crowd applauded it.
What is the LiDAR scanner? A revolutionary advancement or just another example of snazzy technology that has very little relevance for most users?
Unlike Apple’s outsized enthusiasm for the usual iPad and iPhone improvements we see each year—incremental enhancements that make the gadgets faster, lighter, bigger—the company’s excitement about the LiDAR feature may have merit.
LiDAR expands the capabilities of the device in several ways. The most advertised improvement is in mobile photography. But since mobile photography depends on a myriad of technologies working together to create the final image, the benefits of LiDAR may be overshadowed by improvements in other areas, especially computational photography. Closer attention should instead be paid to the capabilities that LiDAR brings to augmented reality (AR) and 3D scanning in both industrial and pro-consumer applications.
The surface-scanning abilities of LiDAR tangibly improve the AR experience. Apple already offers superior AR; on iOS, AR is smoother, snappier, and more precise than it is on Android-based devices. The difference is especially pronounced when one compares the iPhone to the devices of its chief rival, Samsung. Despite Samsung’s similar or even superior hardware specs, Samsung devices produce sluggish AR visuals that are marred by freezing and skipping frames. The situation improves when using clean Android versions, like those installed on Google Pixel devices. Here too, however, the difference between iOS-based and Android-based AR remains highly noticeable, and in more than one area.
Figure: Side-by-side AR experience
Figure: No-light spatial tracking capabilities of iOS and Android devices
LiDAR-supported surface scanning is thus helping iOS increase its lead over Android. There is no question that Android’s camera-based surface detection is a technological marvel. But it cannot match the accuracy, stability, and performance of devices with built-in LiDAR. AR on Android devices regularly produces false positives in surface detection, leading to abrupt visual changes and unpredictable results that frustrate users.
Another function improved by LiDAR is 3D scanning. Adding LiDAR to the iPad and iPhone has given rise to several new families of apps, including space-scanning and asset-detection apps, that would not have been possible on devices that rely solely on optical capabilities—as the vast majority of Android devices do.
Gains are difficult to counter
It is difficult to emulate or counter the advantages of LiDAR-equipped iPhones and iPads. Nothing on Google’s AR roadmap suggests that equally compelling features are coming anytime soon. The fragmentary, homegrown solutions offered by third-party vendors to address the inadequacies of the Android platform also fall short of Apple’s comprehensive approach.
Moreover, according to Digitimes, Apple has signed an exclusive deal with Sony to provide LiDAR sensors for Apple devices until 2023, thereby apparently securing for itself a lock on the most effective LiDAR sensors.
Other manufacturers simply lack access to the same quality of sensors. Although the time-of-flight (ToF) sensors used by flagship Samsung and Huawei phones, and by a handful of other phones, do address some of the challenges—for example, by providing much more accurate surface detection—they certainly do not play in the same league as Apple’s LiDAR.
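Both LiDAR and ToF sensors recover depth the same basic way: they emit light and time how long it takes to bounce back off a surface. A minimal sketch of that calculation (illustrative only; the constant and function names are mine, not any vendor’s API):

```python
# Illustration of the time-of-flight principle shared by LiDAR and
# ToF sensors: depth is derived from the round-trip travel time of
# emitted light. Names here are hypothetical, for illustration only.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a surface given the photon round-trip time.

    The light travels to the surface and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A surface about one metre away returns light after ~6.67 nanoseconds.
print(round(tof_distance_m(6.67e-9), 3))  # → 1.0
```

The nanosecond timescales involved are why sensor quality matters so much here: small timing errors translate directly into depth errors, which is one reason the better sensors produce markedly more stable surface detection.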
Current ToF technology is not on a par with LiDAR. Samsung is reportedly working on an ISOCELL Vizion 33D sensor, but so far there is no firm release date, and the Galaxy S21 family of devices does not include it. Even in the best-case scenario, the Vizion 33D sensor will reach mainstream Samsung devices only in 2022. Given Samsung’s patchy commitment to LiDAR and ToF technology, Apple has at least two years of lead time to further improve its own integration of LiDAR technology.
Lead times mean market share
At the moment, usage of LiDAR is somewhat invisible: it’s limited mostly to professional and pro-consumer applications. The year 2020 was one of experimentation. Startups and in-house development teams worked on proofs of concept and beta releases to identify and test use cases. Things will be different in 2021.
In 2021, companies are beginning to turn the experimental projects of 2020 into production apps and prototypes. The technology will now graduate from being a technology for enthusiasts to being a technology for early adopters.
The mark such applications have made is still very small, and it is easy to dismiss their impact as irrelevant to the bigger picture of competition between platforms and devices. But I would argue that what has been accomplished so far has great implications for the future.
2021 and beyond
In 2021, we expect a wave of case studies demonstrating the tangible benefits of LiDAR technology. As organizations proceed with their hardware cycles, they will be inclined to replace outgoing devices with LiDAR-enabled iPhones and iPads in order to utilize the new functionality that is missing from Android.
The average lifespan of an enterprise smartphone is two and a half years; as those replacement cycles come due, the lagging capabilities of the Android operating system will lead more organizations to standardize around iOS.
Missing the boat
Mobile technology is multifaceted. An organization’s choice of platform is determined by many factors, not simply by the utility of a single amazing app that one system can run and another cannot.
The Apple platform has many disadvantages, ranging from Apple’s authoritarian approach to its ecosystem to inadequate support from third-party hardware manufacturers. Yet Apple continues to offer unique and compelling functionality that outweighs the disadvantages of standardizing around iOS.
The early successes of LiDAR and the growing interest in using it in industrial settings suggest that the Samsung and Android ecosystems have missed the boat. The feeble attempt to include a semi-useful ToF sensor in new models has led, in Samsung’s words, to a lack of “killer contents.” This deficiency has in turn driven Samsung, Google, and others to misread the direction of the market.
In the Android community, the initial thinking about LiDAR and ToF may have focused primarily on the biggest market: the consumer market, and mainly consumer photography. In this context, there is often more than one way to achieve the same goal. As Google Pixel devices have demonstrated, advances in computational photography can outpace better cameras and extra sensors, and smart computation of AR visuals can compensate for weaker hardware. Yet in some areas, no miracle solution can compensate for inadequate hardware. By taking advanced LiDAR capabilities mainstream, Apple appears to have positioned itself to dominate both the enterprise market and the pro-consumer market for years to come.