Apple and Google continue to wage a feature arms race in mapping. Here, Google is the incumbent with dominant market share, while Apple is motivated to catch up through rapid-fire feature rollouts. Apple also enjoys a home-field advantage, at least on iPhones, and is investing considerably to gain ground on Google.
One front in this competition for mapping market share is the emerging tech that’s increasingly infused in the mapping UX. This includes immersive 3D navigation and wayfinding, as well as the related ability to visually identify places along those routes (think: storefronts and business details).
Back to relative positioning and advantages: Google can offer these immersive features because of its vast Street View database. It can “localize” a given device by matching what the device’s camera sees against Street View imagery, similar to the way autonomous vehicles use LiDAR to “see” the road.
Once the device is localized, the routing software can do its thing and overlay wayfinding arrows to get you to your destination. This has proven useful for urban walking directions, orienting yourself after emerging from an underground transit station, or finding your way to an airport gate.
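For a sense of how that localize-then-overlay pipeline looks in practice, here’s a minimal sketch using ARKit’s public geotracking API, which refines the device pose by matching the camera feed against Apple’s street-level imagery. The class name, destination wiring, and rendering hand-off are illustrative assumptions, not either company’s internal implementation.

```swift
import ARKit
import CoreLocation

// Minimal sketch: visually localize the device, then geo-anchor a
// destination so a renderer (RealityKit/SceneKit) can draw arrows at it.
final class WayfindingSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start(destination: CLLocationCoordinate2D) {
        // Geotracking only works where Apple has localization imagery.
        ARGeoTrackingConfiguration.checkAvailability { available, error in
            guard available else {
                print("Geotracking unavailable:", error?.localizedDescription ?? "n/a")
                return
            }
            self.session.delegate = self
            self.session.run(ARGeoTrackingConfiguration())

            // The anchor resolves to a world-space transform once localized;
            // that transform is where a wayfinding arrow would be drawn.
            self.session.add(anchor: ARGeoAnchor(coordinate: destination))
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // .localized means the camera view has been matched against
        // street-level imagery and the pose is now geo-accurate.
        if frame.geoTrackingStatus?.state == .localized {
            // Overlay routing arrows relative to the localized pose here.
        }
    }
}
```

Google’s equivalent on Android is the ARCore Geospatial API, which localizes against Street View imagery in the same spirit.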
Look Around
As for Apple, it doesn’t have Street View, but it’s starting to catch up with a competitive data source. Its Look Around feature is similar to Street View and is the result of Apple’s effort to rebuild its mapping capabilities on first-party data. That includes sending Street View-like cars and cameras out on the roads.
This is all part of the “investing considerably” noted above. But even with an Apple-sized cash balance, it has a long way to go to catch up with Google, given Street View’s longer tenure and deeper data. So how will it close the gap? One way just came to light: collecting data while people use Apple Maps.
In other words, as users hold up their phones for 3D wayfinding, the camera ingests data that improves Apple’s 3D maps. The collected data takes the form of a point cloud: much like LiDAR scans (which some iPhones can produce), it renders a given urban landscape as a machine-readable 3D map of building contours.
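To make “point cloud” concrete, here’s a hedged sketch of how an app could turn LiDAR scene depth into 3D points using public ARKit calls. Apple hasn’t published its actual collection pipeline; the class name and sampling stride below are assumptions for illustration.

```swift
import ARKit
import CoreVideo
import simd

// Sketch: accumulate a world-space point cloud from LiDAR scene depth.
final class PointCloudCollector: NSObject, ARSessionDelegate {
    let session = ARSession()
    private(set) var points: [simd_float3] = []

    func start() {
        // Scene depth requires a LiDAR-equipped iPhone/iPad.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowStride = CVPixelBufferGetBytesPerRow(depthMap) / MemoryLayout<Float32>.stride
        let depths = CVPixelBufferGetBaseAddress(depthMap)!
            .assumingMemoryBound(to: Float32.self)

        // Intrinsics are defined for the full-resolution camera image;
        // rescale them to the (smaller) depth map.
        var K = frame.camera.intrinsics
        let scale = Float(width) / Float(frame.camera.imageResolution.width)
        K[0][0] *= scale; K[1][1] *= scale; K[2][0] *= scale; K[2][1] *= scale
        let fx = K[0][0], fy = K[1][1], cx = K[2][0], cy = K[2][1]

        // Sparse sample: unproject every 8th depth pixel into camera space
        // (ARKit camera looks down -z, +y up), then into world space.
        for y in stride(from: 0, to: height, by: 8) {
            for x in stride(from: 0, to: width, by: 8) {
                let z = depths[y * rowStride + x]
                guard z > 0, z.isFinite else { continue }
                let local = simd_float4((Float(x) - cx) * z / fx,
                                        -(Float(y) - cy) * z / fy,
                                        -z, 1)
                let world = frame.camera.transform * local
                points.append(simd_float3(world.x, world.y, world.z))
            }
        }
    }
}
```

Each depth pixel is unprojected through the camera intrinsics into camera space, then carried into world coordinates by the camera pose, yielding exactly the kind of machine-readable building-contour geometry described above.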
And if anyone can pull this off, it’s Apple. Scanning every street corner in America – or at least the most traveled ones – requires immense scale. Apple will look to accomplish that through a combination of its fleet of cars and vans and the armies of iPhone users simply trying to get where they’re going.
Camera Native
This new scanning and data-ingestion strategy is rolling out first with iOS 17.2. And because it’s the privacy-first Apple, the company is quick to point out that it only collects machine-readable data such as 3D point clouds of buildings… versus human-readable photos that could capture faces and license plates.
Meanwhile, we’ll see if Apple can pull this off and catch up to Google in the mapping arms race. It may have an opportunity to future-proof itself with such emerging technologies. If it can leapfrog Google in some of these areas, it might convert some users, especially among the camera-native Gen Z.