This is the latest in Localogy’s Skate To Where the Puck is Going series. Running semi-weekly, it examines the moves and motivations of tech giants as leading indicators for where markets are moving. Check out the entire series here, and its origin here.
Just weeks after announcing upgrades to its Live View AR navigation feature, Google has unveiled more enhancements. Before jumping into the updates, what is Live View? As we’ve examined, the 3D mapping feature provides an AR interface for urban walking, including big floating directional arrows.
The idea is that looking down at a 2D map, then mentally mapping that information to 3D space, is a “cognitive load” that can be avoided. Though it’s a classic first-world problem, a common pain point is exiting a subway station (pre-Covid) and not knowing what street corner you’re on.
There’s also the common pain point of imprecise positioning in any urban canyon. Because GPS signals bounce off buildings, the reflected paths throw off the calculation of how far your device is from each satellite. That’s why the little blue dot is often misplaced by as much as a full city block when you’re trying to navigate downtown.
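To make the mechanics concrete, here’s a toy illustration (not Google’s implementation, and the distances are hypothetical): a receiver estimates its range to a satellite from signal travel time, so a signal that arrives via a reflection off a building reads as a longer range than the true line-of-sight distance.

```python
# Toy illustration of GPS multipath error: a reflected signal travels
# farther than the direct line-of-sight path, so the receiver overstates
# its distance to the satellite by the extra path length.

def pseudorange_error(direct_path_m: float, reflected_path_m: float) -> float:
    """Extra distance the receiver wrongly adds to the satellite range."""
    return reflected_path_m - direct_path_m

# Hypothetical urban-canyon scenario: the direct path is blocked and the
# signal arrives via a reflection that travels 80 m farther.
error_m = pseudorange_error(direct_path_m=20_200_000.0,
                            reflected_path_m=20_200_080.0)
print(f"range error: {error_m:.0f} m")  # ~80 m, roughly a city block
```

An 80-meter range error on even one satellite is enough to drag the computed position fix a block away, which matches the misplaced blue dot described above.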
Live View helps “localize” your device in these scenarios by recognizing buildings. Google has devised a clever hack to resolve GPS shortcomings by tapping into a different sensor: your camera. Drawing on its database of Street View imagery, it applies visual recognition to determine where you’re standing, then guides you in 3D.
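The underlying idea can be sketched as a matching problem (a deliberately simplified toy, with made-up descriptors and coordinates, not Google’s actual pipeline): reduce the camera frame to a descriptor, compare it against descriptors indexed from street-level imagery, and return the location of the closest match.

```python
import math

# Hypothetical index: image descriptor -> (lat, lng) where it was captured.
STREET_VIEW_INDEX = [
    ([0.9, 0.1, 0.3], (40.7580, -73.9855)),  # e.g. a Times Square corner
    ([0.2, 0.8, 0.5], (40.7527, -73.9772)),  # e.g. a Grand Central exit
    ([0.4, 0.4, 0.9], (40.7484, -73.9857)),  # e.g. 34th St / 5th Ave
]

def localize(camera_descriptor):
    """Return the indexed location whose descriptor is closest (L2 distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, location = min(STREET_VIEW_INDEX,
                      key=lambda entry: dist(entry[0], camera_descriptor))
    return location

print(localize([0.25, 0.75, 0.5]))  # closest match: the Grand Central entry
```

A real system matches millions of geometric features rather than three-number vectors, but the principle is the same: the camera becomes a second positioning sensor that GPS errors can’t corrupt.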
Front & Center
With that backdrop, what are Live View’s latest updates? The most prominent new feature is the ability to get details for businesses that users encounter on their navigated route. By pointing your phone at a given storefront, basic business details are overlaid, with the ability to expand for more.
Google has been teasing this feature for a while, including mocked-up “concept demos” at past Google I/O conferences. Now the feature is finally seeing the light of day. This means the coming weeks and months will be telling, serving as a live market experiment for consumers’ interest in visual search.
Though it could expand into many potential use cases for local discovery, one explicit use case is seeing store hours or how busy a given location is at the moment, in the interest of social distancing. This data comes from Google’s busyness feature, which has existed in GMB and local search results for a while.
Speaking of which, the data that enables this AR experience flows from GMB. This positions Google uniquely to deliver a 3D visual search tool, as its execution relies on reliable, spatially anchored local data (otherwise known as the AR Cloud). Live View also taps into the computer vision chops Google developed for Google Lens.
In fact, the latest Live View updates merge navigation and commercial intent — a formula at the heart of Google Maps. Google is drawing upon that playbook and applying it to a visual interface. Given Gen-Z’s affinity for the camera and AR, this can be seen as a sort of future-proofing hedge for local search.
Panning back, there are parallel efforts in Google’s AR playbook that continue to develop. For example, it recently launched Earth Cloud Anchors. This feature of its ARCore platform lets users geo-anchor digital content for others to view. That includes notes to friends, digital graffiti, or practical business info.
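At its core, geo-anchoring boils down to a simple data model, sketched below as a toy (the payloads, coordinates, and function names are all hypothetical; this is not the ARCore API): content is keyed to a lat/lng anchor, and a viewer resolves whatever anchors sit within some radius of their position.

```python
import math

anchors = []  # list of (lat, lng, payload) tuples

def place_anchor(lat, lng, payload):
    """Pin a piece of content to a geographic point."""
    anchors.append((lat, lng, payload))

def resolve_nearby(lat, lng, radius_m=50.0):
    """Return payloads anchored within radius_m of the viewer.

    Uses an equirectangular approximation: fine at city scale.
    """
    results = []
    for a_lat, a_lng, payload in anchors:
        dx = (a_lng - lng) * 111_320 * math.cos(math.radians(lat))
        dy = (a_lat - lat) * 111_320
        if math.hypot(dx, dy) <= radius_m:
            results.append(payload)
    return results

place_anchor(40.7580, -73.9855, "note: meet here at 6")
place_anchor(40.7527, -73.9772, "review: great coffee inside")
print(resolve_nearby(40.7580, -73.9856))  # only the nearby note resolves
```

The hard part in practice isn’t this lookup; it’s anchoring precisely enough (via the visual localization described earlier) that the content appears pinned to the right doorway rather than drifting down the street.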
Tying that back to Live View and Lens updates, Cloud Anchors could be a sort of user-generated-content component of local visual search. Though that will start with fun and whimsical use cases (like Snap is doing with Local Lenses), a local business ratings & reviews use case could likewise develop.
First, we’ll have to see if a visual front end resonates with consumers, as noted. Holding your phone up to navigate or identify businesses is intuitive in some ways (and easier than typing), but consumer habits are hard to break. If Google can pull this off, visual search could be another touchpoint for local discovery.