Google Taps Users to Advance Local 3D Mapping

The next battleground in local mapping could be in the third dimension. On a basic level, that includes the longstanding development of street-level data and imagery, a la Street View. Looking forward, that data is being parlayed into interactive experiences for local search & discovery.

As we’ve examined, this includes tools like Google Lens and Live View. The former is a cousin of augmented reality that lets you point your phone at real-world objects (including storefronts) to get informational overlays. The latter offers turn-by-turn walking navigation in urban areas via 3D graphics overlaid on the camera view.

Both tools utilize data that Google already has from years of assembling Street View imagery. Given advances in smartphone cameras and machine learning, Google can match what the camera sees against its Street View image database to “localize” a device. After that key step, Lens and Live View can do their thing.
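For intuition on that localization step, here’s a toy sketch. This is purely illustrative and not Google’s proprietary visual positioning system: it assumes a hypothetical database of geo-tagged reference descriptors and matches a query frame’s descriptor against it by cosine similarity.

```kotlin
// Toy illustration of image-retrieval localization (NOT Google's actual system).
// Assumes each reference image has a precomputed feature descriptor (e.g., a
// CNN embedding) tagged with the pose it was captured from.

data class GeoPose(val lat: Double, val lng: Double, val headingDeg: Double)
data class ReferenceImage(val descriptor: FloatArray, val pose: GeoPose)

// Cosine similarity between two feature descriptors.
fun cosineSimilarity(a: FloatArray, b: FloatArray): Double {
    var dot = 0.0; var normA = 0.0; var normB = 0.0
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (kotlin.math.sqrt(normA) * kotlin.math.sqrt(normB))
}

// "Localize" a query frame: return the pose of the best-matching reference image.
fun localize(queryDescriptor: FloatArray, database: List<ReferenceImage>): GeoPose? =
    database.maxByOrNull { cosineSimilarity(queryDescriptor, it.descriptor) }?.pose
```

In practice, systems like this refine the coarse retrieval result with feature-level geometric matching to compute a precise camera pose, but the retrieval step above is the gist of matching what the camera sees to an image database.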

With that baseline, Google has rolled out updates over the past several months. Most recently, that includes storefront business details that appear when users navigate with Live View. Google also rolled out a more prominent launch button in the Google Maps transit tab, along with sharing functionality and support for landmarks.


Wisdom of the Crowds

Continuing this string of feature rollouts, Google announced last week that it will beef up its underlying 3D data by crowdsourcing it. Starting with the Street View app on Android, a new “connected photos” feature lets users contribute local photos to strengthen the Google Maps image database.

This means that, for the first time, a 360-degree camera isn’t needed to capture Street View imagery. The quality won’t match images from Street View cars, but the approach will scale better. The same privacy controls will apply to these crowdsourced images, such as blurring faces and license plates.

As for the intended purposes of this crowdsourced imagery, they split between the explicit and the speculated. The explicit purposes include capturing “last mile” image data in remote areas and places like hiking trails. The speculated purposes include amassing more 3D image data for Google’s AR and local visual-search ambitions.

Pursuant to the latter, Connected Photos lets users capture a series of images by walking down a given street while holding their smartphone up with the app running. This gives Google more frames to work with, feeding a more comprehensive “mesh” of 3D image data that advances its broader visual-search efforts.

As background, crowdsourcing 3D mapping data is an effort known as the “AR cloud.” Niantic is doing something similar by capturing real-world spatial maps through Pokémon Go players. Having this 3D image data is the first step toward spatially anchored AR experiences that have location relevance.

The user-facing outcomes of these efforts will be things like graphically rich local discovery experiences: hold your phone up to learn more about a restaurant, or to see reviews left by your friends. These will be more advanced versions of Lens and Live View, and they’ll advance further through Google’s parallel AR efforts.

Bigger Picture

Those parallel efforts include Google’s Earth Cloud Anchors. This feature of its ARCore platform lets users geo-anchor digital content for others to view. That includes notes to friends, digital graffiti, or practical business info. Along with Lens and Live View, this could represent another UGC component of local visual search.
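For developers, the publicly documented ARCore Cloud Anchors API gives a feel for how this geo-anchoring flow works. The Kotlin sketch below is a minimal, hedged example of hosting an anchor and sharing its ID; Google’s Earth Cloud Anchors may layer additional geo-referencing on top of this basic flow.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Session

// Host a local anchor to the cloud so other devices can resolve it later.
// ARCore uploads visual feature data around the anchor to Google's service.
fun hostAnchor(session: Session, localAnchor: Anchor): Anchor =
    session.hostCloudAnchor(localAnchor)

// Poll the hosting state each frame until the task completes.
fun onFrameUpdate(cloudAnchor: Anchor) {
    when (cloudAnchor.cloudAnchorState) {
        Anchor.CloudAnchorState.SUCCESS -> {
            // Share this ID (e.g., via your own backend); another device calls
            // session.resolveCloudAnchor(id) to attach content at the same spot.
            val id: String = cloudAnchor.cloudAnchorId
        }
        Anchor.CloudAnchorState.TASK_IN_PROGRESS -> { /* keep waiting */ }
        else -> { /* hosting failed; inspect the specific error state */ }
    }
}
```

The notable design point is that only the anchor’s ID gets shared, not raw geometry; the spatial understanding lives in Google’s cloud, which is exactly the “AR cloud” dynamic described above.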

These disparate visual-search updates are notable on their own, but they have deeper implications when viewed together. They also collectively trace back to Google’s goal of future-proofing its core search business. A visual interface isn’t yet mainstream for local search, but Google wants to skate to where the puck is going.

Apple is likewise gradually gaining ground in mapping. That includes 3D navigation, per its Quick Look feature, which now performs a similar device-localization trick when you hold your phone up to scan nearby buildings. It’s part of Apple’s broader mapping reboot, and it could represent a key battleground in local’s next era.

Meanwhile, the new Connected Photos feature is limited to ARCore-compatible Android devices in select regions, with wider availability promised on an unspecified timetable. We’ll keep watching closely to track the evolution of local discovery as it develops more of a visual front end.

This is the latest in Localogy’s Skate To Where the Puck is Going series. Running semi-weekly, it examines the moves and motivations of tech giants as leading indicators for where markets are moving.
