Snapchat Gets More Location Aware

Snap has historically intersected with the Localogyverse in a few places. This occurs as it develops location-relevant experiences for users and, correspondingly, location-based marketing opportunities for businesses. Examples so far include Snap Map, Geofilters, Landmarkers, and Local Lenses.

Its most recent move crossed our desks today: location-aware lenses. Specifically, lenses can now interact with GPS and other GNSS (Global Navigation Satellite System) signals. They can also ingest location-based inputs such as compass headings and custom locations, which flow from Snap’s places database.

The idea is to let lens developers build experiences that are location-driven. These can range from walking tours to restaurant discovery. For example, given that lenses are all about 3D graphical elements that overlay the field of view, they can do things like reveal nearby happy hours or retail sales.


Show Rather than Tell

To show rather than tell, Snap has showcased a few lenses that make good use of the new capabilities. For instance, the NavigatAR lens from Utopia Labs gives users glowing directional arrows to guide them to their destination. Drawing on the geospatial data in Snap Map, it works somewhat like Google’s Live View.

Speaking of which, these types of immersive and dimensional wayfinding experiences have popped up occasionally over the past decade. They’re always compelling and novel when launched and demoed, but have failed to gain meaningful traction among everyday users who are content with 2D mapping.

One reason for this lack of adoption is that 3D navigation and wayfinding isn’t always the most elegant UX on a smartphone. Holding up one’s phone to see directional arrows can cause arm fatigue, and there’s a vanity factor: the activity looks and feels physically awkward in public.

But Snap is well aware of this, which is why these location-aware updates are angled towards its headworn form factor: Spectacles. In fact, this week’s announcement was coupled with Snap’s commemoration of the six-month anniversary of the device’s launch.


Planting Seeds

In that light, Snap is thinking ahead to a day when line-of-sight, ambient orientation will make more sense for utilities such as 3D mapping. And that’s what it’s going for here. But Snap is also careful to acknowledge that smart-glasses ubiquity is still a ways off. That’s why Spectacles are, for now, a developer device.

Correspondingly, Snap’s goal is to start planting seeds to accelerate smart glasses adoption cycles over the next decade. It knows that there’s a classic chicken-and-egg dilemma: users won’t adopt emerging form factors that have no content, while content creators aren’t incentivized to build for platforms with no reach.

So Snap wants to stimulate a content marketplace, beginning on the developer end. And that’s what Spectacles are all about. We’ll continue to see early-stage experimentation to feel out what killer apps will look like. And just as in the early days of the iPhone, that will be a process.

Along the way, we’ll see gaming and social experiences. But there will also be utilities such as local discovery and visual search. The latter is already a popular function on Ray-Ban Meta smart glasses, so demand signals are evident. This will be a long journey, but a compelling one to watch.
