Will Snap Bring Visual Search to Local Commerce?

One emerging technology with ongoing implications for local commerce is visual search. It positions the camera as a search input, applying machine learning and computer vision to identify the items you point your phone at. Think of it as a close cousin of augmented reality (AR).
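For the technically curious, here's a rough sense of what "camera as search input" means under the hood. The sketch below uses an off-the-shelf image classifier as a stand-in for a production visual search model; Snap's and Google's actual pipelines are proprietary and far more sophisticated.

```python
import torch
from PIL import Image
from torchvision import models

# Load a pretrained ImageNet classifier as a stand-in for a product model.
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()
preprocess = weights.transforms()

def identify(image_path: str, top_k: int = 3):
    """Return the top-k guesses for what's in a photo (e.g. a camera frame)."""
    img = Image.open(image_path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)          # shape: (1, 3, H, W)
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)[0]
    scores, idxs = probs.topk(top_k)
    labels = [weights.meta["categories"][int(i)] for i in idxs]
    return list(zip(labels, scores.tolist()))

# Example: identify("sneaker_photo.jpg") might return
# [("running shoe", 0.87), ...], which could then seed a product search.
```

In a real visual search product, those labels would be matched against product catalogs, reviews and local inventory rather than generic categories, but the basic flow of frame in, ranked matches out is the same.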

Seen in products such as Google Lens, the “search what you see” use case is all about annotating the world around you. Pointing rather than tapping is the appeal, as the former can be more intuitive in some cases, such as identifying style items you encounter in the real world. And Gen-Z is very camera-forward.

Beyond Google, a logical competitor, Snap is intent on visual search. Its Snap Scan product is well positioned given the company's AR competency. As noted, visual search is a form of AR, but one that flips the script: it expands from selfie lenses to more utilitarian and commerce-oriented use cases.

It also literally flips things around as it moves from the front-facing camera (selfie fodder) to the rear-facing camera. This lets Snap augment the broader canvas of the physical world — whether that be Local Lenses or annotations through Snap Scan. The latter is where utilitarian and commerce-oriented use cases lie.

Snap Continues Steady March Towards Local Search

Primary Evolutions

All of the above took a step forward with Snap's most recent Scan updates. The primary evolution is the ability to recognize more objects and products. Scanning a QR code is one thing… but recognizing physical-world objects like pets, flowers and clothes requires more machine learning.

So far, Snap has built up the requisite data and training sets through partners. For example, its partnership with Photomath gives Snap Scan the ability to literally solve math problems. Point your phone at a math problem on a physical page and Scan can solve it, a valuable use case for kids doing homework.
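To make that concrete, here's a toy illustration of the "solve it" half of that pipeline, assuming an OCR step (not shown) has already turned the photographed equation into text. This is a sketch of the concept, not Photomath's actual method.

```python
# Assumes the scanned page has been OCR'd into an equation string.
from sympy import Eq, solve, symbols, sympify

def solve_from_text(equation_text: str):
    """Solve a single-variable equation captured from a scanned page."""
    left, right = equation_text.split("=")
    x = symbols("x")
    equation = Eq(sympify(left), sympify(right))
    return solve(equation, x)

# Example: a photo of "2x + 3 = 11" OCR'd as "2*x + 3 = 11"
print(solve_from_text("2*x + 3 = 11"))  # -> [4]
```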

Its newest partnership with Allrecipes will let users scan food items to get recipes for dishes they could make with that combination of ingredients. The new Screenshop feature will meanwhile make Scan more shoppable, surfacing information from retail partners on where to buy style items that users scan.
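Under the hood, the ingredient-to-recipe step could be as simple as ranking recipes by how many of their ingredients were recognized in the frame. The toy sketch below illustrates the idea; the recipe data and scoring are hypothetical, not Allrecipes' actual system.

```python
from typing import Dict, List, Set

# Hypothetical recipe data keyed by required ingredients.
RECIPES: Dict[str, Set[str]] = {
    "Tomato basil pasta": {"tomato", "basil", "pasta", "garlic"},
    "Caprese salad": {"tomato", "basil", "mozzarella"},
    "Garlic bread": {"bread", "garlic", "butter"},
}

def suggest_recipes(scanned: List[str], top_k: int = 2) -> List[str]:
    """Rank recipes by how many of their ingredients were scanned."""
    detected = set(scanned)
    ranked = sorted(
        RECIPES,
        key=lambda name: len(RECIPES[name] & detected),
        reverse=True,
    )
    return ranked[:top_k]

# Example: a fridge scan that recognized tomatoes, basil and mozzarella
print(suggest_recipes(["tomato", "basil", "mozzarella"]))
# -> ['Caprese salad', 'Tomato basil pasta']
```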

Beyond identifying a wider range of physical-world items, Snap Scan is now more accessible and front & center in the Snapchat UX. It’s now right on the main camera screen, which should help it to gain more traction and condition new user habits. Because visual search is still new and unproven, it needs that nudge.

To further that consumer education and give Scan some training wheels, Snap is baking in more suggested ways to use it. This includes Camera Shortcuts, which suggests relevant functions whenever a Snap is taken within Snapchat. The goal is to inspire and educate users on new Scan-based use cases.

Launchpoint

Speaking of use cases, this is what will drive visual search, as the emerging tool needs a “killer app.” One use case that could fit the bill is fashion. Snap wants Scan to be a launch point for shopping by letting users scan clothes they see for “outfit inspiration” — a behavior that’s natively aligned with visual search.

As for other use cases, visual search will add value when items have unclear branding or pricing. For example, wine has a decentralized product landscape where pricing isn't always transparent. Beyond pricing, visual search can unearth things like reviews and where to buy a given product.

The latter is where the rubber meets the road in both the use case and the business model. Visual search could be naturally monetizable in carrying some of the same user intent that makes web search so lucrative. The lean-forward activity of scanning a product can clearly carry a high degree of buying intent.

In that sense, visual search is a natural extension for tech companies built on ad revenue models. And the two companies noted in this article, Google and Snap, are exactly that. Adding to the list, Pinterest is also intent on visual search with Pinterest Lens, which fits right in with its product-discovery persona.

As for local commerce, this could all come together in promotions for physical stores that carry scanned items. And since Snap is building a local places database and a local advertising business, it could further enable (and provide financial motivation for) Snap Scan to become a launchpoint for local commerce.

Of course, most of the above is our speculation rather than Snap’s explicit intent. We’ll keep watching and reporting back as Snap’s many local commerce moving parts converge.
