Google ‘Multisearch Near Me’ Blends Local & Visual

Among the procession of announcements Google made in its I/O conference keynote yesterday, one stood out to us: multisearch. This is an ongoing area of development for Google that reached the next level yesterday. In short, it involves blending various inputs (text, voice, visuals) as search queries.

In some ways, this is an extension of Google’s longer-term evolution toward various content formats in search results. Past manifestations of this broader principle include universal search and Google’s ongoing knowledge graph developments that power multimedia-driven SERPs and knowledge panels.

But multisearch flips the script. Rather than using text queries to unearth a variety of multimedia search results, we’re talking here about using various formats as search inputs. This can involve starting a product/image search, then narrowing down results with text (e.g., “the same jacket in blue”).
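To make that concept concrete, here is a minimal sketch of how an image-plus-text query could work in principle. This is not Google’s implementation; it simply approximates the idea with an open-source CLIP model, and the file names and catalog entries are hypothetical.

```python
# Illustrative sketch only -- not Google's multisearch implementation.
# It blends an image query with a text refinement using an open-source CLIP model.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# CLIP maps images and text into the same embedding space.
model = SentenceTransformer("clip-ViT-B-32")

# Hypothetical inputs: a photo of a jacket plus a text refinement.
photo_emb = model.encode(Image.open("jacket_photo.jpg"), convert_to_tensor=True)
text_emb = model.encode("the same jacket in blue", convert_to_tensor=True)

# Blend the two modalities into a single query vector (simple average here).
query_emb = (photo_emb + text_emb) / 2

# Hypothetical product catalog; in practice these embeddings would be precomputed.
catalog = {
    "blue denim jacket": model.encode(Image.open("catalog/blue_denim.jpg"), convert_to_tensor=True),
    "black leather jacket": model.encode(Image.open("catalog/black_leather.jpg"), convert_to_tensor=True),
}

# Rank catalog items by cosine similarity to the blended query.
scores = {name: util.cos_sim(query_emb, emb).item() for name, emb in catalog.items()}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```

However Google actually does it under the hood, the user-facing idea is the same: the photo anchors the search and the text steers it.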

If we just look at the visual elements of multisearch, another way to think about this is as having CTRL+F for the physical world. You can use the imagery around you to launch searches for similar objects. This will be a central topic at Localogy’s upcoming Place Conference.

This is also the idea behind visual search, à la Google Lens: point your phone at objects to identify them. Here Google taps into its knowledge graph and image database as a training set for AI object recognition. And the rubber meets the road with monetizable (read: shoppable) visual searches.


CTRL+F

This all leads up to yesterday’s announcement which puts a local spin on all the above. If multisearch and “near me” searches had a baby, that’s essentially what was announced. Known as “Multisearch Near Me,” it lets users search using images or screenshots along with the text “near me” to see local results.

By local results, we mean businesses. So if you see a slick jean jacket on the street or an enticing dish on Instagram, snap a pic and search in the above manner to see nearby retailers or restaurants offering something similar. That series of steps may be an adoption impediment, but it’s just the beginning.

To that end, all of the above will evolve in a few directions. For one, Multisearch Near Me will expand in the future beyond single items to entire scenes. In other words, users will be able to choose multiple objects within an image or frame to launch searches for related items or see contextual overlays.

As for the latter, this could all converge at some point with something we’ve already noted above: Google Lens. The beauty of Google Lens is that visual searches happen in real time as you pan your phone around. Search-activation dots pop up on your screen, from which you can launch a visual search.

This real-time approach offers a more seamless UX than screenshots and image selection. In other words, when Multisearch Near Me comes to Google Lens (with a possible handoff to Google Live View for last-mile navigation), it will be a more elegant way to find things to do, see, and eat locally.


Captions for the Real World

While we’re speculating, the longer-term convergence for all the above involves hands-free visual search. In other words, AR glasses… which Google also teased yesterday. One current adoption barrier for Google Lens and Live View is the arm fatigue and social awkwardness of holding up your phone.

Of course, AR glasses come with their own dose of social awkwardness, as seen in the Google Glass era. A few things may be different now for Google’s prospective AR glasses: the underlying tech and the use cases. The former has advanced closer to the realm of socially acceptable, wayfarer-like smart glasses.

Meanwhile, the use case (why should I wear these things?) is an important lesson Google learned from Glass. That’s why its glasses are now positioned for a more practical, sober, and focused use case: real-time language translation (captions for the real world). And Google can pull it off, given Google Translate.

But while it zeroed in on that use case for the sake of simplicity and focus (again… lessons learned), other utility-based use cases could be integrated later. And that brings us back to local search: Multisearch Near Me could reach new levels of utility if it’s in your direct line of sight… and private.

Again, much of this is speculative, but the dots are all there. Of course, take all of this with the appropriate salt tonnage given the fickleness of style and culture when it comes to things that go on your face. This is a product category where good tech isn’t the full story. We’ll be watching (excuse the pun).
