In a few weeks, Localogy will hold its next conference, Localogy Place: The Real World Metaverse. Deviating from common metaverse connotations – synchronous online 3D worlds – it’s all about how digital content and data are adding depth and dimension to the places & spaces around us.
In that spirit, we’re gearing up for the show by immersing ourselves in the topic, excuse the pun. So for the latest Video Vault post, we’re spotlighting Niantic’s efforts to build a real-world metaverse. This follows last week’s Video Vault post in which we examined Google’s parallel efforts in this area.
As for Niantic, we’ve examined these tools in the past, but the company recently took them to the next level. At its Lightship VPS event, it spun out its geo-local AR capabilities – the same engine behind Pokémon Go – as an SDK. This takes its underlying geospatial architecture and opens it up for everyone.
Check out the session video below, preceded by our summary notes and strategic takeaways. Hope to see you in 3 weeks in NYC, where we’ll feature a fireside chat with Niantic on this development…
Visual Positioning System
With that backdrop, Niantic recently pushed things forward in the geo-local AR world with its Lightship platform launch. Previously announced, the platform will empower developers to build Pokémon Go-like geospatial AR apps. It takes the game’s architecture and spatial mapping chops and wraps them up in a platform.
With the full name Lightship VPS, it builds on the concept of a visual positioning system. Rather than relying on GPS satellite data, it uses visual cues in the world around us to “localize” a given device. Once that device knows where it is and what it’s looking at, it can overlay the right digital content.
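To make that localization step concrete, here’s a deliberately simplified sketch in Python. It isn’t Niantic’s actual Lightship API – the descriptors, coordinates, and `localize` function are all hypothetical – but it illustrates the VPS idea: match visual features seen by the camera against a pre-built spatial map, then work backward to where the device must be standing.

```python
import numpy as np

# Toy "spatial map": visual feature descriptors with known world coordinates.
# In a real VPS these come from crowdsourced scans; here they're invented.
MAP_DESCRIPTORS = np.array([
    [0.9, 0.1, 0.0],   # e.g., a storefront sign
    [0.1, 0.8, 0.2],   # e.g., a fountain
    [0.0, 0.2, 0.9],   # e.g., a mural
])
MAP_WORLD_XYZ = np.array([
    [10.0, 2.0, 0.0],
    [12.0, 5.0, 0.0],
    [ 8.0, 6.0, 0.0],
])

def localize(frame_descriptors, frame_offsets):
    """Estimate the device's position from one camera frame.

    frame_descriptors: descriptors of features detected in the frame.
    frame_offsets: each feature's position relative to the device
                   (orientation is assumed known, to keep the toy simple).
    """
    estimates = []
    for desc, offset in zip(frame_descriptors, frame_offsets):
        # Nearest-neighbor match against the spatial map.
        idx = np.argmin(np.linalg.norm(MAP_DESCRIPTORS - desc, axis=1))
        # If the matched landmark sits at `offset` from the device,
        # the device must be at world_xyz - offset.
        estimates.append(MAP_WORLD_XYZ[idx] - offset)
    # Average the per-feature estimates into one localized position.
    return np.mean(estimates, axis=0)

# A frame that "sees" the storefront sign and the fountain.
frame_desc = np.array([[0.88, 0.12, 0.05], [0.15, 0.79, 0.18]])
frame_off  = np.array([[ 2.0,  1.0,  0.0], [ 4.0,  4.0,  0.0]])

print("Localized device at:", localize(frame_desc, frame_off))  # ~[8, 1, 0]
```

A production VPS estimates a full six-degree-of-freedom pose with robust matching and outlier rejection rather than a simple average, but the core loop – see features, match them to a map, resolve the device’s position, then anchor content – is the same.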
Niantic isn’t the only one developing this principle. As we examined last week, Google’s Geospatial API does something similar, localizing devices against Street View imagery. Object recognition against its Street View database can tell a device where it is and what it’s looking at, thus enabling 3D overlays.
That gives Google a considerable edge in developing a VPS-based product. So how will Niantic gain that level of visual data to feed into its VPS system? The answer is its users. For a few years, it’s been crowdsourcing the development of “spatial maps” as Pokémon Go players roam the earth.
And by launching Lightship, Niantic hopes to scale up these efforts, gaining more comprehensive spatial maps through the many geospatial AR apps built on the platform. This works toward what Niantic calls “planet-scale AR.” It’s an ambitious, boil-the-ocean undertaking, but one with enormous potential value.

Federated Framework
When you combine the efforts of Google and Niantic, you start to get robust collective spatial maps that could fuel real-world metaverse apps. Add the detailed spatial maps collected by autonomous vehicles (via LiDAR), and the infrastructure for the real-world metaverse starts to come together.
But while more comprehensive spatial maps could be gained by federating these systems, the above players are incentivized to build walled gardens… just like the dynamics that ruled 2D mapping for the past several years. Could we see a sort of OpenStreetMap for spatial mapping data emerge?
Meanwhile, another principle jumps out from Niantic’s latest move: a B2B shift. Its expansion from a consumer-facing game to a B2B data provider is reminiscent of Foursquare’s pivot. And as with Foursquare, this is a more sustainable business model, as consumer app darlings tend to fade over time.
Another analogy for Niantic’s B2B play is Amazon Web Services. Amazon developed it because of its own escalating and variable compute needs. Then it realized it had something valuable on its hands and spun it out as its own product. We’ll see if Niantic achieves some degree of the same with Lightship.
See the full Lightship launch keynote below…
