This is the latest in Localogy’s Skate To Where the Puck is Going series. Running semi-weekly, it examines the moves and motivations of tech giants as leading indicators for where markets are moving. Check out the entire series here, and its origin here.


Last week, we unpacked the biggest takeaways from Snap’s developer conference — at least those that relate to local commerce. Inspired by that exercise, it’s now time to do the same for Apple’s WWDC developer conference keynote earlier this week. The local commerce implications are fewer but bigger.

First on the list is GeoAnchors in ARKit 4. As background for those unfamiliar, ARKit is Apple’s developer platform for building AR apps and experiences on iOS. GeoAnchors draws out the inherently “local” aspects of AR by letting users plant spatially-anchored AR graphics for others to encounter.
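
In ARKit terms, this maps to the ARGeoAnchor class. Here’s a minimal sketch of how an app might plant one of these anchors; the storefront coordinate is hypothetical, purely for illustration:

```swift
import ARKit
import CoreLocation

// A rough sketch (not Apple sample code) of planting a geo-anchored graphic.
// The storefront coordinate below is hypothetical, for illustration only.
func placeStorefrontAnchor(in session: ARSession) {
    let storefront = CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194)

    // ARGeoAnchor ties virtual content to a real-world latitude/longitude;
    // ARKit resolves where that sits on screen once the device is localized.
    let anchor = ARGeoAnchor(coordinate: storefront)
    session.add(anchor: anchor)

    // A renderer (RealityKit or SceneKit) would then attach the AR graphic
    // to this anchor.
}
```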

As further background, AR doesn’t “just work.” To overlay graphics on a real-world scene, devices first must understand that scene. That can happen by mapping the contours of the scene on the fly, referencing a previously mapped “point cloud” of it, and leaning on object recognition and AI.

So for many of AR’s local commerce use cases — such as informational overlays on storefronts — AR devices first have to identify the store. Google has gone the furthest down this path with Google Lens and Live View, which utilize imagery from Street View as a visual database for object recognition.

AR Cloud

That brings us back to Apple’s GeoAnchors. They’ll similarly tap into the Street View-like “Look Around” feature that Apple has been developing over the past few years. GeoAnchors will use this data to “localize” a device, then reveal the right spatially-anchored AR graphics with millimeter precision.

Also similar to Google Lens, Apple will use a combination of data sources to localize an AR device. Image recognition is one source, using Look Around’s visual database as noted, while other data sources include where a device is (GPS), where it’s looking (compass), and how it’s moving (IMU).

This combination of inputs also lets AR devices operate in power-efficient ways. Spatial mapping data for the entire inhabited earth is a heavy payload, so an AR device — in this case an iPhone — can selectively access just the relevant pieces of the map, based on where it is per GPS signal.

This is known as the “AR Cloud,” and it’s a key principle behind the shared vision of how AR is supposed to work. Most AR-oriented tech companies are building some version of an AR cloud that maps to their goals. For example, Snap is doing it through composites of past Snaps, as we examined.

As another example, Niantic launched a similar crowdsourced approach in which legions of roaming Pokémon Go players map the world as they play the game. Earlier versions of the game operated without this, but their overlays were imprecise. That works fine for Pikachu… but not for local storefronts.

Long-Term Vision

Back to Apple, it has rolled out GeoAnchors in select cities — logically the same cities where Look Around is active. One thing Apple is doing here is planting the seeds for an AR cloud while getting a head start on acclimating developers to think spatially and build AR apps with this capability.
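
Because of that city-by-city rollout, apps are expected to check whether geo tracking is even available before offering the feature. A rough sketch of that gate, assuming a simple callback-based helper:

```swift
import ARKit

// A minimal sketch of gating an AR feature on Apple's city-by-city rollout:
// geo tracking needs both capable hardware and a location where Apple has
// the underlying localization (Look Around-style) data.
func checkGeoAnchorSupport(completion: @escaping (Bool) -> Void) {
    guard ARGeoTrackingConfiguration.isSupported else {
        completion(false) // device lacks the required hardware
        return
    }
    // Asks ARKit whether the user's current location is a supported region.
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        completion(available)
    }
}

// Callers would run an ARGeoTrackingConfiguration session only when this
// reports true, and fall back to conventional world tracking otherwise.
```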

Meanwhile, a few other parallel Apple initiatives tie into the same longer-term vision. The first was previously revealed in iOS 14 code: an effort codenamed Gobi to plant QR codes throughout retail partner locations that trigger AR information and discounts when activated through a camera.

This means Apple has dual tracks to AR. The world-immersive “AR Cloud” activations outlined above represent one track, a more advanced form of AR. Gobi meanwhile represents “marker-based” AR, a more rudimentary form that is simpler and more practical in certain contexts.
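
To make the marker-based track concrete, here’s a hedged sketch (not Apple’s Gobi implementation) of the general pattern: detect a QR code in a camera frame with Apple’s Vision framework, then use its payload to decide what AR content or offer to surface.

```swift
import Vision
import CoreGraphics

// Not Apple's Gobi implementation, just a rough sketch of marker-based AR
// triggering: detect a QR code in a captured camera frame with Vision, then
// hand its payload to whatever decides which overlay or discount to show.
func readMarker(in frame: CGImage, completion: @escaping (String?) -> Void) {
    let request = VNDetectBarcodesRequest { request, _ in
        let payload = request.results?
            .compactMap { $0 as? VNBarcodeObservation }
            .first?
            .payloadStringValue
        completion(payload) // e.g. a hypothetical offer or product identifier
    }

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```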

The second tie-in was announced at WWDC: App Clips. These atomize functions that have traditionally been housed in full-blown apps, making them more accessible on the fly. So real-world activities like pre-paying a parking meter will no longer require downloading the associated app.

Instead, small QR codes — likely the same ones that will be used with Gobi — can be scanned to access the mini-app functionality to fill that parking meter… or whatever other use cases developers come up with. Having to download an app in such situations has traditionally stifled adoption of mobile transactions.
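
For a sense of how that flow looks to developers, here’s a minimal sketch of an App Clip entry point in SwiftUI; the parking scenario and the “meter” query parameter are hypothetical, for illustration only.

```swift
import SwiftUI
import Foundation

// A minimal sketch of an App Clip entry point. The clip launches when a user
// scans a code, and the invocation URL carries the context. The "meter" query
// parameter and parking scenario are hypothetical, for illustration only.
@main
struct ParkingClip: App {
    @State private var meterID: String?

    var body: some Scene {
        WindowGroup {
            Text(meterID.map { "Pay meter \($0)" } ?? "Scan the code on the meter")
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    // Pull the meter identifier out of the invocation URL,
                    // e.g. https://example.com/park?meter=1234
                    guard let url = activity.webpageURL,
                          let items = URLComponents(url: url, resolvingAgainstBaseURL: true)?.queryItems
                    else { return }
                    meterID = items.first(where: { $0.name == "meter" })?.value
                }
        }
    }
}
```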

Halo Effect

Back to the theme of Apple “planting seeds,” the QR codes deployed by various retailers, brands and startups will start to populate physical spaces. Practical App Clips uses will drive that adoption and retailer participation. But like Apple Maps’ Look Around, there will be an AR byproduct.

In other words, with QR codes populating stores, public spaces and points of interest, Apple will have an installed network of activation triggers for AR experiences. Planting the seeds in this way allows time for that network to be built while AR adoption itself gradually grows, keeping the two in sync.

Speaking of lead time and long-term thinking, the other wild card is Apple’s AR glasses, which will piggyback on all of the above. Apple wants the infrastructure to be in place by the time it launches its glasses in the next few years. That should lessen some of the adoption friction they’ll surely face.

All of the above, plus more time for smartphone-based AR to acclimate culturally, could give AR glasses a fighting chance for consumer adoption. Amplifying that is Apple’s signature halo effect. If anyone can pull this off — including planting QR codes across the physical world — it’s probably Apple.
