One of the biggest trends to emerge in the past half-decade is the creator economy. We’re talking everything from Etsy artists to YouTube stars to Twitch streamers to Instagram influencers. The trend has been fueled by a steady stream of media-creation features from the platforms behind them.
At the center of it all has been the rise of TikTok as a creative platform. Its fresh format and discovery engine have propelled it past one billion global users, with escalating revenues now estimated around $5 billion. It has also created a use case that’s all about high-production-quality content.
To that end, one opportunity that has always held considerable potential in its alignment with TikTok’s ethos is augmented reality. TikTok has a few in-house lenses that can dress up its signature videos, but nowhere near the scale achieved by Snap and Meta (Facebook and Instagram).
There’s one chief reason for the difference in scale achieved by these players: open platforms. Snap has grown to 6 billion daily active lens views (roughly one for every human on the planet, per day) by scaling lens development and creative energy through its Lens Studio platform.
This has allowed it not only to boost engagement to the point of attracting brand advertisers, but also to pursue long-tail opportunities in local commerce. For example, its Landmarkers and Local Lenses let local businesses digitally enhance their storefronts, among other things – a sort of real-world metaverse.
Outward-Facing
Back to TikTok: all of the above sets the stage for its (frankly under-exposed) move this week, an open AR lens creation platform. Known as Effect House, it’s the counterpart to Snap’s Lens Studio and Meta’s Spark AR – both open platforms for interactive lenses, including the sponsored variety.
Like those existing platforms, Effect House requires some degree of technical ability, though much of the UX is visual, object-based drag & drop. It likewise launches with ample documentation, templates, tutorials, and a “knowledge lab” to guide creators through the process of building AR effects.
Effect House also offers functions that provide a baseline for users to build on and add their own creative twists. These include segmentation, face mask, head tracker, face stretch, and 3D face. There are also elements like textures, materials, lighting, and shadows that creators can run with.
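To make that building-block model concrete, here’s a minimal sketch of how a platform-provided baseline (a head tracker) and a creator-supplied element (a material) might compose in script form. All names and types here are hypothetical stand-ins for illustration only; Effect House’s actual authoring flow is primarily visual, and this is not its API.

```typescript
// Hypothetical sketch only: these classes and methods are illustrative
// stand-ins, not Effect House's real scripting interface (its authoring
// flow is primarily visual/drag & drop).

type Vec3 = { x: number; y: number; z: number };

// Stand-in for a platform-provided baseline function (e.g. head tracker):
// it fires registered callbacks once per frame with tracking data.
class HeadTracker {
  private listeners: Array<(rotation: Vec3) => void> = [];

  onUpdate(cb: (rotation: Vec3) => void): void {
    this.listeners.push(cb);
  }

  // In a real engine, the platform would drive this each frame.
  simulateFrame(rotation: Vec3): void {
    this.listeners.forEach((cb) => cb(rotation));
  }
}

// Stand-in for a creator-supplied element (material/texture) with
// tweakable properties.
class Material {
  private props = new Map<string, number>();

  setProperty(name: string, value: number): void {
    this.props.set(name, value);
    console.log(`material.${name} = ${value.toFixed(2)}`);
  }
}

// The creative twist: wire the platform's tracking baseline to a custom
// material, e.g. intensify a glow effect as the head tilts.
const head = new HeadTracker();
const glow = new Material();

head.onUpdate((rotation) => {
  const tiltDegrees = Math.abs(rotation.z);
  glow.setProperty("glowIntensity", Math.min(1, tiltDegrees / 45));
});

// Simulate a few frames of head movement to see the property change.
[0, 15, 30, 60].forEach((z) => head.simulateFrame({ x: 0, y: 0, z }));
```

The point is the division of labor: the platform supplies the hard computer-vision primitives, and the creator’s effect (whether script or node graph) only has to wire them to visual assets.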
The result is that we could see novel AR directions and formats emerge based on TikTok’s unique properties and dynamics. For example, will TikTok’s signature “Duets” inspire a new AR use case where effects morph and cascade through several progressive remixes?
And TikTok’s rear-facing camera orientation (as opposed to the selfie lenses endemic to Snap and Meta) could make it a natural fit for the outward-facing local discovery lenses noted above. It will be up to the TikTok creator community to come up with those, but they now have the tools to do so.
Beyond the Physical State
That brings us back to the concept of a “real-world metaverse,” which is all about adding a digital dimension to the physical world. Rather than the online/virtual worlds that dominate metaverse discussion, greater value and meaning could reside in geo-anchored data that animates the physical world.
This less-discussed flavor of the metaverse – like its online multiplayer counterpart – is years from actualization. But many of the pieces are in place, including location data from the Foursquares of the world. There are also tools like Waze and Snap that digitally enhance real-world activities.
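To ground the term, here’s a minimal sketch of what geo-anchored data could look like in practice: a content record pinned to real-world coordinates, plus a naive proximity query of the sort a local discovery lens would run. The schema and function names are hypothetical illustrations, not any vendor’s actual format.

```typescript
// Hypothetical sketch of geo-anchored AR content: a record pinned to
// real-world coordinates, plus a naive proximity query. The schema is
// illustrative, not any vendor's actual format.

interface GeoAnchor {
  id: string;
  lat: number;     // degrees
  lng: number;     // degrees
  payload: string; // e.g. a storefront promotion or a 3D asset URL
}

// Haversine distance in meters between two lat/lng points.
function distanceMeters(aLat: number, aLng: number, bLat: number, bLng: number): number {
  const R = 6371000; // mean Earth radius, meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(bLat - aLat);
  const dLng = toRad(bLng - aLng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// The content a lens would surface as someone walks down Main Street:
// all anchors within `radiusMeters` of the user's position.
function nearbyAnchors(
  anchors: GeoAnchor[],
  lat: number,
  lng: number,
  radiusMeters: number
): GeoAnchor[] {
  return anchors.filter((a) => distanceMeters(lat, lng, a.lat, a.lng) <= radiusMeters);
}

// Example: two storefront promotions; only one is within 100 meters.
const anchors: GeoAnchor[] = [
  { id: "cafe-1", lat: 40.7128, lng: -74.0060, payload: "2-for-1 espresso" },
  { id: "books-2", lat: 40.7200, lng: -74.0100, payload: "author signing at 6pm" },
];
console.log(nearbyAnchors(anchors, 40.7129, -74.0061, 100)); // -> [cafe-1]
```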
Broadening the definition of “metaverse” in these ways gets us to some practical and valuable real-life outcomes. It’s a departure from the runaway hype cycle that’s underway as the internet continues to lose its collective mind over the prospect of an online/virtual multiplayer metaverse.
In fact, a “real-world metaverse,” as characterized above, is truer to the M-word itself. The Greek root meta means “beyond,” which can describe digital dimension and meaning that go beyond objects’ physical states. Could this engender a new visually oriented local search use case for Main Street?
The answer is maybe. And as with much emerging tech, the outcome will likely take longer to arrive and look different from what we envision today. But now is the time to start talking about it and getting smart on the underlying dynamics. We’ll get the chance to do so at Localogy Place in the Fall.