Standards for Augmented Reality and the Spatial Cloud: A Long Wait?

Matt Miesnieks, former co-founder and CEO of 6D.ai, thinks it will take time for the spatial computing industry to adopt the standards being pursued by the Open AR Cloud Association and the more open ecosystem providers like Google.

“We used to joke that standards are for the losers,” said Matt. “Generally someone would win the market and all the other second, third and fourth place vendors in the market would get ticked off and upset that someone was winning and then would get together and create a standard.”

Matt, one of the leaders who has driven adoption of spatial computing, made the remarks last night during a roundtable to promote The Infinite Retina, a new book by Irena Cronin and Robert Scoble:

“I saw this play out in mobile…and what we saw in mobile was that what drove the economics of applications…was getting to that ‘click’,” said Matt. “So that ended up driving native apps…so you ended up with closed app store ecosystems.”

“I imagine it’s going to be quite a challenge for companies that are very much invested in an open ecosystem, like Google for instance, to figure out how to get the user experience that’s good enough on a cross-platform/open ecosystem for developers to make more money than they can get off the closed ecosystem.”

What Are AR Cloud Standards?

The Open AR Cloud Association is setting out to create standards for 3D spatial mapping and positioning, and to provide guidelines for privacy and security.

The association focuses on two key pillars in particular:

  • The methods used to scan and the schema used to map physical reality. Spatial mapping is a three-dimensional ‘scan’ of reality which then identifies and maps the semantic relationships within that scan. Some call this a ‘digital twin’ of physical space.
  • The methods used to position within that space. I might walk into a physical location that has a digital twin, but we need methods to determine exactly where I am in relation to the space and the spatial anchors within it.
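To make those two pillars concrete, here is a minimal sketch in Python of a digital twin holding spatial anchors, plus a toy positioning query. Every name here is invented for illustration (none of it comes from an actual Open AR Cloud schema), and real positioning works from camera imagery and visual features, not a coordinate handed to the device; the nearest-anchor lookup below just stands in for that much harder problem.

```python
from dataclasses import dataclass, field

# Hypothetical sketch -- names and structure are illustrative only,
# not part of any published AR Cloud standard.

@dataclass
class SpatialAnchor:
    """A named reference point inside a digital twin."""
    anchor_id: str
    position: tuple  # (x, y, z) in the twin's local frame, meters
    label: str       # semantic label, e.g. "cash_register"

@dataclass
class DigitalTwin:
    """A 3D 'scan' of a physical space plus its semantic anchors."""
    space_id: str
    anchors: dict = field(default_factory=dict)

    def add_anchor(self, anchor: SpatialAnchor) -> None:
        self.anchors[anchor.anchor_id] = anchor

    def localize(self, device_position: tuple) -> str:
        """Return the nearest anchor's ID -- a toy stand-in for real
        visual positioning against the scan."""
        def dist_sq(a: SpatialAnchor) -> float:
            return sum((p - q) ** 2
                       for p, q in zip(a.position, device_position))
        return min(self.anchors.values(), key=dist_sq).anchor_id

# Usage: map a space, then 'position' a device within it.
twin = DigitalTwin("demo_store")
twin.add_anchor(SpatialAnchor("register", (0.0, 0.0, 0.0), "cash_register"))
twin.add_anchor(SpatialAnchor("door", (10.0, 0.0, 0.0), "entrance"))
print(twin.localize((9.0, 0.5, 0.0)))  # -> "door"
```

The point of the sketch is the division of labor: one standard would govern how the twin and its semantic anchors are described, and a second would govern how a device resolves its own pose relative to them.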

In Theory, We Need Standards to Manage Transitions

Creating standards seems important.

Imagine that the standards for the outdoor map your self-driving car uses are different from the standards for the parking lot at your office, which are different again from those used to map the inside of your building.

You want to move seamlessly from space to space without your AR glasses losing track of where you are. Otherwise the wayfinding overlay you’re relying on could easily lose its bearings at every transition.

But Matt’s point is that there’s no motivation to create open and shared standards. If you own a pair of Facebook AR glasses, it will be Facebook that determines the standards used to see across these different spaces.

There’s too much benefit, at least in the short term, in keeping proprietary the methods for creating and positioning within digital twins.

Apple will want ITS glasses to work beautifully. And there will be competitive advantages in keeping the code and methods for the “Apple Way” as closed as possible.

Everyone will be fighting for a commercial edge.

Physical Locations Could Accelerate Standards

But there’s another ‘player’ in augmented reality: the physical world. And because of it, the ‘rules’ that apply to mobile or software development may not hold.

Physical locations want a commercial edge also. They could end up being a key driver and have an important voice.

We often talk about rights and ownership in AR as being related to the DIGITAL rights. Software developers care about software. They want to see their 3D dinosaur (enough with the dinosaurs!) roaming your local mall.

And in that rush towards wonder, it’s easy for them to leave out the rights of places and things in the physical world.

The owner of your local mall might feel quite the opposite. Which is one reason Pokemon Go was careful, for example, to avoid battles in someone’s backyard.

The owners of physical, real-world “property” could provide a counter-pressure on the AR industry that leads more quickly to the adoption of standards.

Starbucks and the AR Cloud

Let’s take the example of Starbucks.

9to5Mac reports that Apple may be getting ready to launch an AR experience which will include Starbucks as a partner:

“Apple is developing a new app as part of its work on iOS 14. The new app, codenamed Gobi, will allow users to get more information about the world around them by using an augmented reality experience on the phone. The AR experience would also be part of Apple’s forthcoming AR headset project.

Based on 9to5Mac code findings, Apple appears to be testing integrations with Apple Stores and Starbucks.”

Now, there’s nothing about this integration that necessarily suggests Starbucks will be using a spatial cloud. But let’s assume that they are. It would mean that:

  • In order to provide a smooth user experience, Starbucks will spatially scan and map its locations
  • These maps would include spatial anchors
  • These anchors would allow AR content to appear in a precise location. For example, food information floating above the food display, or your Starbucks rewards information floating above the cash register

In this scenario, Starbucks would be providing highly detailed and rich information about its stores, probably with an understanding by Apple that this information will only be used within the confines of, say, Apple Maps.

But what happens when Starbucks goes to do the same thing with Google? Or Facebook? Or decides to do a new in-store experience with Niantic, the makers of Pokemon Go (and who recently acquired Matt’s company)?

As the number of players creating their own ‘AR Clouds’ grows, physical locations will have two growing interests:

  • Preventing duplication, where they would otherwise need to adapt the scans of their spaces to different platforms
  • A desire to protect their rights at their own locations. It’s one thing for Starbucks to partner with Apple. It’s another thing for Facebook to scan Starbucks stores via crowd-sourced data from Facebook glasses and to use that data without Starbucks’ permission

Sharing Rights In Exchange for Standards

I believe we’re headed for a clash. The AR industry seems strangely silent on the ‘rights’ of the physical world. If it can be digitized it will be, and who are you to prevent me from morphing your retail store into a scene from an AR version of Fortnite?

But property owners, I believe, will have other ideas. The owners of the billboards in Times Square will come to realize that even though a million people walk through the space, AR glasses will eventually mean that many of them aren’t even SEEING their billboards: they’ve been replaced by digital overlays.

Soon, you’ll be seeing guys with weird helmets walking around your local mall. They’re like Google Street View cars operating at a human, indoor scale. And they were recently launched by ReScan.

ReScan allows large physical locations to create spatial scans of their spaces in record time. Image: ReScan

Your local mall owner will do these scans because they want to use the ‘streams’ to help them manage their locations.

But it won’t be long before Apple, Google, Facebook and Niantic are knocking at their door, pitching the mall owner on using those scans to create different experiences on the different AR platforms: from Apple and Facebook glasses, to Google Maps and an indoor game by Niantic.

The owners of physical spaces (and, in time, the spatial maps they’ve created) have some power in the equation. That power could result in a pressure to adopt industry standards.

A Registry for Spatial Maps (And Why Money Matters)

So Matt might be right: it usually takes one player to dominate an industry segment before the smaller players band together to adopt standards. But physical space itself might be that dominant player.

Or it could be, with the right incentive structures in place.

Imagine that Starbucks can post versions of its in-store scans to a spatial registry. It might not post a high-resolution version for competitive reasons, but it could post a few key spatial anchors.

Along with this scan, it can tag a few things:

  • Security and safety anchors, indicating what things in its space shouldn’t be obscured or digitally ‘edited’ for safety or security reasons
  • “Do not disturb” anchors, which Starbucks wants to reserve for its own content (for example, the space around the cash register)
  • “Available for rent” anchors, which Starbucks is willing to allow use of for commercial purposes (and payment)
  • And “open anchors” which can be used on demand

I proposed that these types of tags could act as a sort of Creative Commons for physical objects.
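As a sketch of how such a “Creative Commons for physical objects” might look in machine-readable form, the tags above could become policies attached to each published anchor. Every tag name and field below is hypothetical, invented for illustration; no registry with this schema exists.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical permission tags for spatial anchors -- illustrative only.
class AnchorPolicy(Enum):
    PROTECTED = "protected"            # safety/security: must not be obscured or edited
    DO_NOT_DISTURB = "do_not_disturb"  # reserved for the owner's own content
    FOR_RENT = "for_rent"              # commercial use allowed, with payment
    OPEN = "open"                      # usable on demand

@dataclass
class RegistryAnchor:
    anchor_id: str
    policy: AnchorPolicy
    rate_per_day: float = 0.0          # only meaningful for FOR_RENT anchors

def may_place_content(anchor: RegistryAnchor, paying: bool) -> bool:
    """May a third-party platform attach AR content to this anchor?"""
    if anchor.policy is AnchorPolicy.OPEN:
        return True
    if anchor.policy is AnchorPolicy.FOR_RENT:
        return paying
    return False  # PROTECTED and DO_NOT_DISTURB are off-limits

# A store might publish low-resolution anchors with policies attached:
register = RegistryAnchor("cash_register", AnchorPolicy.DO_NOT_DISTURB)
window = RegistryAnchor("front_window", AnchorPolicy.FOR_RENT, rate_per_day=25.0)
print(may_place_content(register, paying=True))  # False
print(may_place_content(window, paying=True))    # True
```

Like a Creative Commons license, the value isn’t in the enforcement logic (which is trivial) but in having one vocabulary of permissions that every platform reads the same way.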

Now you have two commercial interests who might be able to work for their mutual benefit: the AR ecosystems, who want to access as many high-resolution scans as possible. And the physical spaces, who want to protect their rights and get a slice of the commercial upside.

And as part of this shared registry, shared standards can emerge. Starbucks can say: “Hey, I’m not going to upload 10 different semantic mappings of my space. I am participating in this registry on the understanding that it adheres to certain standards. You might want to adopt those standards on your own platform, but that’s your business, not mine.”

A Fork In The Road

There’s a business here. Someone could launch a shared registry and monetize being the ‘middleman’ between brands, properties and AR ecosystems. Throw in the word blockchain and you’re off and running.

Or the Open AR Cloud Association could take the lead.

But if we don’t move towards systems that give an economic voice to physical spaces and ‘object owners’, then two things could happen:

  • The friction created by requiring physical spaces to produce different outputs for different platforms will leave out a lot of smaller players on both the physical and digital side of the AR equation. Smaller malls or city governments, for example, might place only one or two bets (on Google and Apple, say) and leave out a lot of the smaller, more innovative players.
  • Or, scans will end up being created of physical spaces without the owner’s permission or participation, leading to eventual clashes between the ‘old world’ and the new world of extended reality

Matt is right. Once we start to see a dominant player emerge there will be a rush of collaboration and innovation to try to bypass the closed ecosystems.

And yet that rush could happen sooner if the physical world stakeholders wake up to the value of having a voice (and economic incentive) in a world in which everything has become digital.