Adobe ‘Sneak’ Showcases Collaborative AR Demo At Adobe MAX 2020


An exclusive look at “Project AR Together,” Adobe’s experiment with real-time collaboration in augmented reality.

Yesterday, Adobe announced new features coming to its Aero AR platform during the company’s largest annual conference, Adobe MAX, including deeper animation and interaction functionality, as well as a public beta of its desktop app. As part of today’s “Sneaks” program at MAX, Adobe has teased an early-stage vision of what will be a crucial component for the development of augmented reality in both creative and practical arenas: real-time collaboration.

Every year, the “Sneaks” program at MAX offers a glimpse of early-stage features and innovations that Adobe’s researchers and engineers are working on. While any of these Sneaks might end up in a future product, these more experimental offerings are designed to share the company’s vision, concepts, and research.


The Sneak, called “Project AR Together,” was produced by Haoliang Wang, a research scientist at Adobe. While Wang’s “typical” responsibilities involve making Adobe’s Creative Cloud more robust and efficient, “AR Together” began as an intern’s project and ultimately evolved into a sort of “side gig” for Wang. He saw incredible potential in leveraging his experience optimizing cloud functionality to enhance AR experiences in the wake of Covid-19 shelter-in-place orders worldwide. Collaboration in AR is of course important for making the medium useful to the general public in any context, but the need for it had suddenly ballooned.

“Given the time we are in, there’s really nowhere we can go, but the current AR experience is a single-person experience—if you want to collaborate with others or make it a social experience, it’s not very easy to do,” Wang said in an interview with the author. “One of the reasons that the multiuser AR experience is challenging is because of the inconsistent tracking of the physical space across multiple devices.”

In the video, Wang and his wife experiment with home decoration options. Those familiar with augmented reality will immediately note the precision of the tracking across both phones as the two play with options including a vase and coffee mug on the table, as well as a sticky note on the wall.

Accessing the same AR experience across different devices will be key to making AR a truly social, interactive medium. The implications for gaming are obvious, but ultra-precise tracking shared across devices will also make AR feel like a more intuitive option for commonplace activities like home decorating.

“For the scenario we showed in the demo video, previously we had to download an app, and if we wanted to decorate the empty apartment [with AR], we had to pass the phone back and forth ... so it’s a bit of an awkward experience,” Wang said. “I thought, ‘Everyone has a phone. Why couldn’t we share the AR experience?’ And then we discovered the challenges and started to build on top of that.”

The challenge, simply put, is helping multiple devices understand the exact position and orientation of a digital object in the real world in real time. Even slight discrepancies can mean, for instance, that one viewer sees a virtual lamp perched on the bedside table where it belongs, and another sees it floating uncannily above the table. And this challenge gets even tougher when you add in real-time interaction for all users.
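To make the problem concrete, here is a minimal sketch of the standard shared-anchor approach: both phones track the same physical landmark, and an object’s pose is translated from one phone’s coordinate frame into the other’s through it. This illustrates the general technique, not Adobe’s implementation; all names are hypothetical.

```python
import numpy as np

def pose_to_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def object_pose_in_b(object_in_a: np.ndarray,
                     anchor_in_a: np.ndarray,
                     anchor_in_b: np.ndarray) -> np.ndarray:
    """Re-express an object's pose from phone A's world frame in phone B's.

    Both phones track the same physical anchor (say, a table corner).
    b_from_a maps coordinates in A's frame to B's frame via that anchor.
    """
    b_from_a = anchor_in_b @ np.linalg.inv(anchor_in_a)
    return b_from_a @ object_in_a
```

If the two phones disagree on where the anchor sits by even a few centimeters, b_from_a inherits that error and the object appears offset on one device: the floating-lamp effect described above.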

Wang addressed this problem with a machine-learning-based sensor fusion method, in which data from the smartphone’s different sensors are correlated to produce predictions about the surrounding environment.

“Basically we have multiple sensors on the phone, and multiple measurements from those sensors,” Wang said. “Combining them together makes a more accurate prediction over time. It’s like what self-driving cars use, LiDAR, cameras, ultrasonic sensors—they use them to get a full sense of the environment. It’s the same idea here.”
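Adobe hasn’t published the details of Wang’s method, but the underlying principle is the classic fusion update. As a minimal, hypothetical sketch in one dimension (a stand-in for the idea, not Wang’s machine-learning model): blend a prior estimate with a new measurement, weighting each by how noisy it is.

```python
def fuse(estimate: float, est_var: float,
         measurement: float, meas_var: float) -> tuple[float, float]:
    """One Kalman-style update: blend a prior estimate with a new measurement,
    weighting each by the inverse of its variance, so noisier inputs count less."""
    gain = est_var / (est_var + meas_var)                 # Kalman gain
    fused = estimate + gain * (measurement - estimate)
    fused_var = (1.0 - gain) * est_var                    # fused estimate is more certain
    return fused, fused_var

# Illustrative numbers only: fold a UWB range into a camera-based distance estimate.
distance, var = 2.10, 0.05 ** 2                           # visual-inertial estimate, meters
distance, var = fuse(distance, var, 2.04, 0.03 ** 2)      # UWB measurement
print(f"fused distance: {distance:.3f} m")
```

Each new measurement nudges the estimate toward whichever sensor is more trustworthy, and the combined estimate grows more confident over time, which is exactly the “more accurate prediction over time” Wang describes.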

He explained that “AR Together” relies on information from ultra-wideband (UWB) technology, which is currently available only in the iPhone 11 series (and up) and the Samsung Galaxy Note 20 Ultra. So while the technology works in this demo, limited hardware support is one of the inherent constraints early glimpses like this face on the road to mainstream adoption.
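UWB chips measure the range between phones, and on some hardware a direction as well. Purely as a hypothetical illustration (again, not Adobe’s code, and the names are my own), a single range-and-direction reading could place a peer device in a shared frame like this:

```python
import numpy as np

def peer_position_world(distance_m: float, direction_local: np.ndarray,
                        device_pose_world: np.ndarray) -> np.ndarray:
    """Place a peer phone in the shared world frame from one UWB reading.

    distance_m:        measured range to the peer, in meters
    direction_local:   unit vector toward the peer, in this phone's local frame
    device_pose_world: this phone's 4x4 pose in the world frame
    """
    offset = distance_m * (direction_local / np.linalg.norm(direction_local))
    # Rotate the local offset into the world frame, then add the phone's position.
    return device_pose_world[:3, :3] @ offset + device_pose_world[:3, 3]
```

A direct device-to-device range like this gives the fusion step a measurement that doesn’t depend on each phone’s camera-based tracking, which is what makes it so useful for keeping two devices’ maps aligned.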

“This is still very early-stage and experimental, it’s still a bit early to say whether we’ll roll it out or not,” Wang said.

Still, this Sneak is a powerful, straightforward vision for how new technologies like AR will enter the average person’s life, particularly as 5G enables dramatic increases in bandwidth and download speeds.

While sensor fusion might not be the first thing that comes to mind when we imagine the cutting edge of augmented reality, it’s a piece of the puzzle that grants a level of immersion that will be requisite for adoption across a variety of industry verticals. Whether for shared creative experiences, device tutorials, or home decorating, multiuser collaboration is a capability AR will need to offer to be truly useful; it’s a tipping point likely to open the door to unprecedented use cases, especially user-generated content. “Project AR Together” is an early signal that this functionality could be here sooner than we think.
