This week’s AR news has largely focused on Apple’s announcements, many of which lean on the depth-scanning hardware found only on the recent iPad Pro. Google announced its own AR news this week, too, and you won’t need specialized hardware to use its depth-sensing tools.
A Depth API update to ARCore, announced and available today, lets apps build 3D meshes of environments and use them to place virtual objects more realistically. Virtual objects can even appear to hide behind real ones through a technique called occlusion.
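Occlusion comes down to a per-pixel depth comparison: a virtual pixel is drawn only if the virtual object is closer to the camera than the real surface at that pixel. ARCore does this on the GPU during rendering; the toy Python sketch below (with made-up depth values) just illustrates the test itself:

```python
def composite_with_occlusion(virtual_depth, real_depth, virtual_color, background):
    """Per-pixel occlusion: draw the virtual pixel only where the virtual
    object is closer to the camera (smaller depth) than the real surface."""
    rows, cols = len(real_depth), len(real_depth[0])
    out = [[background[r][c] for c in range(cols)] for r in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if virtual_depth[r][c] < real_depth[r][c]:
                out[r][c] = virtual_color  # virtual object in front: visible
            # else: the real object occludes it; keep the camera pixel
    return out

# A 2x2 toy frame: the real scene is 2.0 m away, except a close real
# object at 1.0 m in the top-left pixel. A virtual object sits at 1.5 m.
real = [[1.0, 2.0], [2.0, 2.0]]
virt = [[1.5, 1.5], [1.5, 1.5]]
cam  = [["R", "R"], ["R", "R"]]   # "R" = real camera pixel
frame = composite_with_occlusion(virt, real, "V", cam)
# Top-left: the real object (1.0 m) is in front of the virtual one
# (1.5 m), so the virtual pixel is hidden there and drawn elsewhere.
```

This is why the virtual hot dog can appear to dance behind a real chair: wherever the depth map says the chair is closer, the hot dog’s pixels simply aren’t drawn.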
Google first announced this feature last December, but it’s only becoming available now. Google has already named a number of apps that will use the depth functions in AR, including an update to a key Samsung app: the Quick Measure AR app will gain extra depth-driven measurement improvements “in the coming months,” according to Google. A demo reel of what the depth-sensing AR can do is embedded below; a lot of it looks impressive.
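Depth-based measurement apps like Quick Measure generally work by back-projecting two tapped pixels into 3D using the depth map and the camera’s intrinsics, then taking the straight-line distance between them. A rough sketch under a simple pinhole-camera assumption (the intrinsic values here are invented for illustration, not from any real device):

```python
import math

def unproject(u, v, depth_m, fx, fy, cx, cy):
    """Pinhole back-projection: pixel (u, v) at depth_m meters
    -> 3D point (x, y, z) in camera coordinates."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

def measure(p1, p2):
    """Straight-line distance between two 3D points, in meters."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

# Hypothetical intrinsics for a 640x480 depth image.
fx = fy = 500.0
cx, cy = 320.0, 240.0

# Two taps on a surface 1.0 m away, 100 pixels apart horizontally.
a = unproject(270, 240, 1.0, fx, fy, cx, cy)   # (-0.1, 0.0, 1.0)
b = unproject(370, 240, 1.0, fx, fy, cx, cy)   # ( 0.1, 0.0, 1.0)
span = measure(a, b)   # 100 px at 1.0 m with fx = 500 spans 0.2 m
```

Better depth maps mean better 3D points, which is presumably why Google expects the Depth API to improve measurement accuracy.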
A few games and apps already use some of these features, including Five Nights at Freddy’s AR: Special Delivery. Snapchat has integrated them into two AR lenses: Dancing Hotdog (which is a dancing hot dog) and Undersea World, which turns your space into an aquarium full of fish. (Snapchat is rolling out depth support to other Lens developers on Android with this update, so you can expect more depth-aware effects from lenses in the future.)
Many companies are exploring tools for camera-based world scanning, including Pokémon Go creator Niantic, which made an acquisition in the space earlier this year.
The ARCore update also comes with Google’s ARCore Depth Lab app, which lets you try some of these depth experiments yourself.
It’s unclear how precise Google’s Depth API is compared with other tools. Dedicated depth hardware, like Google’s now-discontinued Tango line, can create a physical depth map with tangible measurements. But the future of computer vision will undoubtedly move toward more world scanning with less specialized hardware. Android devices with time-of-flight sensors, which also measure depth directly, will get faster and better depth results, according to Google.
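Time-of-flight sensors measure depth directly: the sensor emits light and times its round trip, so distance is half the round-trip time multiplied by the speed of light. A back-of-the-envelope sketch of that pulsed-ToF model (real phone sensors typically measure the phase shift of modulated light rather than raw pulse timing):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds):
    """Pulsed time-of-flight: light travels to the surface and back,
    so the one-way distance is c * t / 2."""
    return C * round_trip_seconds / 2.0

# A surface 1.5 m away returns the pulse in about 10 nanoseconds.
t = 2 * 1.5 / C            # round-trip time for a 1.5 m target
d = tof_distance(t)        # recovers 1.5 m
```

The nanosecond timescales involved hint at why a hardware sensor can outpace a purely software depth estimate, and why Google says devices with ToF hardware will get faster, better results.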