ARCore Depth API: How it will fundamentally transform your AR experiences
August 01, 2025
Today, Google is taking the wraps off its new ARCore Depth API. At first glance, this sounds highly technical and uninteresting. However, when you understand what it does, you’ll see how this will fundamentally change your augmented reality experiences.
You’ll also see how it will open up tons of new possibilities for AR in the worlds of productivity, shopping, and even gaming.

So what is the ARCore Depth API? Here’s Google’s official explanation:
Confused? It’s way easier to explain what it is by showing you what it does. Check out the GIFs below: on the left, you have an AR experience without the Depth API and, on the right, that same experience with it.

The ARCore Depth API allows the AR program to understand that the fluffy pillows in the room above are closer to you than the placement of the AR cat. Previously, ARCore wasn’t very good at determining this and would place the cat right on top of the pillows, creating a wholly unrealistic scene. With Depth API active, though, the cat’s body is behind the pillows and only the parts you would see in a real-world situation are visible.
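Under the hood, occlusion like this boils down to a per-pixel depth comparison: if the real-world surface at a given pixel is closer to the camera than the virtual object, that pixel of the virtual object gets hidden. Here’s a minimal Kotlin sketch of the idea; the function and the sample distances are purely illustrative and not part of the ARCore API (ARCore actually does this comparison on the GPU while rendering).

```kotlin
// Illustrative only: the core of depth-based occlusion is a simple depth test
// between the virtual object and the real-world depth map at each pixel.
// Nothing here is an actual ARCore API call, and the distances are made up.

/**
 * Returns true if the virtual object should be drawn at this pixel, i.e. it sits
 * in front of whatever real-world surface the depth map sees there.
 */
fun isVirtualPixelVisible(virtualDepthMm: Int, realDepthMm: Int): Boolean =
    virtualDepthMm < realDepthMm

fun main() {
    val catDepthMm = 2_300     // virtual cat placed about 2.3 m from the camera
    val pillowDepthMm = 1_100  // real pillow about 1.1 m away, between you and the cat

    // The pillow is closer, so this pixel of the cat is occluded and not drawn.
    println(isVirtualPixelVisible(catDepthMm, pillowDepthMm)) // prints "false"
}
```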
Google explains how this works in its blog post announcing the Depth API. It’s pretty technical, and you can read the post to learn all about it, but the image below gives you a solid idea. The Depth API uses your camera movements to determine which objects in your view are closer or farther away, and then creates a depth map:

In the GIF, once the depth map is created, objects that are closer to you appear in red while objects that are far away appear in blue.
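For developers who want to read that depth map directly, the ARCore SDK exposes it as a per-frame image of 16-bit distance values in millimeters via Frame.acquireDepthImage(). The Kotlin sketch below is a rough outline of how you might sample the distance at a single pixel; it assumes you already have a Session with depth enabled and a current Frame, and it skips most of the error handling a real app would need.

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Rough sketch: sample the real-world distance (in millimeters) at pixel (x, y)
// of the latest depth image. Returns null if ARCore hasn't produced a depth map
// yet -- depth from motion needs a little camera movement first.
fun depthAtPixelMm(frame: Frame, x: Int, y: Int): Int? {
    return try {
        frame.acquireDepthImage().use { depthImage ->
            val plane = depthImage.planes[0]
            // Each depth sample is a 16-bit value; rowStride is in bytes, so divide by 2.
            val buffer = plane.buffer.order(ByteOrder.nativeOrder()).asShortBuffer()
            val index = y * (plane.rowStride / 2) + x
            buffer.get(index).toInt() and 0xFFFF
        }
    } catch (e: NotYetAvailableException) {
        // The depth map isn't ready for this frame yet.
        null
    }
}
```

Values like these are what feed the occlusion test sketched earlier, and coloring each pixel by its distance is how you end up with a near-is-red, far-is-blue visualization like the one in the GIF.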
With the ARCore Depth API, AR apps will be much more realistic. AR-powered shopping apps, for example, let you place virtual household items in your living room or on your counter to get a sense of what they’d look like. This new feature will make those experiences even more realistic, giving you more confidence in your purchase.

For gaming, a better sense of depth will allow you to do things such as hiding behind obstacles, accurately aiming projectiles, and being surprised when characters pop out from behind real-world structures. In the GIF at the top of this article, you can see an example of how this could work.
Related: Ten best augmented reality apps and AR apps for Android
The Depth API doesn’t depend on special cameras or sensors, so it should work on pretty much any device that supports ARCore. However, devices with specialized cameras and time-of-flight (ToF) sensors will likely get a better, more accurate experience.
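On the developer side, checking for that support is a one-line call in the ARCore SDK. As a minimal sketch (assuming you already have an ARCore Session), an app might enable depth only where it’s available and fall back gracefully elsewhere:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Minimal sketch: turn on depth where the device supports it, leave it off otherwise.
fun configureDepthIfAvailable(session: Session) {
    val config = session.config
    config.depthMode = if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        // AUTOMATIC generates depth from motion on supported devices; if a ToF sensor
        // is present, ARCore uses it to improve the depth map.
        Config.DepthMode.AUTOMATIC
    } else {
        Config.DepthMode.DISABLED
    }
    session.configure(config)
}
```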
Google is hoping that developers will be excited to try out this new feature and integrate it into their AR-powered applications. It shouldn’t be too long before you start seeing better depth experiences in your current AR apps.