Niantic Spatial SDK Brings Immersive Reality to Meta Quest 3 with the Passthrough Camera API

What We’re Announcing

Starting today, Niantic Spatial SDK v3.15 includes beta support for Meta Quest 3, bringing Visual Positioning System (VPS) and other core spatial computing capabilities to fully immersive mixed reality headsets.

This update rounds out our support for headworn devices, giving developers the flexibility to build real-time, immersive experiences across form factors. With access to Quest 3's front-facing RGB cameras via the Passthrough Camera API, tools like live meshing, semantic segmentation, and object detection are now available through the same SDK used for mobile AR.

Whether you're prototyping AR training, deploying indoor navigation, or building simulation tools, this update opens new opportunities to build enterprise, public sector, and consumer applications that understand and respond to the physical world in real time.

Why We Prioritized This

As spatial computing continues to gain traction, developers are looking for robust tools that work seamlessly across the hardware their users already own.

Adding support for Quest 3 extends the Spatial SDK’s reach into MR, and makes it even easier to build once and deploy across devices.

“We’ve consistently heard from customers that seamless support for headworn devices is essential to scaling spatial computing,” said Baljeet Singh, VP of Product at Niantic Spatial. “Our collaboration with Meta gives developers access to rich perception capabilities—like meshing, semantics, and VPS—on a powerful, readily available headset.”

Ken Wolfe, Director of Engineering, emphasized the broader platform impact: “Bringing Spatial SDK support to Meta Quest 3 showcases the versatility of our platform and our commitment to meeting developers and customers where they are. It’s another step in delivering best-in-class mapping and AR capabilities across the devices and platforms that power real-world innovation.”

This milestone strengthens our vision of making the real world machine-readable and accessible through spatial AI, whether you're holding a phone, wearing glasses, or inside a headset.

What’s New in v3.15

Note: Each video below showcases live footage recorded through Meta Quest 3.

With this release, Quest 3 developers can access many of Niantic Spatial’s core capabilities, including:

  • Visual Positioning System (VPS): Anchors digital content with centimeter-level accuracy by recognizing real-world locations.

  • On-device 3D mapping: Generates detailed spatial maps in real time without relying on cloud connectivity.

  • In-application live meshing: Builds dynamic mesh representations of physical environments directly within your app. In the video on the left, you’ll see near-field meshing in a living room–scale environment. On the right, we showcase long-distance meshing in a large open lobby—demonstrating real-time spatial understanding across vastly different physical scales.

  • Semantic segmentation: Identifies and labels objects and surfaces (e.g., walls, floors, furniture) to enhance spatial understanding.

  • Object detection: Recognizes and tracks specific objects in the camera view to trigger interactions or overlays.

  • Improved real-time occlusions: Ensures virtual content realistically appears behind or in front of real-world objects.

These features are built into Niantic Spatial SDK’s modular architecture, allowing for lightweight, flexible deployments across a range of use cases—from remote collaboration to large-scale installations and location-based experiences. With VPS and on-device mapping, developers can create seamless, co-located experiences across phones, Meta Quest 3, and Magic Leap 2.
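At its core, a VPS anchor lets you author content in a location-fixed frame: once the device localizes, the anchor's pose is resolved in world space and your content's world transform follows by simple pose composition. The sketch below illustrates that math with plain homogeneous transforms; the poses and values are made up for illustration and this is not the SDK's actual API.

```python
import numpy as np

def pose(rotation_deg: float, translation) -> np.ndarray:
    """4x4 homogeneous transform: rotation about the Z axis, then translation."""
    t = np.radians(rotation_deg)
    m = np.eye(4)
    m[:3, :3] = [[np.cos(t), -np.sin(t), 0],
                 [np.sin(t),  np.cos(t), 0],
                 [0,          0,         1]]
    m[:3, 3] = translation
    return m

# Pose of a VPS anchor in world space, as resolved by localization
# (illustrative values only).
world_from_anchor = pose(90, [10.0, 5.0, 0.0])

# Content authored relative to the anchor: 2 m "in front" of it.
anchor_from_content = pose(0, [0.0, 2.0, 0.0])

# The content's world transform follows by composing the two poses.
world_from_content = world_from_anchor @ anchor_from_content
print(np.round(world_from_content[:3, 3], 6))  # content's world position
```

With the illustrative numbers above, the content lands at world position (8, 5, 0): the anchor's 90° rotation carries the 2 m offset sideways before the anchor's own translation is applied. Because every device resolves the same anchor pose, this composition is also what makes co-located, multi-device experiences line up.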

Supported Enterprise Use Cases

  • Indoor Navigation: Wayfinding through large, GPS-denied spaces like convention centers or factories

  • Training & Simulation: AI-assisted instructions overlaid in real time on physical environments

  • Field Collaboration: Remote users can interact and annotate the same physical space in real time

  • Site Mapping & Planning: Capture, analyze, and simulate environments using headworn devices
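The indoor navigation case above reduces, at its simplest, to shortest-path search over a navigation graph of localized waypoints. The toy sketch below uses breadth-first search over a hypothetical convention-center graph; the waypoint names and graph are invented for illustration, and a real deployment would derive waypoints from VPS localization and the spatial map.

```python
from collections import deque

def shortest_route(graph, start, goal):
    """Breadth-first search: fewest-hop route between two waypoints."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbor in graph[path[-1]]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route between the waypoints

# Hypothetical waypoint adjacency for a convention center.
nav_graph = {
    "entrance": ["lobby"],
    "lobby": ["entrance", "hall_a", "cafe"],
    "hall_a": ["lobby", "hall_b"],
    "hall_b": ["hall_a"],
    "cafe": ["lobby"],
}

print(shortest_route(nav_graph, "entrance", "hall_b"))
# → ['entrance', 'lobby', 'hall_a', 'hall_b']
```

In GPS-denied spaces, VPS supplies the "you are here" fix that seeds `start`, and the headset renders the returned route as world-locked guidance.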

How It Works (For Developers)

To get started:

  1. Set up Niantic Spatial SDK v3.15 by following the steps in our documentation under the Meta Quest 3 tab.

  2. Explore the sample projects and integration guides there to jumpstart your development and build cross-platform AR experiences.

  3. Test in real-world environments and share your feedback via support@nianticspatial.com or on our developer community forum.

What’s Coming Next

We’re continuing to expand support across additional headworn devices, improve performance, and introduce new features like enhanced occlusion and persistent scene understanding. Your feedback will help shape what comes next—let us know what’s working and where we can do better.