API Reference

NSDK


Type Aliases

NetworkRequestId = ARDK_NetworkRequestId
NSDKCameraExtrinsics = simd_float4x4
NSDKCameraIntrinsics = NSDKFrameData.CameraIntrinsics
NSDKHandle = ARDK_Handle
A type alias for the native NSDK handle used to interface with the underlying C API.
NsdkSessionDataSource = NSDKSessionDataSource
Type alias for API compatibility.
NSDKVpsAnchorId = String

Classes

AssetResult
Contains all the AssetInfo objects returned by a query to the Sites Manager service.
AwarenessImageResult
Image-based awareness result containing an NSDKImage.
AwarenessResult
Base class for awareness results such as depth and segmentation.
Provides common properties like frame ID, timestamp, camera pose, and intrinsics.
BundlePlaybackDatasetLoader
A loader that retrieves playback dataset data from the app bundle.
This is the default implementation for loading datasets from Bundle.main.
Frame images and depth data are loaded on demand when requested.
DataResourceOwner
final DefaultSessionDataSource
Default iOS implementation of NSDKSessionDataSource backed by ARSession and CLLocationManager.
DepthResult
Contains depth estimation results from the NSDK depth processing system.
## Overview
Depth results are generated by the depth processing system and include:
- Disparity maps for depth estimation. Unlike direct depth maps that provide distance values in meters, disparity maps contain pixel offset values that represent relative depth differences. These values must be converted to actual depth using camera intrinsic parameters and baseline information.
- Camera pose and orientation information
- Camera intrinsic parameters for coordinate transformations
- Error status and metadata
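The disparity-to-depth conversion mentioned above can be sketched with the standard stereo relation. This is a minimal illustration assuming a pinhole camera model, where fx is the focal length in pixels and baseline is in meters; the actual NSDK conversion path may differ.

```swift
// Convert a disparity value (pixel offset) to metric depth using the
// standard stereo relation: depth = focalLength * baseline / disparity.
// fx (pixels) and baseline (meters) are illustrative parameters, not NSDK API.
func depthFromDisparity(_ disparity: Float, fx: Float, baseline: Float) -> Float? {
    guard disparity > 0 else { return nil }  // zero disparity means infinitely far
    return fx * baseline / disparity
}

// Example: fx = 500 px, baseline = 0.1 m, disparity = 25 px -> depth = 2.0 m
let depth = depthFromDisparity(25, fx: 500, baseline: 0.1)
```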
MeshData
Contains 3D mesh data for rendering and visualization.
MeshData provides access to 3D mesh geometry including vertices, indices, normals, and texture coordinates.
## Overview
MeshData includes:
- Vertices: 3D position data for mesh geometry
- Indices: The triangles that make up the mesh
- Normals: Surface normal vectors for the vertices, commonly used for lighting calculations (only available for live meshing)
- UVs: Texture coordinates for the vertices, used for mapping textures to the mesh (only available for the mesh downloader)
## Example Usage
```swift
let (status, meshData) = meshDownloader.downloadMesh(meshId: meshId)
if status.isOk() {
    print("Mesh vertices: \(meshData.vertices.count)")
    print("Mesh triangles: \(meshData.indices.count / 3)")
    // Convert to SceneKit geometry for rendering
    if let geometry = meshData.toSCNGeometry() {
        let node = SCNNode(geometry: geometry)
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```
## Memory Management
Mesh data is backed by native memory that is automatically managed.
The data remains valid as long as the MeshData instance exists.
final MeshDownloaderResults
Contains the downloaded mesh geometry data for a VPS location.
This object holds an array of mesh results, where each result includes mesh geometry, texture data (if requested), and the transform matrix that positions the mesh in world space.
final NSDKCamera
Single camera API for both modes. Holds either an ARCamera (live) or a PlaybackCamera (playback) and exposes transform, viewMatrix, projectionMatrix, viewportRect, etc.
Relation to Apple AR: Wraps Apple's ARCamera in live mode; wraps our PlaybackCamera in playback. Callers use NSDKCamera and don't branch. Create via NSDKCamera(arCamera:) or NSDKCamera(playbackCamera:).
final NSDKDepthSession
Depth feature session for NSDK with Combine publisher support.
Upon starting the depth session, NSDK will begin processing AR frames to generate depth data. The latest depth data can be retrieved using latestDepth(), and latestImageParams() provides information to synchronize the depth image with the camera frame.
$result is refreshed automatically each frame by NSDKSession.update() while the session is active. Subscribe to it with Combine:
```swift
depthSession.$result
    .compactMap { if case .success(let result) = $0 { return result } else { return nil } }
    .sink { result in ... }
    .store(in: &cancellables)
```
final NSDKDeviceMappingSession
A session for creating VPS maps from AR data on the local device, with Combine publisher support.
The device mapping feature provides capabilities for locally building persistent maps that can be used for Visual Positioning System (VPS) localization. These maps capture the visual features and spatial structure of an environment.
$latestMapUpdate is refreshed automatically each frame by NSDKSession.update() while the session is active. Subscribe to it with Combine:
```swift
mappingSession.$latestMapUpdate
    .compactMap { $0 }
    .sink { mapUpdate in ... }
    .store(in: &cancellables)
```
final NSDKMapStorage
A storage system for managing device-generated maps.
The map storage feature provides capabilities for capturing, storing, and managing map data from AR sessions. This data can be persisted and used for Visual Positioning System (VPS) localization and map updates.
final NSDKMeshDownloader
A session-scoped utility for downloading mesh geometry associated with VPS locations.
final NSDKMeshingSession
A session for real-time 3D mesh generation from AR camera frames, with Combine publisher support.
The meshing feature provides capabilities for processing AR session data and generating a triangle mesh representation of the physical environment in real time. The mesh is divided into chunks that are individually tracked as they are inserted, updated, or removed.
meshUpdates emits on each frame in which a non-empty batch of chunk changes is available, driven by NSDKSession.update() while the session is active. Subscribe to it with Combine:
```swift
meshingSession.meshUpdates
    .sink { updates in ... }
    .store(in: &cancellables)
```
final NSDKRecordingExporter
A session for exporting scan recordings to various formats.
NSDKRecordingExporter provides capabilities for converting saved scan data into the recorderV2 format for use in Unity Playback or for activating VPS.
final NSDKScanningSession
A session for 3D scanning and visualization with Combine publisher support.
The scanning feature provides capabilities for capturing, processing, and exporting 3D scan data from AR sessions. Scans of a location can be processed by the Visual Positioning System's (VPS's) cloud services to enable VPS localization.
## Overview
Use $latestRaycastBuffer and $latestVoxelBuffer to reactively update your UI or visualization in response to new scan data, rather than polling each frame manually.
## Usage Pattern
```swift
let scanSession = nsdkSession.acquireScanningSession()
let config = NSDKScanningSession.Configuration(
    enableRaycastVisualization: true,
    enableVoxelVisualization: true
)
try scanSession.configure(with: config)
scanSession.start()

// Subscribe to raycast buffer updates
scanSession.$latestRaycastBuffer
    .compactMap { $0 }
    .sink { buffer in
        // Update raycast visualization
    }
    .store(in: &cancellables)

// Trigger voxel computation and subscribe to results
scanSession.computeVoxels()
scanSession.$latestVoxelBuffer
    .compactMap { $0 }
    .sink { buffer in
        // Update voxel visualization
    }
    .store(in: &cancellables)
```
$latestRaycastBuffer is updated automatically each frame by NSDKSession.update() while the session is active and raycast visualization is enabled. $latestVoxelBuffer is updated each frame when new voxel data is available after computeVoxels() has been called.
final NSDKSceneSegmentationSession
A session for semantic segmentation and environmental understanding with Combine publisher support.
NSDKSceneSegmentationSession provides capabilities for understanding the semantic structure of the environment by classifying pixels into different object categories. This enables applications to make intelligent decisions based on environmental context.
## Overview
Scene segmentation features include:
- Real-time semantic segmentation of camera images
- Multiple semantic categories (sky, ground, buildings, people, etc.)
- Confidence maps for semantic classifications
- Packed channel data for efficient processing
- Suppression masks for filtering unwanted areas
## Usage Pattern
```swift
// Acquire and configure the scene segmentation session
let sceneSegmentationSession = nsdkSession.acquireSceneSegmentationSession()
let config = NSDKSceneSegmentationSession.Configuration()
try sceneSegmentationSession.configure(with: config)
sceneSegmentationSession.start()

// Set the channel to observe and subscribe to results
sceneSegmentationSession.confidenceChannel = .person
sceneSegmentationSession.$confidenceResult
    .compactMap { if case .success(let result) = $0 { return result } else { return nil } }
    .sink { result in
        // Process semantic confidence data
    }
    .store(in: &cancellables)
```
$confidenceResult, $packedChannels, $suppressionMask, and $imageParams are all refreshed automatically each frame by NSDKSession.update() while the session is active.
final NSDKSession
The main entry point for the NSDK (Native SDK) framework.
NSDKSession provides the core functionality for AR applications, managing the lifecycle of NSDK features and serving as a factory for specialized sessions like VPS2, scanning, and mapping. This class handles frame data processing, configuration management, and resource cleanup.
## Overview
Use NSDKSession to:
- Initialize the NSDK with auth tokens or a configuration file
- Send camera frame data for processing
- Create specialized feature sessions (VPS2, Scanning, Mapping)
- Query required input data formats
- Manage the lifecycle of NSDK resources
## Example Usage
```swift
// Initialize with tokens
let session = NSDKSession(accessToken: "access-token", refreshToken: "refresh-token")

// Create a VPS2 session for localization
let vps2Session = session.createVps2Session()

// Send frame data during the AR session
let status = session.sendFrame(frameData)
```
final NSDKSitesSession
@MainActor NSDKView
Single view for both live and playback. Subclasses ARView; holds either an ARSession (live) or a PlaybackSession (playback) and exposes sessionMode, getCamera(), setDelegate(), setup().
Relation to Apple AR: Uses ARView; assigns either Apple's ARSession or our PlaybackSession to self.session so the view always has a "session." The app talks to NSDKView so it doesn't have to branch on live vs. playback for view, projection, viewport, or delegate callbacks.
final NSDKVps2Session
A session for VPS2 (Visual Positioning System) localization with Combine publisher support.
NSDKVps2Session provides capabilities for localizing the device in the real world using VPS maps, universal localization, and anchor tracking. Anchors can be created at specific poses and tracked across sessions using payloads.
## Overview
VPS2 features include:
- Real-time localization updates for converting between AR space and geolocation
- Anchor tracking with per-anchor pose updates
- Payload-based anchor persistence across sessions
- Localization request diagnostics
## Usage Pattern
```swift
// Acquire and configure the VPS2 session
let vps2Session = nsdkSession.acquireVps2Session()
let config = NSDKVps2Session.Configuration()
try vps2Session.configure(with: config)

// Subscribe to localization updates
vps2Session.$latestLocalization
    .compactMap { $0 }
    .sink { localization in
        // Use localization.trackingState to check localization quality
    }
    .store(in: &cancellables)

// Subscribe to anchor updates
vps2Session.anchorUpdated
    .sink { id, update in
        // Handle a per-anchor pose update
    }
    .store(in: &cancellables)

vps2Session.start()

// Track an anchor by payload
let anchorId = try vps2Session.trackAnchor(payload: base64Payload)
```
$latestLocalization is updated automatically each frame by NSDKSession.update() while the session is active. anchorUpdated, createdAnchorPayload, and localizationRequestRecords are fired via PassthroughSubject during update() when new data is available.
OrganizationResult
Contains all the OrganizationInfo objects returned by a query to the Sites Manager service.
@MainActor PlaybackBackgroundRenderer
final PlaybackCamera
Camera representation built from frame metadata (pose4x4, intrinsics, resolution). Exposes the same concepts as ARCamera: transform, viewMatrix, projectionMatrix, viewportRect, displayOrientedTransform.
Relation to Apple AR: Stands in for ARCamera during playback. No Apple camera; we synthesize the view/projection from recorded intrinsics and pose. Use PlaybackFrame.camera to obtain an instance.
PlaybackDataset
A dataset loaded from a capture JSON file containing frame metadata.
This class uses on-demand loading for frame images and depth data. Only the currently requested frame is loaded into memory, reducing memory pressure for large datasets.
PlaybackDatasetLoader
Base class for loading playback dataset data from various sources.
This class provides a base implementation that must be subclassed. Subclasses must override loadCaptureJSON(), loadImage(imageName:), loadDepthData(depthFileName:), and loadDepthConfidence(confidenceFileName:) to provide concrete implementations.
The loader uses on-demand loading: only the capture JSON is loaded upfront, and frame images/depth data are loaded when requested by PlaybackDataset.
- Note: This class acts as an abstract base class. Do not instantiate it directly.
@MainActor PlaybackRenderer
Renders each playback frame: (1) draws the recorded camera image as the background, (2) moves the RealityKit camera (pose + FOV) to match the playback frame. PlaybackSession holds an optional reference and calls renderFrame(_:) on the main queue for every new frame.
Relation to Apple AR: Manipulates RealityKit: sets arView.environment.background = .color(.clear), adds a PerspectiveCamera and AnchorEntity, and updates their transform and fieldOfViewInDegrees every frame from PlaybackCamera. In playback, ARView's default camera is off; we drive the virtual camera.
PlaybackSession
Drives playback. Subclasses ARSession so it can be assigned to ARView.session and the app can use the same delegate pattern. Runs a loop on a background queue, builds a PlaybackFrame per frame from the PlaybackDataset, dispatches to the main queue, and notifies the delegate and PlaybackRenderer.
Relation to Apple AR: Replaces the behavior of ARSession (no real device frames); API-compatible so ARView and delegate code don't need to know it's playback.
final PlaybackSessionDataSource
RaycastBuffer
A read-only container for the raycast buffer information generated during scanning.
SceneSegmentationResult
Contains semantic segmentation results from the NSDK scene segmentation processing system.
SceneSegmentationResult provides semantic understanding of the environment by classifying pixels in camera images into different object categories (e.g., sky, ground, buildings, people, vehicles). This enables applications to understand the scene structure and make intelligent decisions based on environmental context.
## Overview
Semantic segmentation results include:
- Confidence maps: Per-pixel confidence scores for semantic classifications
- Packed channels: Multiple semantic categories encoded in a single image
- Suppression masks: Masks indicating areas to be ignored or suppressed
- Metadata: Frame information, timestamps, and error status
## Example Usage
```swift
// Get confidence for a specific semantic channel
let (status, confidenceResult) = sceneSegmentationSession.getLatestConfidence(channelIndex: 0)
if status.isOk(), let result = confidenceResult {
    print("Confidence image size: \(result.image?.width ?? 0) x \(result.image?.height ?? 0)")
    print("Frame ID: \(result.frameId)")
    print("Timestamp: \(result.timestampMs)")

    // Process confidence data for semantic understanding
    processSemanticConfidence(result)
}

// Get packed semantic channels
let (status, packedResult) = sceneSegmentationSession.getLatestPackedChannels()
if status.isOk(), let result = packedResult {
    // Process packed semantic data
    processPackedSceneSegmentation(result)
}
```
SiteAssetsResult
Contains the SiteAssetsInfo objects returned by a location-based sites query.
SiteResult
Contains all the SiteInfo objects returned by a query to the Sites Manager service.
SitesResult
Base class for results returned by queries to the Sites Manager service.
UserResult
Contains the user information returned by a query to the Sites Manager service.
VoxelBuffer
A read-only container for the voxel buffer information generated during scanning.
final Vps2Heading
A CLHeading subclass representing heading data computed by VPS2.
VPS2 derives heading from visual-inertial localization rather than a physical magnetometer, so the raw magnetometer component values (x, y, z) are zero and not meaningful.
final Vps2Location
A CLLocation subclass representing a position computed by VPS2.
MSL altitude is not available from VPS2 without a geoid model conversion, so altitude always returns -1. Use ellipsoidalAltitude for the WGS84 height computed by VPS2, and verticalAccuracy (≥ 0) for its precision.

Protocols

NSDKFeatureSession
A protocol that defines the common lifecycle and configuration interface for NSDK feature sessions.
NSDKLogCallback : AnyObject
Protocol for receiving log messages from NSDK.
Implement this protocol to receive NSDK log messages in your application. The callback will be invoked on background threads, so ensure your implementation is thread-safe.
## Example Usage
```swift
class MyLogCallback: NSDKLogCallback {
    func onLog(level: NSDKLogLevel, message: String, fileName: String?, fileLine: Int, funcName: String?) {
        let levelStr = level.description
        let location = fileName.map { "\($0):\(fileLine)" } ?? ""
        let funcInfo = funcName.map { " \($0)" } ?? ""
        print("[NSDK-\(levelStr)] \(location)\(funcInfo): \(message)")
    }
}

let callback = MyLogCallback()
let session = NSDKSession(apiKey: "your-key", logCallback: callback)
```
NSDKSessionDataSource : AnyObject
Provides synchronous, pull-based access to the latest available sensor data required by NSDKSession.
All methods must be non-blocking and thread-safe. Returned values represent the most recent samples already captured by the underlying services.
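The non-blocking, latest-value contract described above is commonly implemented with a lock-protected snapshot. A minimal, self-contained sketch of that pattern (not the actual NSDK implementation):

```swift
import Foundation

// A thread-safe holder that always returns the most recent sample without
// blocking: writers overwrite the value, readers take a snapshot under a
// short-lived lock.
final class LatestValue<T> {
    private let lock = NSLock()
    private var value: T?

    func update(_ newValue: T) {
        lock.lock(); defer { lock.unlock() }
        value = newValue
    }

    func latest() -> T? {
        lock.lock(); defer { lock.unlock() }
        return value
    }
}

let latestHeading = LatestValue<Double>()
latestHeading.update(90.0)
latestHeading.update(93.5)
// latest() returns the most recently written sample
```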
NSDKViewDelegate : ARSessionDelegate, PlaybackSessionDelegate
PlaybackDatasetSource
Protocol for loading playback dataset data from various sources.
This protocol abstracts data retrieval, allowing for different implementations such as bundle loading, file system loading, remote loading, or mock data for testing. Implementations are used for on-demand frame loading: the loader is passed to PlaybackDataset, which calls these methods when frames are requested.
PlaybackSessionDelegate : AnyObject
Delegate protocol for receiving frame updates during playback (mirrors ARSessionDelegate-style callbacks).
ResourceOwner : AnyObject
UIOrientationReporter : AnyObject

Structs

AreaTarget
Contains a CoverageArea and its associated LocalizationTarget.
AreaTargetResult
ARUtils
Utility functions for AR and device capability detection.
ARUtils provides helper methods for detecting device capabilities and AR features that are relevant to NSDK functionality.
AssetInfo
Represents asset information from the Sites Manager service.
Maps to the proto messages AssetRecord, AssetData, and AssetComputedValues.
AssetMeshData
Mesh-specific asset data.
Maps to the proto message AssetMeshData.
AssetSplatData
Splat-specific asset data.
Maps to the proto message AssetSplatData.
AssetVpsData
VPS-specific asset data.
Maps to the proto message AssetVpsData.
AuthInfo
Authentication information containing token claims.
Contains parsed JWT claims including the token string, expiration, user information, and other standard JWT fields.
AwarenessImageParams
CoverageArea
Represents a geographic area where VPS localization is possible.
CoverageAreaResult
Contains all the CoverageArea objects returned by a query to the VPS Coverage service.
GeolocationData
Struct representing geolocation data including latitude, longitude, altitude, heading, and orientation.
HintImageResult
Image data returned by a query to a VPS hint image URL.
ImageMath
Provides affine transformation utilities for image processing.
All affine matrices returned by this class operate in normalized coordinates, where image space is mapped to the [0, 1] range in both axes with the origin at the top-left.
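As an illustration of the normalized-coordinate convention described above, here is a self-contained sketch of a 2D affine transform applied in [0, 1] space. The NormalizedAffine type and its fields are my own construction for this example, not the ImageMath API.

```swift
// A 2D affine transform in normalized image coordinates, origin at top-left.
// Illustrative stand-in for the matrices ImageMath returns, not its API.
struct NormalizedAffine {
    var a, b, tx: Float   // x' = a*x + b*y + tx
    var c, d, ty: Float   // y' = c*x + d*y + ty

    func apply(_ x: Float, _ y: Float) -> (x: Float, y: Float) {
        (a * x + b * y + tx, c * x + d * y + ty)
    }
}

// Vertical flip in [0, 1] space: x' = x, y' = 1 - y
let flipVertical = NormalizedAffine(a: 1, b: 0, tx: 0, c: 0, d: -1, ty: 1)
let q = flipVertical.apply(0.25, 0.25)   // maps to (0.25, 0.75)
```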
LocalizationTarget
Represents a real-world point of interest that is a VPS localization target.
VPS localization is more likely to succeed when a localization target is in camera view.
LocalizationTargetResult
Contains all the LocalizationTarget objects returned by a query to the VPS Coverage service.
MapMetadata
Structure representing the metadata of a device map for visualization and processing.
MeshDownloaderResult
Represents a single mesh result with geometry, texture, and transform data.
NSDKBuffer
A buffer containing binary data for NSDK operations.
NSDKBuffer provides a safe wrapper around binary data buffers used by various NSDK features. It handles memory management and provides convenient access to buffer data.
## Overview
NSDK buffers are used for:
- Image data transfer
- Mesh data storage
- Configuration data
- Any binary data that needs to be passed between Swift and the native NSDK layer
## Example Usage
```swift
// Create a buffer from Swift Data
if let imageData = UIImage(named: "texture")?.pngData() {
    let buffer = NSDKBuffer(data: imageData)

    // Access buffer data
    print("Buffer size: \(buffer.dataSize) bytes")

    // Use the buffer with NSDK APIs
    let status = someArdkFunction(buffer: buffer)
}
```
## Memory Management
NSDKBuffer automatically manages memory allocation and deallocation. When created from Swift Data, it maintains a reference to prevent premature deallocation.
NSDKFeatureStatus
Status flags for NSDK features indicating their current operational state.
NSDKFeatureStatus is an option set that represents various status conditions for NSDK features like VPS2, scanning, and mapping. Multiple status flags can be active simultaneously to provide detailed status information.
## Overview
Use this to monitor the health and state of NSDK features:
- Check for errors that need attention
- Monitor initialization progress
- Verify configuration and API key validity
- Ensure features are ready for operation
## Example Usage
```swift
let status = vpsSession.getFeatureStatus()
if status.contains(.badApiKey) {
    print("Invalid API key - check your credentials")
} else if status.contains(.configurationFailed) {
    print("Feature configuration failed")
} else if status.contains(.initializing) {
    print("Feature is still initializing...")
} else if status == .ok {
    print("Feature is ready and operational")
}
```
NSDKFrameData
A complete frame of data captured from an AR session.
NSDKFrameData encapsulates all the sensor data, images, and tracking information from a single AR frame. This includes camera images, depth data, device pose, GPS location, compass heading, and camera intrinsics.
## Overview
Frame data is the primary input to NSDK for all AR processing tasks including:
- Visual positioning and localization
- 3D scanning and reconstruction
- Map building and tracking
- AR location positioning
## Example Usage
```swift
// In your ARSession delegate
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let frameData = NSDKFrameData(
        timestampMs: UInt64(frame.timestamp * 1000),
        cameraImage: RawImage(from: frame.capturedImage),
        depthImage: frame.sceneDepth.map { RawImage(from: $0.depthMap) },
        // ... other data
    )
    nsdkSession.sendFrame(frameData)
}
```
NSDKImage
Provides a view into the data buffer of an image output by the NSDK.
NSDKImage provides access to image data in various formats (RGB, grayscale, depth, etc.) used by NSDK features like depth processing, semantic segmentation, and image analysis.
## Memory Management
The image provides a non-copying view into native image data managed by NSDK. Access to the pixel buffer is only valid for the duration of withUnsafeBytes(_:). The raw pointer passed to the closure must not be stored or used outside its scope.
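The scoped-access contract described above mirrors the one Swift's own Data.withUnsafeBytes enforces. A self-contained illustration of the safe pattern, using Foundation's Data rather than NSDKImage:

```swift
import Foundation

// A scoped, non-copying view: the pointer is only valid inside the closure,
// so reduce or copy the bytes you need before the closure returns.
let data = Data([1, 2, 3, 4])

// Safe: reduce the bytes to a value inside the closure
let sum = data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) -> Int in
    raw.reduce(0) { $0 + Int($1) }
}

// Safe: copy the bytes out as a new array; the copy outlives the closure
let copy = data.withUnsafeBytes { Array($0) }

// Unsafe (don't do this): returning the buffer pointer itself from the
// closure would escape a dangling pointer.
```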
NSDKInputDataFlags
Flags indicating which types of input data are required by NSDK.
NSDKInputDataFlags is an option set that specifies which data types should be included in frames sent to NSDK. Use getRequestedDataInputs() to determine which data is currently needed, then include only the requested data types in your frame data for optimal performance.
## Overview
NSDK features dynamically request different types of input data based on:
- Which features are active (VPS2, scanning, mapping)
- Current processing state and requirements
- Device capabilities and available sensors
## Example Usage
```swift
let requiredInputs = nsdkSession.getRequestedDataInputs()
var frameData = NSDKFrameData()
if requiredInputs.contains(.pose) {
    frameData.cameraTransform = currentPose
}
if requiredInputs.contains(.cameraImage) {
    frameData.cameraPlane0 = cameraPlane
}
if requiredInputs.contains(.platformDepth) {
    frameData.depthData = depthBuffer
}
nsdkSession.sendFrame(frameData)
```
NSDKPathConfig
NSDKUtils
Utility functions for NSDK string management and memory handling.
NSDKUtils provides helper methods for safely managing C string conversions and memory allocation when working with the NSDK C API.
## Overview
The utilities in this struct help manage the complexity of converting between Swift strings and C strings while ensuring proper memory cleanup and avoiding memory leaks.
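The kind of conversion NSDKUtils manages can be sketched with standard library calls. This uses strdup/free directly and is not the NSDKUtils API; withOwnedCString is a hypothetical helper for this example.

```swift
import Foundation

// Copy a Swift String to a heap-allocated C string, as a C API might require.
// The caller owns the copy and must free it - this is exactly the cleanup
// burden that NSDKUtils-style helpers exist to manage.
func withOwnedCString(_ s: String) -> UnsafeMutablePointer<CChar> {
    s.withCString { strdup($0)! }
}

let cCopy = withOwnedCString("hello nsdk")
let roundTripped = String(cString: cCopy)   // back to a Swift String
free(cCopy)                                 // release the heap copy
```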
OrganizationInfo
Represents organization information from the Sites Manager service.
PlaybackFrame
One "frame" of playback: metadata (pose, intrinsics, orientation, etc.), optional camera image, optional depth. Exposes .camera (a PlaybackCamera). Created by PlaybackSession and delivered to the delegate and PlaybackRenderer.
Relation to Apple AR: Not an Apple type. Delivered instead of ARFrame when in playback; the app gets frame-like data (image, camera, etc.) from the delegate so app code can stay uniform.
SceneSegmentationChannels
A set of semantic channels represented as a bitmask, matching the C SDK packed-channel layout.
Use this type wherever you need one or more semantic channels. It supports clean Swift syntax:
```swift
let channels: SceneSegmentationChannels = [.ground, .sky]
let single = SceneSegmentationChannels.grass
```
Bit positions 0–4 correspond to the C SDK channels: Sky=0, Ground=1, NaturalGround=2, ArtificialGround=3, Grass=4.
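An OptionSet with the bit layout described above can be sketched as follows. This is a self-contained illustration of the pattern; the real SceneSegmentationChannels lives in the SDK.

```swift
// Illustrative OptionSet mirroring the documented bit positions:
// Sky=0, Ground=1, NaturalGround=2, ArtificialGround=3, Grass=4.
struct Channels: OptionSet {
    let rawValue: UInt32
    static let sky              = Channels(rawValue: 1 << 0)
    static let ground           = Channels(rawValue: 1 << 1)
    static let naturalGround    = Channels(rawValue: 1 << 2)
    static let artificialGround = Channels(rawValue: 1 << 3)
    static let grass            = Channels(rawValue: 1 << 4)
}

let channels: Channels = [.ground, .sky]   // rawValue 0b11 == 3
let single = Channels.grass                // rawValue 1 << 4 == 16
```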
SiteAssetsInfo
A single entry in a site-assets location query result.
SiteInfo
Represents site information from the Sites Manager service.
TextureUtils
TimeoutError
UserInfo
Represents user information from the Sites Manager service.
Vps2GeolocationData
Location and heading data calculated by VPS2.
Vps2Localization
Spatial mapping between the device's AR coordinate space and real-world geolocation, as determined by VPS2.
Vps2LocalizationRequestRecord
Vps2Pose
Pose in AR coordinate space calculated by VPS2 from a geolocation.
VpsAnchorUpdate
Contains the latest tracking information for a VPS anchor.
Anchor updates are retrieved via anchorUpdate(anchorId:) and provide the most current information about an anchor's position, orientation, and tracking status. This is a snapshot of the anchor, so the latest anchor update should be used every frame.

Enums

AssetDeploymentType
Asset deployment type.
Maps to the proto enum AssetDeploymentType.
AssetPipelineJobStatus
Asset pipeline job status.
Maps to the proto enum AssetPipelineJobStatus.
AssetStatusType
Asset status.
Maps to the proto enum AssetStatusType.
AssetType
Asset type - determines which typed asset data is present.
Maps to the proto enum AssetType.
AwarenessError
ExportResolution
Resolution option for exported scan images.
When exporting a recording, this controls which image resolutions are included in the payload.
HeadingMode
Controls how the heading is computed from the device's orientation.
NSDKAsyncState<Value, Error> where Error : Error
Reports the state of an asynchronous NSDK operation.
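A generic state enum of this shape can be sketched as follows. The case names here are illustrative assumptions for the pattern, not necessarily the SDK's actual cases.

```swift
// Illustrative async-state enum: idle, in progress, or finished with either
// a success value or a typed error. Case names are assumptions for this sketch.
enum AsyncState<Value, Failure: Error> {
    case idle
    case inProgress
    case success(Value)
    case failure(Failure)

    // Convenience accessor: the success value, if any
    var value: Value? {
        if case .success(let v) = self { return v }
        return nil
    }
}

struct DemoError: Error {}
let state: AsyncState<Int, DemoError> = .success(42)
```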
NSDKError
Errors thrown by the NSDK API.
NSDKError is a subset of the ARDK_Status codes returned by the C API, containing just those that can occur in the Swift environment.
NSDKLogLevel
Defines the available logging levels for NSDK.
NSDKLogLevel controls the verbosity of logging output from the NSDK system. Logging can be configured separately for stdout, files, and callback functions.
## Overview
Log levels follow a hierarchical structure where higher levels include all messages from lower levels. For example, setting the level to .warn will include warning, error, and fatal messages, but exclude debug and info messages.
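The hierarchical filtering described above can be sketched with a Comparable enum. The level names follow the ones mentioned in the summary; the real NSDKLogLevel may assign its raw values differently.

```swift
// Illustrative log levels, ordered from most to least verbose.
// A message is emitted when its level is at or above the configured threshold.
enum LogLevel: Int, Comparable {
    case debug, info, warn, error, fatal
    static func < (lhs: LogLevel, rhs: LogLevel) -> Bool {
        lhs.rawValue < rhs.rawValue
    }
}

func shouldLog(_ message: LogLevel, threshold: LogLevel) -> Bool {
    message >= threshold
}

// With the threshold at .warn: warn/error/fatal pass, debug/info are excluded.
```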
@frozen NSDKScreenOrientation
Represents the screen orientation in the NSDK layer.
NSDKTelemetry
Namespace for NSDK telemetry controls.
NSDKTelemetryEnvironment
Telemetry backend environment passed to native on session start.
Use this type instead of a free String so only supported values (dev, stg, prod) compile.
@frozen NsdkTrackingState
The general quality of position tracking available when the camera captured a frame.
PlaybackDatasetConstants
Constants for playback dataset file names and extensions.
TypedAssetData
Discriminated union for typed asset data.
One of mesh, splat, or vps will be set based on the asset type.
Vps2LocalizationError
Possible errors from VPS localization operations.
Vps2LocalizationRequestStatus
Status of a network request.
Vps2LocalizationRequestType
Vps2TrackingState
VpsGraphOperationError

Methods

playbackSession (returns Void)
Called on the main queue when the tracking state changes (e.g. at the start of playback). For playback, the state is typically .normal.