Integrating the SDK into your application
This guide provides a quick reference for integrating the Cognitive3D Analytics SDK into a visionOS application.
Note: Scene and Dynamic Object assets (meshes, textures) are currently uploaded manually. Reach out to the team via Intercom or Discord for help configuring Scene and Dynamic Object content.
Using the Cognitive3D Analytics SDK
You can integrate the SDK in two ways:
- as a Swift package framework
- as source code in your workspace
As a framework
The framework works on both device and the visionOS Simulator.
- Copy the framework and `Package.swift` into a subfolder.

- Add the framework as a local Swift package.
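
If your app is itself organized as a Swift package, the local dependency can also be declared in its manifest. A minimal sketch, assuming the SDK was copied into a `Vendor/Cognitive3DAnalytics` subfolder and that both the package and its product are named `Cognitive3DAnalytics` (adjust the names and path to match your setup):

```swift
// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyVisionApp",  // placeholder app package name
    platforms: [.visionOS(.v1)],
    dependencies: [
        // Path to the subfolder containing the SDK's Package.swift
        .package(path: "Vendor/Cognitive3DAnalytics")
    ],
    targets: [
        .target(
            name: "MyVisionApp",
            dependencies: [
                .product(name: "Cognitive3DAnalytics", package: "Cognitive3DAnalytics")
            ]
        )
    ]
)
```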

As source code
You can include analytics directly as source.
- Create a workspace.

- Add your visionOS project to the workspace.
- Add the analytics project to the workspace.

- Add the analytics code as a dependency in your project target.
Additional files for dynamic objects
There are three additional files to add when using dynamic components:

- `DynamicComponent.swift`
- `ImmersiveView+DynamicObject.swift` (rename the extension type if your immersive view has a different name)
- `DynamicObjectSystem.swift`
Import the framework
```swift
import Cognitive3DAnalytics
```
Initial setup in your application
Initialize the SDK at app startup
```swift
init() {
    cognitiveSDKInit()
}
```
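
In a SwiftUI lifecycle app, this initializer typically lives in your `App` type. A minimal sketch, assuming a placeholder app named `MyApp` (`cognitiveSDKInit()` is the function defined in the next section):

```swift
import SwiftUI
import Cognitive3DAnalytics

@main
struct MyApp: App {
    init() {
        // Configure the SDK before any views are created.
        cognitiveSDKInit()
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
```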
Configure the SDK (configure is async)
The `sceneName`, `sceneId`, `versionNumber`, and `versionId` for a scene in your application can be found on the Scene page for your Project on the Cognitive3D dashboard.

The `APPLICATION_API_KEY` can be found or reset on the Organization Settings page.

```swift
fileprivate func cognitiveSDKInit() {
    let sceneData = SceneData(
        sceneName: "your-scene-name",
        sceneId: "your-scene-id",
        versionNumber: 1,
        versionId: 1234
    )

    let core = Cognitive3DAnalyticsCore.shared
    let settings = CoreSettings()
    settings.defaultSceneName = sceneData.sceneName
    settings.allSceneData = [sceneData]
    settings.apiKey = Bundle.main.object(forInfoDictionaryKey: "APPLICATION_API_KEY") as? String ?? ""
    settings.loggingLevel = .all
    settings.isDebugVerbose = false

    // Optional v1.0.1 settings
    settings.sensorAutoSendInterval = 10.0
    settings.isHandTrackingRequired = false

    core.setParticipantId("participant-id")
    core.setParticipantFullName("Participant Name")

    Task {
        do {
            try await core.configure(with: settings)

            // Optional runtime behavior
            core.config?.shouldEndSessionOnBackground = false

            configureDynamicObject()
        } catch {
            print("Failed to configure Cognitive3D Analytics: \(error)")
        }
    }
}
```
Scene phase handling
The SDK includes the `.observeCognitive3DScenePhase()` view modifier, which handles scene phase changes for you:
```swift
WindowGroup {
    ContentView()
        .observeCognitive3DScenePhase()
}
```
If you prefer full manual control, you can handle `scenePhase` yourself and call `endSession()` (and set lifecycle options in `core.config`) according to your app's policy.
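
A manual equivalent might look like the following sketch, which ends the session when the app moves to the background (the policy shown here is an example, not a requirement; `ManualLifecycleView` is a placeholder name):

```swift
import SwiftUI
import Cognitive3DAnalytics

struct ManualLifecycleView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        ContentView()
            .onChange(of: scenePhase) { _, newPhase in
                if newPhase == .background {
                    Task {
                        // End the analytics session when backgrounding.
                        _ = await Cognitive3DAnalyticsCore.shared.endSession()
                    }
                }
            }
    }
}
```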
Dynamic Objects setup
Register components and systems
```swift
fileprivate func configureDynamicObject() {
    DynamicComponent.registerComponent()
    DynamicObjectSystem.registerSystem()
}
```
Configure dynamic objects in a scene
```swift
func configureDynamicObjects(rootEntity: Entity) async {
    guard let objManager = Cognitive3DAnalyticsCore.shared.dynamicDataManager else {
        return
    }

    let dynamicEntities = findEntitiesWithComponent(rootEntity, componentType: DynamicComponent.self)

    for (_, comp) in dynamicEntities {
        await objManager.registerDynamicObject(
            id: comp.dynamicId,
            name: comp.name,
            mesh: comp.mesh
        )
    }
}
```
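
The snippet above calls a `findEntitiesWithComponent` helper that is not shown here. One possible implementation (a sketch) recursively walks the entity hierarchy and returns entity/component pairs for every entity that carries the component:

```swift
import RealityKit

// Recursively collect every entity in the hierarchy that carries
// the given component type, paired with its component value.
func findEntitiesWithComponent<T: Component>(
    _ entity: Entity,
    componentType: T.Type
) -> [(Entity, T)] {
    var results: [(Entity, T)] = []
    if let component = entity.components[componentType] {
        results.append((entity, component))
    }
    for child in entity.children {
        results.append(contentsOf: findEntitiesWithComponent(child, componentType: componentType))
    }
    return results
}
```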
Entity for gaze raycasting
Setting the root entity is required to record gaze against RealityKit entities that have a `DynamicComponent`.
```swift
if let rootEntity = try? await Entity(named: "YourScene", in: realityKitContentBundle) {
    let core = Cognitive3DAnalyticsCore.shared
    core.entity = rootEntity
}
```
Session management
`startSession()` and `endSession()` are async:
```swift
// Start a session
Task {
    let didStart = await Cognitive3DAnalyticsCore.shared.startSession()
    print("Session started: \(didStart)")
}

// End a session
Task {
    let didEnd = await Cognitive3DAnalyticsCore.shared.endSession()
    print("Session ended: \(didEnd)")
}
```
Reactive session events (v1.0.1)
```swift
import Combine

private var cancellables = Set<AnyCancellable>()

Cognitive3DAnalyticsCore.shared.sessionEventPublisher
    .sink { event in
        switch event {
        case .started(let sessionId):
            print("Session started: \(sessionId)")
        case .ended(let sessionId, let state):
            print("Session ended: \(sessionId), state: \(state)")
        }
    }
    .store(in: &cancellables)
```
Optional hand tracking
If hand tracking is required, set:
```swift
settings.isHandTrackingRequired = true
```
On device (not the simulator), the SDK requests hand-tracking authorization and starts hand tracking once it is configured and the session starts.
Creating and sending Custom Events
Custom Events track app-specific actions in a session timeline.
```swift
func createCustomEvent(dynamicId: String) {
    let event = CustomEvent(
        name: "tapEvent",
        properties: ["timestamp": Date().timeIntervalSince1970],
        dynamicObjectId: dynamicId,
        core: Cognitive3DAnalyticsCore.shared
    )
    _ = event.send()
}
```
Working with ExitPoll surveys
```swift
let exitPollSurvey = ExitPollSurveyViewModel()
exitPollSurvey.loadSurvey(hook: "your-survey-hook")

Task {
    let result = await exitPollSurvey.sendSurveyAnswers()
    print(result)
}
```
Changing scenes
```swift
Cognitive3DAnalyticsCore.shared.setSceneById(
    sceneId: "new-scene-id",
    version: 1,
    versionId: 1234
)
```
If you have a question or any feedback about our documentation please use the Intercom button (purple circle) in the lower right corner of any web page or join our Discord.