Custom sensors
Sensors capture continuous or periodic data streams: values that change over time and are sampled at regular intervals. Unlike events (discrete moments) or properties (persistent state), sensors record time-series data that can be visualized as graphs and analyzed for trends.
Each sensor reading consists of:
- A sensor name identifying what's being measured
- A numeric value representing the current reading
- An automatic timestamp (provided by Cognitive3D)
Sensors are ideal for data that would generate too many events if tracked discretely, or where the pattern over time is more meaningful than individual moments.
Use sensors to track:
- Application state metrics — Object count, participant count, active feature count
- Environmental readings — Room brightness (if available), spatial anchor stability
- User physiological data — Heart rate, stress indicators (if using compatible hardware)
The SDK automatically records FPS (frames per second) as a default sensor.
Basic Usage
Record a single sensor reading:
Kotlin:

```kotlin
Cognitive3DManager.recordSensor("loaded_model_count", 5.0f)
```

Java:

```java
Cognitive3DManager.recordSensor("loaded_model_count", 5.0f);
```
The value must be a Float. The sensor name identifies the metric and will be used to group readings over time.
Continuous Recording with Coroutines
For metrics that should be sampled regularly, use a coroutine with a delay loop:
Kotlin:

```kotlin
private var sensorJob: Job? = null

fun startWorkspaceComplexitySensor() {
    sensorJob = lifecycleScope.launch {
        while (isActive) {
            val modelCount = workspaceManager.loadedModels.size.toFloat()
            Cognitive3DManager.recordSensor("loaded_model_count", modelCount)
            delay(5000) // Sample every 5 seconds
        }
    }
}

fun stopWorkspaceComplexitySensor() {
    sensorJob?.cancel()
    sensorJob = null
}
```
Java:

```java
private ScheduledExecutorService sensorExecutor;
private ScheduledFuture<?> complexitySensorTask;

public void startWorkspaceComplexitySensor() {
    sensorExecutor = Executors.newSingleThreadScheduledExecutor();
    complexitySensorTask = sensorExecutor.scheduleAtFixedRate(() -> {
        float modelCount = (float) workspaceManager.getLoadedModels().size();
        Cognitive3DManager.recordSensor("loaded_model_count", modelCount);
    }, 0, 5, TimeUnit.SECONDS); // Sample every 5 seconds
}

public void stopWorkspaceComplexitySensor() {
    if (complexitySensorTask != null) {
        complexitySensorTask.cancel(false);
    }
    if (sensorExecutor != null) {
        sensorExecutor.shutdown();
    }
}
```
Examples
Use sensors when you need to track values over time rather than at a single moment. These examples show how to monitor metrics that change throughout a session.
Tracking workspace complexity
As your scene grows with more models and participants, performance may degrade. Recording these metrics lets you correlate frame drops or lag with specific complexity thresholds:
Kotlin:

```kotlin
private val sensorJobs = mutableListOf<Job>()

fun startWorkspaceMetrics() {
    sensorJobs += lifecycleScope.launch {
        while (isActive) {
            // Content metrics
            Cognitive3DManager.recordSensor(
                "loaded_model_count",
                loadedModels.size.toFloat()
            )
            Cognitive3DManager.recordSensor(
                "annotation_count",
                annotations.size.toFloat()
            )
            Cognitive3DManager.recordSensor(
                "active_tool_count",
                activeTools.size.toFloat()
            )

            // Collaboration metrics
            Cognitive3DManager.recordSensor(
                "participant_count",
                collaborators.size.toFloat()
            )

            // Compute total polygon count across all models
            val totalPolygons = loadedModels.sumOf { it.polygonCount }
            Cognitive3DManager.recordSensor(
                "total_polygon_count",
                totalPolygons.toFloat()
            )

            delay(5000) // Every 5 seconds
        }
    }
}
```
Java:

```java
private final List<ScheduledFuture<?>> sensorTasks = new ArrayList<>();

public void startWorkspaceMetrics() {
    ScheduledFuture<?> task = sensorExecutor.scheduleAtFixedRate(() -> {
        // Content metrics
        Cognitive3DManager.recordSensor(
            "loaded_model_count",
            (float) loadedModels.size()
        );
        Cognitive3DManager.recordSensor(
            "annotation_count",
            (float) annotations.size()
        );
        Cognitive3DManager.recordSensor(
            "active_tool_count",
            (float) activeTools.size()
        );

        // Collaboration metrics
        Cognitive3DManager.recordSensor(
            "participant_count",
            (float) collaborators.size()
        );

        // Compute total polygon count across all models
        long totalPolygons = 0;
        for (LoadedModel model : loadedModels) {
            totalPolygons += model.getPolygonCount();
        }
        Cognitive3DManager.recordSensor(
            "total_polygon_count",
            (float) totalPolygons
        );
    }, 0, 5, TimeUnit.SECONDS); // Every 5 seconds
    sensorTasks.add(task);
}
```
Tracking user interaction state
Capture what users are actively doing: selections, zoom levels, and edit history depth. This helps identify idle periods versus high-activity moments during a session:
Kotlin:

```kotlin
fun startInteractionMetrics() {
    sensorJobs += lifecycleScope.launch {
        while (isActive) {
            // Selection state
            Cognitive3DManager.recordSensor(
                "selected_object_count",
                selectionManager.selectedObjects.size.toFloat()
            )

            // Current view/zoom level (useful for understanding focus)
            Cognitive3DManager.recordSensor(
                "camera_zoom_level",
                cameraController.zoomLevel
            )

            // Active layers (if your app has visibility layers)
            Cognitive3DManager.recordSensor(
                "visible_layer_count",
                layerManager.visibleLayers.size.toFloat()
            )

            // Undo stack depth (indicates edit activity)
            Cognitive3DManager.recordSensor(
                "undo_stack_depth",
                undoManager.stackSize.toFloat()
            )

            delay(2000) // Every 2 seconds
        }
    }
}
```
Java:

```java
public void startInteractionMetrics() {
    ScheduledFuture<?> task = sensorExecutor.scheduleAtFixedRate(() -> {
        // Selection state
        Cognitive3DManager.recordSensor(
            "selected_object_count",
            (float) selectionManager.getSelectedObjects().size()
        );

        // Current view/zoom level (useful for understanding focus)
        Cognitive3DManager.recordSensor(
            "camera_zoom_level",
            cameraController.getZoomLevel()
        );

        // Active layers (if your app has visibility layers)
        Cognitive3DManager.recordSensor(
            "visible_layer_count",
            (float) layerManager.getVisibleLayers().size()
        );

        // Undo stack depth (indicates edit activity)
        Cognitive3DManager.recordSensor(
            "undo_stack_depth",
            (float) undoManager.getStackSize()
        );
    }, 0, 2, TimeUnit.SECONDS); // Every 2 seconds
    sensorTasks.add(task);
}
```
Best Practices
Naming conventions
- Include units in sensor names: `latency_ms`, `memory_mb`, `confidence_percent`
- Use lowercase with underscores
- Be specific: `left_hand_confidence`, not just `hand_confidence`
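These conventions are easy to enforce mechanically. The helper below is a hypothetical sketch, not part of the Cognitive3D SDK: it checks a proposed sensor name against the lowercase-with-underscores rule before you register it.

```kotlin
// Hypothetical validation helper; not part of the Cognitive3D SDK.
// Accepts names like "latency_ms"; rejects "LatencyMs", "latency-ms", "_latency".
val SENSOR_NAME = Regex("^[a-z][a-z0-9]*(_[a-z0-9]+)*$")

fun isValidSensorName(name: String): Boolean = SENSOR_NAME.matches(name)
```

Running such a check in a debug build (for example, asserting on it before the first `recordSensor` call for a given name) catches naming drift before inconsistent metrics reach your dashboard.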
Performance considerations
- Each sensor reading has overhead—don't sample more frequently than necessary
- Batch related sensors at the same interval when possible
- Consider reducing frequency for non-critical metrics in performance-sensitive scenarios
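One way to cap overhead is to guard recording with a minimum interval, so bursts of updates collapse into a single sample. The class below is an illustrative sketch only: `ThrottledSensor`, its parameters, and the injectable clock are assumptions, not SDK features.

```kotlin
// Hypothetical rate limiter; not part of the Cognitive3D SDK.
// Forwards a reading only if at least minIntervalMs have elapsed since the
// last forwarded reading; earlier samples are silently dropped.
class ThrottledSensor(
    private val minIntervalMs: Long,
    private val record: (String, Float) -> Unit,
    private val now: () -> Long = { System.currentTimeMillis() }
) {
    private var lastSampleMs = Long.MIN_VALUE

    fun sample(name: String, value: Float) {
        val t = now()
        if (lastSampleMs == Long.MIN_VALUE || t - lastSampleMs >= minIntervalMs) {
            lastSampleMs = t
            record(name, value)
        }
    }
}
```

In application code you could pass `Cognitive3DManager::recordSensor` as the `record` callback; the clock parameter exists mainly so the throttling logic can be tested with a fake time source.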
What to track
- Metrics that change over time and where trends matter
- Quality/health indicators that might correlate with issues
- Counts and gauges that help characterize the session
What to skip
- Values that rarely change (use session properties instead)
- Data with many unique values (use events with properties instead)
- Sensitive user data
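For values that fall between "rarely change" and "change constantly", a middle ground is to sample on a regular schedule but only forward readings that actually changed. The wrapper below is a hypothetical sketch (the class and its `epsilon` threshold are not SDK features):

```kotlin
import kotlin.math.abs

// Hypothetical change-only recorder; not part of the Cognitive3D SDK.
// Forwards a reading only when it differs from the last forwarded value
// for that sensor name by more than epsilon.
class ChangeOnlySensor(
    private val epsilon: Float = 0f,
    private val record: (String, Float) -> Unit
) {
    private val last = mutableMapOf<String, Float>()

    fun sample(name: String, value: Float) {
        val prev = last[name]
        if (prev == null || abs(value - prev) > epsilon) {
            last[name] = value
            record(name, value)
        }
    }
}
```

Note the trade-off: dropped samples leave gaps in the time series, so only use this where your dashboard treats a missing sample as "unchanged" rather than "unknown".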