There are six core components of our analytics SDK:
- Projects - can be thought of as silos of data. A single VR app would be a Project.
- Scenes - almost identical to Unity Scenes. Scenes are where Participant sessions take place, and they contain the static geometry of your experience.
- Participants - the people interacting with your experience - your end users. These might be employees going through training, members of your focus group, etc.
- Dynamic Objects - 3D objects that you would like to track. Dynamic Objects contain their own gaze and movement data. A product on a shelf of a retail scene would be a Dynamic Object, but a permanent, non-moving cash register would not.
- Gaze - Participant gaze is recorded every 10 milliseconds by default. Gaze includes the Participant's position and the exact 3D point the Participant looked at. If a Participant gazes at a Dynamic Object, that Gaze is assigned to that Object.
- Events - Participant actions that are recorded to our cloud. Events can be anything from "Purchased Product" to "Training Completed". Each Event can have sub-properties, such as "Product Name" or "Training Type".
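To make the Events concept above concrete, here is a minimal C# sketch of recording an Event with sub-properties. The `Analytics.RecordEvent` call and its signature are illustrative assumptions, not the SDK's actual API:

```csharp
using System.Collections.Generic;

public class CheckoutTracker
{
    public void OnPurchase(string productName)
    {
        // Sub-properties attached to the Event, e.g. "Product Name".
        var properties = new Dictionary<string, object>
        {
            { "Product Name", productName }
        };

        // Hypothetical call: send a "Purchased Product" Event to the cloud.
        Analytics.RecordEvent("Purchased Product", properties);
    }
}
```

The pattern is the same for any Event: a name like "Training Completed" plus an optional dictionary of sub-properties such as "Training Type".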
You should also be familiar with a few Unity-specific terms, including:
- Canvas - a component on a GameObject used to display a User Interface.
- Prefab - a Unity asset that defines a pre-authored GameObject; every instance created from a Prefab shares the Prefab's authored properties.
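As a brief illustration of the Prefab concept, here is a minimal sketch that spawns instances of a Prefab at runtime. The `productPrefab` field name is an assumption for this example; `Instantiate` is Unity's standard API for creating Prefab instances:

```csharp
using UnityEngine;

public class ProductSpawner : MonoBehaviour
{
    // Assign the Prefab asset to this field in the Unity Inspector.
    public GameObject productPrefab;

    void Start()
    {
        // Each instance shares the properties authored on the Prefab asset.
        Instantiate(productPrefab, new Vector3(0f, 1f, 0f), Quaternion.identity);
    }
}
```

In a retail scene like the one described above, each spawned instance could then be tracked as its own Dynamic Object.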