OmiLAXR.xAPI: A learning analytics module for virtual reality using Unity and xAPI
Dependencies
- UnityMainThreadDispatcher: https://github.com/SGoerzen/UnityMainThreadDispatcher (this dependency will be removed soon)
- TinCan.NET: https://github.com/SGoerzen/TinCan.NET
- An xAPI registry with the namespace `xAPI.Registry` is needed (can be created with the xAPI4Unity package)
Getting started
- Install the UnityMainThreadDispatcher package
- Create a `Plugins` folder inside your `Assets` folder
- Download the TinCan.NET DLL from here
- Place `TinCan.dll` in the `Plugins` folder
- Install xAPI4Unity and use it to create an `xAPI.Registry` script folder
- Install OmiLAXR.xAPI (e.g. as a package, see here)
- Enjoy
Tracking Systems (Features)
The module is designed to work with the following tracking systems:
- MainTrackingSystem: Should be included. This controls the overall xAPI tracking.
- EyeTrackingSystem: This system controls the eye tracking mechanisms.
- GestureTrackingSystem: This system controls the gesture tracking mechanisms like nodding and shaking.
- InteractionTrackingSystem: This system controls the interaction tracking mechanisms like controller buttons, interactables, laser pointer, mouse and keyboard.
- SceneTrackingSystem: This system controls the scene tracking mechanisms like scene changes, timeline or player observations.
- TeleportTrackingSystem: This system controls the teleport tracking mechanisms like teleport points and teleport areas.
Each tracking system has several tracking controllers which are responsible for tracking the different mechanisms. For example, the InteractionTrackingSystem has the following tracking controllers:
- ActionController: Tracks the controller buttons.
- InteractionController: Tracks the interactables.
- KeyboardController: Tracks the keyboard presses.
- MouseInteractionController: Tracks the mouse clicks and position.
- ...
Trackables
If you want to make a GameObject trackable, you have to add the corresponding component to the GameObject.
The MainTrackingSystem owns a TrackableRegister component, which lists all trackable GameObjects of the scene. You can also enable and disable them from this register.
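The step above can also be sketched in code (a minimal sketch: the `Trackable` component is mentioned in this README, but attaching it at runtime rather than in the Unity editor is an assumption):

```csharp
using UnityEngine;

// Hedged sketch: makes a GameObject trackable at runtime by attaching the
// Trackable component, so the MainTrackingSystem's TrackableRegister can
// pick it up. Normally you would add the component in the Unity inspector.
public class MakeTrackable : MonoBehaviour
{
    private void Awake()
    {
        // Attach Trackable only if it is not already present.
        if (GetComponent<Trackable>() == null)
            gameObject.AddComponent<Trackable>();
    }
}
```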
Creating own tracking systems
To create own tracking systems, the following steps are necessary:
- Create a new component which inherits from the class `TrackingSystemBase`.
- Implement the abstract methods.
- Add the component to a new Game Object.
- Your tracking system will automatically be registered with the MainTrackingSystem.
- Implement the tracking controllers for your tracking system and register them inside your tracking system using `AddTrackingController<T>()`.
- Each controller has to override the function `Consider(GameObject go)`.
Example of a `Consider` function: in this example, the GameObject is not considered by the tracking controller if its trackable does not allow gesture interacting.

```csharp
public override void Consider(GameObject go)
{
    // Skip objects that are not pointable, interactable or trackable
    var trackable = go.GetComponent<Trackable>();
    if (trackable == null || !trackable.Has(Gestures.Interacting))
        return;
    ConsiderList.Consider(trackable);
}
```
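Putting the steps above together, a custom tracking system might look like this (a sketch under assumptions: `TrackingSystemBase`, `AddTrackingController<T>()` and `Consider(GameObject go)` come from this README, while the `Initialize` hook, the `TrackingControllerBase` base class and the voice-related names are hypothetical):

```csharp
using UnityEngine;

// Hedged sketch of a custom tracking system. The real abstract methods of
// TrackingSystemBase may differ from the Initialize hook assumed here.
public class VoiceTrackingSystem : TrackingSystemBase // hypothetical example
{
    // Register the controllers belonging to this tracking system.
    protected override void Initialize() // assumed abstract method
    {
        AddTrackingController<VoiceCommandController>();
    }
}

// A controller for the system above. Each controller has to override
// Consider(GameObject go) to decide which trackables it observes.
public class VoiceCommandController : TrackingControllerBase // base name assumed
{
    public override void Consider(GameObject go)
    {
        // Only consider objects that carry a Trackable component.
        var trackable = go.GetComponent<Trackable>();
        if (trackable == null)
            return;
        ConsiderList.Consider(trackable);
    }
}
```

Attach the tracking system component to a new GameObject in the scene; per the steps above it is then registered with the MainTrackingSystem automatically.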
General
This module allows storing learning data from virtual environments created with Unity in a learning record store in Experience API (xAPI) format.
So far, the following interactions are examined and stored (outdated information).

Legend:
- Contexts: ET (eyeTracking), Gn (generic), Gs (gestures), SG (seriousGames), SC (systemControl), VR (virtualReality)
- Extension types: ac (activity), ctx (context), res (result)
| When triggered | Verb | Activity | Extensions |
|---|---|---|---|
| Used controller button | pressed(VR)/released(VR) | action(VR) | actionName(VR,ac), hand(Gs,ac) |
| Pointer enters object | pointed(VR) | vrObject(VR) | vrObjectName(VR,ac) |
| Press trigger while pointing on object | interacted(VR) | vrObject(VR) | vrObjectName(VR,ac) |
| Timeline playable plays/pauses/stops/resumes | started(SC)/paused(SC)/ended(SC)/resumed(SC) | stage(SC) | name(SC,ac) |
| Visits/leaves teleport point | entered(Gn)/left(Gn) | teleportPoint(VR) | vrObjectName(VR) |
| Teleports in area | teleported(VR) | player(Gn) | position(SG,res) |
| Changes Unity scene | teleported(VR) | player(Gn) | position(SG,res), level(SG,ctx) |
| Opens/closes Unity | started(SG)/ended(SG) | game(SG) | game(SG,ctx), gamemode(SG,ctx) |
(... and some more which are not documented, yet ... )
NOTE
Currently, the library only supports implementing virtual environments with the Unity IDE and the HTC Vive Pro Eye virtual reality system. An extension to other IDEs or VR systems is possible at any time.
There are plans to make this framework also compatible for UnityXR, VRTK and MRToolkit.
Compatibility state
- SteamVR 100%: Install SteamVR Adapter
- UnityXR 75%: Install UnityXR Adapter
- MR Toolkit 0%
- VRTK 0%
NOTE
The whole module is implemented with Unity version 2021.3.21f1. Use of the module with other versions is not guaranteed. The instructions are only guaranteed for Windows 10 and 11!
Usage
Whenever the learning data collected with this module should include eye tracking data, such as the tracking of GameObjects, the eye tracking system needs to be calibrated. This can be done either before starting the application via the SteamVR dashboard, or directly in the application via a button once all necessary setup settings have been made.
Calibration
NOTE
Calibration needs to be done every time the HMD wearer changes. (Re-)calibration is not compulsory if the user remains the same and the HMD is put back on.
Via SteamVR dashboard:
- Open SteamVR dashboard (click SystemButton)
- Turn on Use Eye Tracking
- Click on Calibrate
- Follow VIVE Pro Eye Setup guide
Via the application:
- Start application
- Press LaunchCalibration button
- Follow VIVE Pro Eye Setup guide
Contributions
Special thanks to Annabell Brocker. The initial idea for this project was developed in the scope of her master's thesis.