com.rwth.unity.omilaxr.xapi

1.0.8 • Public • Published

OmiLAXR.xAPI: A learning analytics module for virtual reality using Unity and xAPI

Getting started

  1. Install UnityMainThreadDispatcher package
  2. Create the folder Plugins in your Assets folder
  3. Download the TinCan.NET DLL from here
  4. Place TinCan.dll in the Plugins folder
  5. Install xAPI4Unity and use it to create an xAPI.Registry script folder
  6. Install OmiLAXR.xAPI (e.g. as package, see here)
  7. Enjoy
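If you install OmiLAXR.xAPI as a Unity package (step 6), one option is to pull it from npm via a scoped registry in Packages/manifest.json. This is only a sketch; the registry URL and scope are assumptions based on the package name:

```json
{
  "dependencies": {
    "com.rwth.unity.omilaxr.xapi": "1.0.8"
  },
  "scopedRegistries": [
    {
      "name": "npmjs",
      "url": "https://registry.npmjs.org",
      "scopes": ["com.rwth.unity"]
    }
  ]
}
```

After editing the manifest, Unity resolves the package on the next focus of the editor.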

Tracking Systems (Features)

The module is designed to work with the following tracking systems:

  • MainTrackingSystem: Should be included. This controls the overall xAPI tracking.
  • EyeTrackingSystem: This system controls the eye tracking mechanisms.
  • GestureTrackingSystem: This system controls the gesture tracking mechanisms like nodding and shaking.
  • InteractionTrackingSystem: This system controls the interaction tracking mechanisms like controller buttons, interactables, laser pointer, mouse and keyboard.
  • SceneTrackingSystem: This system controls the scene tracking mechanisms like scene changes, timeline or player observations.
  • TeleportTrackingSystem: This system controls the teleport tracking mechanisms like teleport points and teleport areas.

Each tracking system has several tracking controllers which are responsible for tracking the different mechanisms. For example, the InteractionTrackingSystem has the following tracking controllers:

  • ActionController: Tracks the controller buttons.
  • InteractionController: Tracks the interactables.
  • KeyboardController: Tracks the keyboard presses.
  • MouseInteractionController: Tracks the mouse clicks and position.
  • ...

Trackables

If you want to make a GameObject trackable, you have to add the corresponding component to the GameObject.

The MainTrackingSystem owns a TrackableRegister component, which lists all trackable GameObjects in the scene. You can also enable and disable them from this register.
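A minimal sketch of making a GameObject trackable from code, assuming the package's Trackable component can also be added via AddComponent rather than in the Inspector:

```csharp
using UnityEngine;

// Sketch: make this GameObject trackable at runtime. Trackable comes
// from OmiLAXR.xAPI; adding it via AddComponent is an assumption.
public class MakeTrackable : MonoBehaviour
{
    private void Awake()
    {
        // Add the Trackable component if the GameObject lacks one,
        // so the TrackableRegister picks it up.
        if (GetComponent<Trackable>() == null)
            gameObject.AddComponent<Trackable>();
    }
}
```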

Creating your own tracking systems

To create your own tracking system, the following steps are necessary:

  1. Create a new component which inherits from the class TrackingSystemBase.
  2. Implement the abstract methods.
  3. Add the component to a new Game Object.
  4. Your tracking system will be automatically registered with the MainTrackingSystem.
  5. Implement the tracking controllers for your tracking system and register them inside your tracking system by using AddTrackingController<T>().
  6. Each controller has to override the function Consider(GameObject go).
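The steps above can be sketched as follows. TrackingSystemBase and AddTrackingController<T>() come from the module; the class names and the Initialize() hook are hypothetical stand-ins for the actual abstract methods, which depend on the installed package version:

```csharp
using UnityEngine;

// Sketch of a custom tracking system. VoiceTrackingSystem and
// VoiceCommandController are hypothetical names for illustration.
public class VoiceTrackingSystem : TrackingSystemBase
{
    // Assumed initialization hook (step 2: implement the abstract methods).
    protected override void Initialize()
    {
        // Step 5: register the controllers this system owns.
        AddTrackingController<VoiceCommandController>();
    }
}
```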

Example of a Consider function: in this example, the GameObject is not considered by the tracking controller if the trackable does not allow gesture interaction.

public override void Consider(GameObject go)
{
    // Skip objects that are not trackable or do not allow gesture interaction
    var trackable = go.GetComponent<Trackable>();
    if (trackable == null || !trackable.Has(Gestures.Interacting))
        return;
    ConsiderList.Consider(trackable);
}

General

This module allows you to store learning data from virtual environments created with Unity in a learning record store, in Experience API (xAPI) format.

So far, the following interactions are examined and stored (this information may be outdated):

Contexts: ET (eyeTracking), Gn (generic), Gs (gestures), SG (seriousGames), SC (systemControl), VR (virtualReality)

Extension types: ac (activity), ctx (context), res (result)

When triggered                               | Verb                                         | Activity          | Extensions
Used controller button                       | pressed(VR)/released(VR)                     | action(VR)        | actionName(VR,ac), hand(Gs,ac)
Pointer enters object                        | pointed(VR)                                  | vrObject(VR)      | vrObjectName(VR,ac)
Press trigger while pointing on object       | interacted(VR)                               | vrObject(VR)      | vrObjectName(VR,ac)
Timeline playable plays/pauses/stops/resumes | started(SC)/paused(SC)/ended(SC)/resumed(SC) | stage(SC)         | name(SC,ac)
Visits/leaves teleport point                 | entered(Gn)/left(Gn)                         | teleportPoint(VR) | vrObjectName(VR)
Teleports in area                            | teleported(VR)                               | player(Gn)        | position(SG,res)
Changes Unity scene                          | teleported(VR)                               | player(Gn)        | position(SG,res), level(SG,ctx)
Opens/closes Unity                           | started(SG)/ended(SG)                        | game(SG)          | game(SG,ctx), gamemode(SG,ctx)

(... and some more which are not documented, yet ... )
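For illustration, a pointed(VR) event from the table might serialize to an xAPI statement roughly like the following. All IRIs, names, and extension keys are placeholders, not the ones the module actually emits:

```json
{
  "actor": {
    "name": "Example Learner",
    "mbox": "mailto:learner@example.org"
  },
  "verb": {
    "id": "https://example.org/xapi/vr/verbs/pointed",
    "display": { "en-US": "pointed" }
  },
  "object": {
    "id": "https://example.org/xapi/vr/activities/vrObject",
    "definition": {
      "extensions": {
        "https://example.org/xapi/vr/extensions/vrObjectName": "Cube_01"
      }
    }
  }
}
```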


NOTE

Currently, the library only supports virtual environments implemented with the Unity IDE and the HTC Vive Pro Eye virtual reality system. An extension to another IDE or other VR systems is possible at any time.

There are plans to make this framework also compatible with UnityXR, VRTK and MRToolkit.

Compatibility state


NOTE

The module is implemented with Unity version 2021.3.21f1; its use with other versions is not guaranteed. The instructions are only guaranteed to work on Windows 10 and 11!


Usage

Whenever learning data collected with this module should include eye tracking data (such as the tracking of gameObjects), the eye tracking system needs to be calibrated. This can be done either before starting the application via the SteamVR dashboard, or via a button in the application once all necessary setup settings have been made.

Calibration


NOTE

Calibration needs to be done every time the HMD wearer changes. (Re-)calibration is not compulsory if the user remains the same and the HMD is put back on.


Via the SteamVR dashboard:

  1. Open SteamVR dashboard (click SystemButton)
  2. Turn on Use Eye Tracking
  3. Click on Calibrate
  4. Follow VIVE Pro Eye Setup guide

Via a button in the application:

  1. Start application
  2. Press LaunchCalibration button
  3. Follow VIVE Pro Eye Setup guide

Contributions

Special thanks to Annabell Brocker. The initial idea for this project was created in the scope of her master's thesis.

Install

npm i com.rwth.unity.omilaxr.xapi

Version: 1.0.8
License: none
Unpacked size: 435 kB
Total files: 232

Collaborators

  • slc.mattiussi
  • sgoerzen