Using the Rotational Device Tracker

Vuforia supports rotational head tracking to enable the development of immersive Virtual Reality (VR) experiences. With this functionality you can create apps that track the rotation of a user's head when using a VR viewer; handheld device tracking is also supported.

Rotational Tracking is enabled by the RotationalDeviceTracker class, a specialized Tracker type that reports the rotational pose of a handheld or head-mounted device. The Rotational Device Tracker can also be configured with a rotational pivot offset that adjusts the pose values to coincide with the center of rotation of the user's head or arm.

This article describes the RotationalDeviceTracker API using examples from the native Android and iOS SDKs. For more information on how to use device tracking in Unity, see: Using the MixedRealityController in Unity.

Device Tracking

The DeviceTracker class tracks the rotational pose of a device within a world coordinate system using data from the device's inertial sensors.
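
Vuforia computes this pose internally from the platform's inertial sensors, so apps never integrate sensor data themselves. Purely as an illustration of the underlying idea (this is not Vuforia API), a rotational tracker integrates gyroscope angular-velocity samples into an orientation quaternion:

```cpp
#include <cassert>
#include <cmath>

// Minimal orientation quaternion (w, x, y, z). Illustrative only.
struct Quat { float w, x, y, z; };

Quat multiply(const Quat& a, const Quat& b)
{
    return {
        a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
        a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
        a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
        a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w
    };
}

// Integrate one gyroscope sample (rad/s about x, y, z) over dt seconds
// by composing the pose with the incremental rotation for this step.
Quat integrateGyro(const Quat& pose, float wx, float wy, float wz, float dt)
{
    const float rate = std::sqrt(wx * wx + wy * wy + wz * wz);
    const float angle = rate * dt;
    if (angle < 1e-8f)
        return pose;
    const float s = std::sin(angle * 0.5f) / rate; // sin(angle/2) / |w|
    const Quat delta = { std::cos(angle * 0.5f), wx * s, wy * s, wz * s };
    return multiply(pose, delta);
}
```

A real tracker additionally fuses accelerometer (and often magnetometer) data to correct gyroscope drift; this sketch shows only the integration step.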

The RotationalDeviceTracker, DeviceTrackable, and DeviceTrackableResult types

The RotationalDeviceTracker provides pose updates as a DeviceTrackableResult for a single DeviceTrackable instance. These Trackable Results are obtainable from the State object, either by calling updateState() on the StateUpdater instance obtained from the TrackerManager, or from the State object returned by Renderer.begin().

Obtaining State from the StateUpdater:

const Vuforia::State state = Vuforia::TrackerManager::getInstance().getStateUpdater().updateState();

Rotational Device Tracker Functionality

RotationalDeviceTracker derives from the DeviceTracker class and supports Head Tracking using an internal HeadModel to define the rotational offset of the device in relation to the center of the user's head.

The DeviceTracker API supports the following functionality:

  • Recenter Mode: This capability enables you to reset the origin of the pose heading to the direction that the device is currently facing. After calling this method, subsequent pose results will use the recentered pose as a reference position. Recentering enables you to set a new reference direction for the device without affecting the reported orientation of your device's horizon ( i.e. pitch and roll ).
    • NOTE: Recentering of the pose heading should only be executed when starting and stopping the Device Tracker.
  • Model Correction: A head or hand pivot model can be configured to transform the reported DeviceTrackableResult to reflect the rotational center of the provided model.
    • NOTE: Model Correction should only be defined and set when initializing the DeviceTracker and before starting the Tracker.
  • Pose prediction: When enabled, pose prediction can improve the accuracy of pose results for VR apps by incorporating model correction, device trajectory, and system latency into the rotational pose estimate. Pose prediction is not supported for AR apps.
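
Conceptually, recentering captures the current heading as a new zero reference and reports subsequent yaw relative to it, while pitch and roll pass through unchanged. A minimal self-contained sketch of that idea (illustrative only, not the Vuforia API, which performs this internally):

```cpp
#include <cassert>

// Conceptual sketch of heading recentering: store the current heading as
// the reference direction and report subsequent yaw relative to it.
// Pitch and roll are unaffected and are therefore not modeled here.
struct HeadingRecenter
{
    float referenceYawDeg = 0.0f;

    // Capture the device's current heading as the new zero direction.
    void recenter(float currentYawDeg) { referenceYawDeg = currentYawDeg; }

    // Yaw relative to the recentered reference, wrapped to (-180, 180].
    float relativeYaw(float currentYawDeg) const
    {
        float yaw = currentYawDeg - referenceYawDeg;
        while (yaw <= -180.0f) yaw += 360.0f;
        while (yaw >   180.0f) yaw -= 360.0f;
        return yaw;
    }
};
```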

Two predefined rotational offset models are provided that will configure the DeviceTrackableResult for specific pivot offsets:

  • Head Model: A neck pivot transform model based on ergonomic standards.
  • Handheld Model: An arm pivot transform model based on ergonomic standards.
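
Both models have the same effect on the reported pose: the device rotation is combined with a fixed pivot offset, so the resulting pose rotates about the neck or elbow rather than about the device itself. A self-contained sketch of that correction (the offset value used in the test is illustrative, not Vuforia's ergonomic data):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Rotate v about the Y (up) axis by angleDeg, right-handed.
Vec3 rotateY(const Vec3& v, float angleDeg)
{
    const float a = angleDeg * 3.14159265f / 180.0f;
    const float c = std::cos(a), s = std::sin(a);
    return { c * v.x + s * v.z, v.y, -s * v.x + c * v.z };
}

// Model correction: the device position is the pivot offset rotated by
// the device orientation, so the pose pivots about the neck or elbow
// rather than about the device itself.
Vec3 correctedPosition(const Vec3& pivotOffset, float yawDeg)
{
    return rotateY(pivotOffset, yawDeg);
}
```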
RotationalDeviceTracker Class Diagram

Inertial Device Tracker Workflow

The following code examples show how to:

1. Initializing and starting the tracker
2. Configuring for VR mode
3. Obtaining a Device Trackable Result
4. Applying the trackable pose for rendering
5. Stopping and deinitializing the tracker

Example: Default configuration - Init and start the device tracker.

void init()
{
    // Init Vuforia ...
    int code = Vuforia::init();
    // ..
 
    Vuforia::TrackerManager& trackerManager = Vuforia::TrackerManager::getInstance();
    deviceTracker = static_cast<Vuforia::RotationalDeviceTracker*>(
            trackerManager.initTracker(Vuforia::RotationalDeviceTracker::getClassType()));

    // start the tracker
    deviceTracker->start();
}

Example: Basic VR Mode Configuration - enable pose prediction and head model correction.

void init()
{
    // Init Vuforia ...
    int code = Vuforia::init();
    // ..

    Vuforia::TrackerManager& trackerManager = Vuforia::TrackerManager::getInstance();
    deviceTracker = static_cast<Vuforia::RotationalDeviceTracker*>(
            trackerManager.initTracker(Vuforia::RotationalDeviceTracker::getClassType()));

    // activate pose prediction
    deviceTracker->setPosePrediction(true);

    // activate model correction: default neck model
    deviceTracker->setModelCorrectionMode(deviceTracker->getDefaultHeadModel());

    // start the tracker
    deviceTracker->start();
}
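
With setPosePrediction(true), the tracker extrapolates the orientation slightly ahead of the sensor data to compensate for render latency. As a rough conceptual sketch (illustrative only; Vuforia's internal predictor also accounts for model correction and device trajectory):

```cpp
#include <cassert>
#include <cmath>

// Conceptual only: extrapolate the heading ahead by the expected render
// latency using the current angular velocity, so the rendered view
// matches where the head will be when the frame reaches the display.
float predictYawDeg(float yawDeg, float yawRateDegPerSec, float latencySec)
{
    return yawDeg + yawRateDegPerSec * latencySec;
}
```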


Example: Obtaining the DeviceTrackableResult

void render()
{        
    const Vuforia::State state = Vuforia::TrackerManager::getInstance().getStateUpdater().updateState();

    for (int r = 0; r < state.getNumTrackableResults(); ++r)
    {
        const Vuforia::TrackableResult* trackableResult = state.getTrackableResult(r);
        Vuforia::Matrix34F trackablePose = trackableResult->getPose();
        Vuforia::Matrix44F trackablePoseGLMatrix = Vuforia::Tool::convertPose2GLMatrix(trackablePose);
 
        if (trackableResult->isOfType(Vuforia::DeviceTrackableResult::getClassType())) {
            const Vuforia::DeviceTrackableResult* deviceTrackableResult = 
                    static_cast<const Vuforia::DeviceTrackableResult*>(trackableResult);
     
            // Base device matrix usable for rendering once inverted; stored here for debugging
            deviceMatrix = trackablePoseGLMatrix;
        }
        else {
            // standard trackable: in VR can be interactive targets      
            interactionViewMatrix = trackablePoseGLMatrix;
 
            // Convert between camera space and world space      
            Vuforia::Matrix44F convertCS;
 
            MathUtils::makeRotationMatrix(180.0f, Vuforia::Vec3F(1.0f, 0.0f, 0.0f), convertCS);
            MathUtils::multiplyMatrix(convertCS, interactionViewMatrix, interactionViewMatrix);
 
            isInteractionTargetVisible = true;
        }
    }
}
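
The convertCS step above is a fixed 180-degree rotation about the X axis, which flips the Y and Z axes between Vuforia's camera convention (computer-vision style, Y down) and a typical OpenGL world convention (Y up). MathUtils comes from the Vuforia sample code; a self-contained equivalent of that rotation applied to a point:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// 180-degree rotation about X: (x, y, z) -> (x, -y, -z).
// Built from cos/sin to mirror MathUtils::makeRotationMatrix in the samples.
Vec3 cameraToWorld(const Vec3& v)
{
    const float a = 3.14159265358979f; // 180 degrees in radians
    const float c = std::cos(a), s = std::sin(a); // c = -1, s ~ 0
    return { v.x, c * v.y - s * v.z, s * v.y + c * v.z };
}
```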


Example: Applying the Device Tracker pose for rendering

void render()
{        
    const Vuforia::State state = Vuforia::TrackerManager::getInstance().getStateUpdater().updateState();
 
    // Get last device trackable pose    
    for (int r = 0; r < state.getNumTrackableResults(); ++r)
    {
        const Vuforia::TrackableResult* trackableResult = state.getTrackableResult(r);
        Vuforia::Matrix34F trackablePose = trackableResult->getPose();
        Vuforia::Matrix44F trackablePoseGLMatrix = Vuforia::Tool::convertPose2GLMatrix(trackablePose);
 
        if (trackableResult->isOfType(Vuforia::DeviceTrackableResult::getClassType()))
        {
            const Vuforia::DeviceTrackableResult* deviceTrackableResult = 
                    static_cast<const Vuforia::DeviceTrackableResult*>(trackableResult);
     
            // Base device matrix usable for rendering once inverted
            deviceViewMatrix = trackablePoseGLMatrix;
        }
        else
        {
            // standard trackable: in VR can be interactive targets      
            interactionViewMatrix = trackablePoseGLMatrix;
             
            // Convert between camera space and world space      
            Vuforia::Matrix44F convertCS;
 
            MathUtils::makeRotationMatrix(180.0f, Vuforia::Vec3F(1.0f, 0.0f, 0.0f), convertCS);
            MathUtils::multiplyMatrix(convertCS, interactionViewMatrix, interactionViewMatrix);
 
            isInteractionTargetVisible = true;
        }
    }
 
    // Use device trackable pose for doing the rendering

    // Get projection matrix for this specific eye    
    Vuforia::Matrix44F eyeProjectionMatrix = vrRenderer.getEyeProjectionMatrix(eyeType, NEAR_PLANE, FAR_PLANE);  

    // get the view matrix from the device pose (deviceViewMatrix)
    Vuforia::Matrix44F modelViewMatrix = MathUtils::Matrix44FTranspose(MathUtils::Matrix44FInverse(deviceViewMatrix)); 

    // adjust for IPD shift (transformation device to eye)  
    Vuforia::Matrix44F adjustEye = vrRenderer.getEyeAdjustmentMatrix(eyeType);  
 
    // Combine the matrices to obtain the world to eye transformation   
    MathUtils::multiplyMatrix(adjustEye, modelViewMatrix, modelViewMatrix);  
 
    // render the scene
    renderScene(eyeProjectionMatrix, modelViewMatrix, isInteractionTargetVisible);
}
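
The inversion step matters because the trackable pose maps model space into world space, while rendering needs the opposite direction (world to eye). For a rigid pose [R|t] the inverse is [Rᵀ|−Rᵀt], which is what Matrix44FInverse effectively computes for these matrices. A self-contained sketch of that identity (MathUtils belongs to the Vuforia samples; this reimplements the idea with a row-major layout for clarity):

```cpp
#include <cassert>
#include <cmath>

// Row-major 4x4 rigid transform: rotation R in the upper-left 3x3,
// translation t in the last column, bottom row (0 0 0 1).
struct Mat44 { float m[4][4]; };

Mat44 multiply(const Mat44& a, const Mat44& b)
{
    Mat44 r = {};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                r.m[i][j] += a.m[i][k] * b.m[k][j];
    return r;
}

// Inverse of a rigid transform: [R | t]^-1 = [R^T | -R^T t].
Mat44 rigidInverse(const Mat44& a)
{
    Mat44 r = {};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            r.m[i][j] = a.m[j][i];            // transpose the rotation
    for (int i = 0; i < 3; ++i)
        r.m[i][3] = -(r.m[i][0] * a.m[0][3] +
                      r.m[i][1] * a.m[1][3] +
                      r.m[i][2] * a.m[2][3]); // -R^T t
    r.m[3][3] = 1.0f;
    return r;
}
```

The extra Matrix44FTranspose in the sample converts between the row-major layout used by Vuforia's math helpers and the column-major layout OpenGL expects; it does not change the underlying transform.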


Example: Stop and deinitialize the Device Tracker

void shutDown()
{
    Vuforia::TrackerManager& trackerManager = Vuforia::TrackerManager::getInstance();
 
    // stop the tracker
    deviceTracker->stop();
 
    // Deinit the tracker
    trackerManager.deinitTracker(Vuforia::RotationalDeviceTracker::getClassType());
 
    // Deinit Vuforia
    Vuforia::deinit();
}