What's New in Vuforia Engine 5.5

Vuforia Engine 5.5 introduces the following important improvements:

  • Device Tracking

  • Metal API support on iOS

  • Camera API improvements

In addition, this release includes name changes that reflect the new home of Vuforia Engine: all references to Qualcomm, and to the legacy name “QCAR”, have been replaced with “Vuforia” in the APIs and library files.

Index

  1. AR/VR Support
  2. Device Classes
  3. Viewer Parameters
  4. VR Rendering
  5. Rotational Device Tracking
  6. iOS Metal Support
  7. Core API Changes
  8. Trademark Updates

AR/VR Support

Please note that VR is no longer supported as of the Vuforia Engine SDK 8.5 release, which also made substantial changes to digital eyewear support for handheld devices and HoloLens. See the release notes for more information.

Vuforia Engine 5.5 introduces the following new categories of functionality that support the development of immersive AR/VR applications:

  • Device Tracking: a new high-level Device Tracker class, and the introduction of a Rotational Device Tracker to support head tracking.
  • Stereo Rendering: new Rendering Primitives to provide a simpler renderer agnostic API for AR and VR rendering.
  • Devices: a new abstract class that represents the device hardware. It incorporates existing Optical See-Through Digital Eyewear devices and introduces a new abstraction for AR/VR Viewers, such as Gear VR and Cardboard.
  • Spatial and Temporal Accuracy: more control and accuracy in obtaining spatial (coordinate system) and temporal (timestamp) information, as well as on-demand pose updates.
                      5.0 Release                  New Release
Spatial: Frame of     Camera only                  Camera + World Coordinate
Reference                                          System
Temporal: Timestamp   Internal                     Public
State Update          Pushed at 30 fps             Pushed, and also queryable
                                                   on-demand
Tracking              Camera-based                 Camera-based, and also inertial
                                                   sensor based when using the
                                                   Device Tracker classes
Rendering             Legacy Rendering API for     Rendering Primitives API for
                      AR rendering                 agnostic AR/VR rendering
Device Abstraction    Device and camera specific   Abstracted for all AR/VR devices

Spatial and Temporal Frame of Reference

Spatial Frame of Reference

The 6DOF poses reported by our target trackers are defined in the camera's frame of reference (FOR). This is the conventional way of defining a coordinate system (CS) for computer vision applications. The target pose therefore defines the transformation from the target coordinate system to the camera coordinate system.

Frames of Reference: Camera CS, Target CS, trackable pose
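Because getPose() returns a row-major 3x4 matrix [R | t] defining the target-to-camera transformation, a point expressed in the target coordinate system can be mapped into camera coordinates with a single matrix-vector step. The sketch below is self-contained and uses a hypothetical transformPoint helper rather than the SDK's own Matrix34F type:

```cpp
#include <array>
#include <cassert>

using Matrix34 = std::array<float, 12>; // row-major 3x4 pose: [R | t]
using Vec3     = std::array<float, 3>;

// Hypothetical helper: apply a target-to-camera pose to a point expressed
// in the target coordinate system, yielding camera-frame coordinates.
Vec3 transformPoint(const Matrix34& m, const Vec3& p)
{
    Vec3 out{};
    for (int row = 0; row < 3; ++row)
    {
        out[row] = m[row * 4 + 0] * p[0]
                 + m[row * 4 + 1] * p[1]
                 + m[row * 4 + 2] * p[2]
                 + m[row * 4 + 3]; // translation column
    }
    return out;
}
```

For example, with an identity rotation and a translation of 0.5 m along the camera's Z axis, a point 0.1 m along the target's X axis lands at (0.1, 0, 0.5) in camera coordinates.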

To support VR applications, we have introduced a new world frame of reference and its derived world coordinate system. This coordinate system uses a conventional “GL” frame of reference, which is familiar to 3D and graphics developers. This is the frame of reference for tracking Devices when using the Device Tracker.


New Frames of Reference: World CS, (Device) Target CS



Global and local coordinate systems

Coordinate System definitions are available in the Vuforia.h header file:

/// Types of coordinate frames and frames of reference
enum COORDINATE_SYSTEM_TYPE {
    COORDINATE_SYSTEM_UNKNOWN = 0, ///< Unknown coordinate system
    COORDINATE_SYSTEM_CAMERA  = 1, ///< Pose will be relative to the camera frame of reference
    COORDINATE_SYSTEM_WORLD   = 2, ///< Pose will be relative to the world frame of reference
};
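The camera frame of reference follows the usual computer-vision convention (X right, Y down, Z pointing out of the camera), while the world frame of reference follows the GL convention (X right, Y up, Z toward the viewer). As a sketch only (the exact axis mapping is our assumption, not taken from the SDK headers), moving a vector between the two conventions amounts to negating Y and Z:

```cpp
#include <array>
#include <cassert>

using Vec3 = std::array<float, 3>;

// Convert a vector from the CV camera convention (X right, Y down,
// Z forward) to the GL convention (X right, Y up, Z toward the viewer).
// Applying the same flip twice returns the original vector.
Vec3 cvToGl(const Vec3& v)
{
    return { v[0], -v[1], -v[2] };
}
```

The flip is its own inverse, so the same helper converts in both directions.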

The getCoordinateSystem() method has been introduced to report which base coordinate system a trackable result pose is defined in.

class Vuforia_API TrackableResult : private NonCopyable
{
public:
...
virtual const Matrix34F& getPose() const = 0;
 
/// Returns the base coordinate system defined for the pose
virtual COORDINATE_SYSTEM_TYPE getCoordinateSystem() const = 0;
 
virtual ~TrackableResult() {}
};

This table summarizes the base coordinate system used for each type of TrackableResult pose:

Trackable Result Type                                Base Coordinate System
Object Target Result (Image Target Result,           Camera Coordinate System
  Cylinder Target Result, Multi Target Result)
Virtual Button Result                                Camera Coordinate System
Marker Result                                        Camera Coordinate System
Smart Terrain Result (Prop Result, Surface Result)   Camera Coordinate System
Word Result                                          Camera Coordinate System
Device Trackable Result (parent, abstract)           World Coordinate System
Rotational Device Trackable Result                   World Coordinate System


The DeviceTracker provides a pose for the DeviceTrackable in the World Coordinate System. This differs from other trackers that operate in the camera coordinate system.

Note: The pose of the DeviceTrackable is affected by the application's device orientation (landscape, portrait, etc.) because Vuforia Engine uses the application orientation to define the “up” direction of the physical device.

Temporal Frame of Reference

The DeviceTracker reports poses that are independent of the camera frame, and thus the camera frame time stamp. In the past, Trackable pose results were always synced with the camera frame. In Vuforia Engine 5.5, we have decoupled the TrackableResult from the frame timestamp by providing it with its own timestamp. This enables a TrackableResult timestamp to be generated independently of the camera frame rate.

Timestamp generation:

State Object                         Timestamp definition
Camera Frame                         Capture time
Non-Device Trackable Results         Synced with the camera frame at capture time.*
  (Object Target Result, Image Target Result, Cylinder Target Result,
  Multi Target Result, Virtual Button Result, Marker Result, Prop Result,
  Surface Result, Word Result)
Rotational Device Trackable Result   Queryable; independent of capture time.*

* Note: For specific eyewear devices with pose prediction enabled, the
timestamp may be offset to the predicted time.

The TrackableResult class has been extended to support time stamps as follows:

class Vuforia_API TrackableResult : private NonCopyable
{
public:
..
/// A time stamp that defines when the trackable result was generated
/**
* Value in seconds representing the offset to application startup time.
* The timestamp can be used to compare trackable results.
*/
virtual double getTimeStamp() const = 0;
 
/// Returns the current pose matrix in row-major order
/**
* A pose is defined in a base coordinate system and defines a transformation
* from a target coordinate system to a base coordinate system.
*/
virtual const Matrix34F& getPose() const = 0;
…
};

The Vuforia API also provides a reference time that can be queried from the StateUpdater class:

class StateUpdater : private NonCopyable
{
public:
 
/// Attempts to update the State from latest tracking data and returns it
/**
* Integrates latest available inertial measurements to create the most
* up-to-date State instance. Note that the State created may contain poses
* that are no longer in sync with the last camera frame. Thus this function's
* primary use cases are VR as well as AR on see-through Eyewear devices
* where tight visual registration with a rendered video background is not
* required. On devices where inertial/predictive tracking is not available
* updateState() will simply return the latest camera-based state.
*/
virtual State updateState() = 0;
 
/// Accessor for the last state created from updateState()
 virtual State getLatestState() const = 0;
 
/// Returns the current time stamp
/**
* Returns the current time stamp using the same measurement unit and time
* reference as state objects (e.g. frames, trackable results).
*/
 virtual double getCurrentTimeStamp() const = 0;
};

“Just In Time” State Update

The StateUpdater class was introduced in Vuforia Engine 5.0. It provides improved access to the TrackerManager's State through two methods:

  • updateState(): queries the SDK for a new, updated state,
  • getLatestState(): returns the state retrieved by the last call to updateState().

With Vuforia Engine 5.5, updateState() enables the DeviceTracker to integrate the latest inertial measurements and calculate the latest updated pose.

See: Using the Rotational Device Tracker
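The semantics of the two methods can be illustrated with a minimal mock (the MockStateUpdater and MockState types below are hypothetical stand-ins; the real StateUpdater integrates inertial measurements inside updateState()):

```cpp
#include <cassert>

// Hypothetical stand-in for Vuforia::State, carrying only a timestamp.
struct MockState { double timeStamp; };

// Hypothetical mock of the StateUpdater semantics: updateState() builds a
// fresh state "just in time", while getLatestState() only returns the
// cached result of the previous updateState() call.
class MockStateUpdater
{
    MockState latest{0.0};
    double clock = 0.0;
public:
    void advanceClock(double dt) { clock += dt; }         // simulate time passing
    MockState updateState() { latest = MockState{clock}; return latest; }
    MockState getLatestState() const { return latest; }   // may be stale
};
```

Calling updateState() immediately before rendering yields the most up-to-date pose; getLatestState() never advances the state on its own, so its result can lag behind real time.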

 

Device and EyewearDevice

NOTE: From Vuforia Engine SDK 8.5 onwards, the EyewearDevice class and its related parameters have been simplified. Please refer to the release notes for more information.

Vuforia Engine 5.5 introduces two new classes that provide abstractions of the device hardware that an app is running on.

The Device class is a singleton that allows the developer to set the application MODE to either AR or VR, to select and activate VR Viewers, and also retrieve Rendering Primitives.

The EyewearDevice class is available when a supported eyewear device is detected. The EyewearDevice class allows the developer to control the display mode (e.g. clone or extended), enable and disable predictive tracking on devices that support prediction, and access user calibration functions when running on a see-through eyewear device.

Note: The Device and EyewearDevice classes replace the Eyewear class found in Vuforia Engine 5.0 and earlier.

Class detail

Vuforia Engine defines a single Device class instance to represent the hardware device that the app is running on. This instance may be defined as either a Device (e.g. a smartphone or tablet) or an EyewearDevice (e.g. an optical see-through device).

An EyewearDevice instance will only be provided when Vuforia Engine is initialized with a Digital Eyewear license and a supported model of eyewear hardware is detected. Device types can be identified using the existing getClassType, getType, isOfType methods.

The following class diagram shows the Device class hierarchy:

[Device class hierarchy diagram]

Device class

The Device class enables the configuration of an app's AR/VR modes, as well as stereo rendering for Digital Eyewear devices. This configuration relies on the selected ViewerParameters, which capture the physical properties of either a predefined or a custom-defined viewer. The set of available ViewerParameters is provided by the ViewerParametersList.

The Device class also exposes the available RenderingPrimitives for a given context. These provide all the information needed to correctly configure viewport rendering for both mono or stereo viewport rendering in either Augmented Reality (AR) or Virtual Reality (VR) modes.

See: How To Use Rendering Primitives

The following class diagram shows these classes:


The device API is defined as follows:

/// Vuforia abstract representation of the Device (hardware) that it is running on
class Vuforia_API Device : private NonCopyable
{
public:
 
enum MODE
{
MODE_AR = 0,
MODE_VR
};
 
/// Get the singleton instance
static Device& getInstance();
 
/// Returns the Device class' type
static Type getClassType();
 
/// Returns the Device instance's type
virtual Type getType() const = 0;
 
/// Checks whether the Device instance's type equals or has been
/// derived from a given type
virtual bool isOfType(Type type) const = 0;
 
/// Set the rendering mode to either AR (MODE_AR) or VR (MODE_VR).
/**
* Note: It is not possible to set the mode to AR until a CameraDevice has been initialised.
*/
virtual bool setMode(MODE m) = 0;
 
/// Get the current rendering mode.
virtual MODE getMode() const = 0;
 
/// Set the currently selected viewer to active. Updates available RenderingPrimitives.
virtual void setViewerActive(bool active) = 0;
 
/// Returns true if a viewer is active, false otherwise.
virtual bool isViewerActive() const = 0;
 
/// Get the list of ViewerParameters known to the system.
virtual ViewerParametersList& getViewerList() = 0;
 
/// Select the viewer to use, either with ViewerParameters from the ViewerParametersList or CustomViewerParameters.
virtual bool selectViewer(const ViewerParameters& vp) = 0;
 
/// Returns the ViewerParameters for the currently selected viewer.
virtual ViewerParameters getSelectedViewer() const = 0;
 
/// Returns a copy of the RenderingPrimitives for the current viewer and MODE.
/**
* Note: For AR MODE the RenderingPrimitives will not be valid until a CameraDevice has been initialised.
*/
virtual const RenderingPrimitives getRenderingPrimitives() = 0;
 
};

Working with viewer parameters

Vuforia Engine supports a variety of popular VR Viewers, such as Cardboard and Gear VR. To further improve viewer support, Vuforia Engine 5.5 introduces the ViewerParameters class, which captures the physical dimensions and optical characteristics of these viewers, along with a description of their supported interaction methods. The ViewerParameters class enables developers to declare which viewer their app is using and to then automatically configure the app's rendering and interaction behavior for that viewer type.

To select a given viewer and utilize its ViewerParameters, query the Device class to obtain a ViewerParametersList, and then set the selected viewer as active on the Device.

See: Configuring Viewer Parameters


ViewerParametersList API

/// ViewerParametersList class
/**
* The interface to the list of ViewerParameters that can be selected.
* The list implements STL-like iterator semantics.
*/
class Vuforia_API ViewerParametersList : private NonCopyable
{
public:
/// Get the list of all supported Vuforia Viewers for authoring tools
/**
* Intended only for use in authoring tools (e.g. Unity)
* To get the list of viewers in a Vuforia app you should use
* Device.getViewerList().
*/
static ViewerParametersList& getListForAuthoringTools();
 
/// Set a filter for a 3rd party VR SDK
/**
* Allows the list to be filtered for a specific 3rd party SDK.
* Known SDKs are "GEARVR" and "CARDBOARD".
* To return to the default list of viewers set the filter to the empty string.
*/
virtual void setSDKFilter(const char* filter) = 0;
 
/// Returns the number of items in the list.
virtual size_t size() const = 0;
 
/// Returns the item at the specified index. NULL if the index is out of range.
virtual const ViewerParameters* get(size_t idx) const = 0;
 
/// Returns ViewerParameters for the specified viewer name and manufacturer. NULL if no viewer was matched.
virtual const ViewerParameters* get(const char* name,
const char* manufacturer) const = 0;
 
/// Returns a pointer to the first item in the list.
virtual const ViewerParameters* begin() const = 0;
 
/// Returns a pointer to just beyond the last element.
virtual const ViewerParameters* end() const = 0;
 
};

The ViewerParameters class captures all of the parameters defined for a given Viewer.

/// ViewerParameters class
/**
* Container class for parameters needed to define a VR Viewer
*/
class Vuforia_API ViewerParameters
{
public:
 
enum BUTTON_TYPE
{
BUTTON_TYPE_NONE = 0,
BUTTON_TYPE_MAGNET,
BUTTON_TYPE_FINGER_TOUCH,
BUTTON_TYPE_BUTTON_TOUCH,
};
enum TRAY_ALIGNMENT
{
TRAY_ALIGN_BOTTOM = 0,
TRAY_ALIGN_CENTRE,
TRAY_ALIGN_TOP,
};
 
virtual ~ViewerParameters();
 
/// Copy constructor
ViewerParameters(const ViewerParameters &);
/// Assignment operator
ViewerParameters& operator= (const ViewerParameters &);
 
/// Returns the version of this ViewerParameters.
virtual float getVersion() const;
 
/// Returns the name of the viewer.
virtual const char* getName() const;
 
/// Returns the manufacturer of the viewer.
virtual const char* getManufacturer() const;
 
/// Returns the type of button in the viewer.
virtual BUTTON_TYPE getButtonType() const;
 
/// Returns the distance between the phone screen and the viewer lens' (millimeters)
virtual float getScreenToLensDistance() const;
 
/// Returns the distance between the viewer lens' (millimeters)
virtual float getInterLensDistance() const;
 
/// Returns how the phone sits within the viewer
virtual TRAY_ALIGNMENT getTrayAlignment() const;
 
/// Returns the distance between the lens' and the tray position. (millimeters)
virtual float getLensCentreToTrayDistance() const;
 
/// Returns the number of distortion coefficients specified for the viewer lens'
virtual size_t getNumDistortionCoefficients() const;
 
/// Returns the distortion coefficient at the specified index, 0 if index is out of range.
virtual float getDistortionCoefficient(int idx) const;
 
/// Get field-of-view of the lens'.
/**
* \return a Vector containing the half angles in order
* Outer (ear), Inner (nose), top, bottom
*/
virtual Vec4F getFieldOfView() const;
 
/// Returns true if the viewer contains a magnet, false otherwise.
virtual bool containsMagnet() const;
 
protected:
/// To construct ViewerParameters please use CustomViewerParameters,
/// objects of this type are read-only
ViewerParameters();
 
class Data;
Data* mData;
 
};
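getFieldOfView() returns per-side half angles (outer, inner, top, bottom) rather than a single symmetric field of view. Assuming the angles are expressed in degrees (an assumption on our part), each half angle maps to a frustum half-extent at a chosen near-plane distance, which is the ingredient needed to build an asymmetric projection:

```cpp
#include <cassert>
#include <cmath>

// Convert one per-side half angle (assumed to be in degrees) into the
// frustum half-extent at the given near-plane distance:
//   extent = near * tan(halfAngle)
float halfExtentAtNear(float halfAngleDeg, float nearDist)
{
    const float rad = halfAngleDeg * 3.14159265358979f / 180.0f;
    return nearDist * std::tan(rad);
}
```

Applying this to the four returned angles gives the left, right, top, and bottom extents of a per-eye frustum.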

Developers who want to define their own ViewerParameters can construct a CustomViewerParameters instance. The CustomViewerParameters class extends ViewerParameters and adds setters for the values.

/// Editable container class for parameters needed to define a VR Viewer
class Vuforia_API CustomViewerParameters : public ViewerParameters
{
public:
/// Construct an empty object for the supplied version, name and manufacturer.
CustomViewerParameters(float version, const char* name, const char* manufacturer);
/// Copy constructor
CustomViewerParameters(const CustomViewerParameters&);
 
/// Assignment operator
CustomViewerParameters& operator=(const CustomViewerParameters&);
 
/// Set the type of button in the viewer
virtual void setButtonType(BUTTON_TYPE val);
 
/// Set the distance between the phone screen and the viewer lens' (millimeters)
virtual void setScreenToLensDistance(float val);
 
/// Set the distance between the viewer lens' (millimeters)
virtual void setInterLensDistance(float val);
 
/// Set how the phone sits with in the viewer
virtual void setTrayAlignment(TRAY_ALIGNMENT val);
 
/// Set the distance between the lens' and the tray position. (millimeters)
virtual void setLensCentreToTrayDistance(float val);
 
/// Clear the list of distortion coefficients
virtual void clearDistortionCoefficients();
/// Add a new value to the list of distortion coefficients
virtual void addDistortionCoefficient(float val);
 
/// Set the field-of-view of the lens'
/**
* \param val a Vector containing the half angles in order Outer (ear), Inner (nose), top, bottom
*/
virtual void setFieldOfView(const Vec4F& val);
 
/// Set the flag indicating whether the viewer contains a magnet.
virtual void setContainsMagnet(bool val);
 
};
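The distortion coefficients exposed by ViewerParameters describe the viewer lens distortion. As an illustrative sketch (the exact model Vuforia applies is an assumption here; this is the common radial polynomial form used by Cardboard-style viewer profiles), the coefficients can be evaluated as:

```cpp
#include <cassert>
#include <cmath>

// Evaluate a radial distortion polynomial of the form commonly used for
// Cardboard-style viewer lenses:
//   r_distorted = r * (1 + k1*r^2 + k2*r^4 + ...)
float distortRadius(float r, const float* k, int numCoeffs)
{
    float factor = 1.0f;
    const float r2 = r * r;
    float rPow = r2;
    for (int i = 0; i < numCoeffs; ++i)
    {
        factor += k[i] * rPow;
        rPow *= r2;
    }
    return r * factor;
}
```

With all coefficients zero the lens is treated as distortion-free; positive coefficients push points outwards towards the edge of the lens.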

Example: Configure Device for Mono AR

// By default the Device is set up for mono AR

// Perform Vuforia::CameraDevice initialisation as normal

if (!Vuforia::Device::getInstance().setMode(Vuforia::Device::MODE_AR))
{
    // If this fails, something wasn't set up properly and AR mode can't be initialised
}

Example: Configure Device for AR on a viewer

//
// Populate a UI list with the built-in viewers
//
Vuforia::ViewerParametersList& viewers = Vuforia::Device::getInstance().getViewerList();
for (size_t i = 0; i < viewers.size(); ++i)
{
    const Vuforia::ViewerParameters* params = viewers.get(i);
    // Add to UI
    // params->getName();
    // params->getManufacturer();
}

//
// Select the viewer during Vuforia initialisation
//
// UI returns the index of the ViewerParameters to use
if (!Vuforia::Device::getInstance().selectViewer(*viewers.get(selectedIndex)))
{
    // Error selecting viewer. STOP
}
Vuforia::Device::getInstance().setViewerActive(true);

// Perform Vuforia::CameraDevice initialisation as normal

if (!Vuforia::Device::getInstance().setMode(Vuforia::Device::MODE_AR))
{
    // If this fails, something wasn't set up properly and AR mode can't be initialised
}

Example: Configure Device for VR on a viewer

//
// Populate a UI list with the built-in viewers
//
Vuforia::ViewerParametersList& viewers = Vuforia::Device::getInstance().getViewerList();
for (size_t i = 0; i < viewers.size(); ++i)
{
    const Vuforia::ViewerParameters* params = viewers.get(i);
    // Add to UI
    // params->getName();
    // params->getManufacturer();
}

//
// Select the viewer during Vuforia initialisation
//
// UI returns the index of the ViewerParameters to use
if (!Vuforia::Device::getInstance().selectViewer(*viewers.get(selectedIndex)))
{
    // Error selecting viewer. STOP
}
Vuforia::Device::getInstance().setViewerActive(true);

if (!Vuforia::Device::getInstance().setMode(Vuforia::Device::MODE_VR))
{
    // If this fails, something wasn't set up properly and VR mode can't be initialised
}

EyewearDevice class

The EyewearDevice class replaces the Eyewear class found in prior versions of the SDK; however, this class is now only used for dedicated optical see-through devices such as the Epson BT-200, ODG R-6, and ODG R-7. Developers creating apps for VR Viewers such as Cardboard, Zeiss VR ONE, Merge VR, and Gear VR will not use the EyewearDevice class.

Note: The EyewearCalibrationProfileManager and EyewearUserCalibrator are now retrieved from EyewearDevice rather than the Eyewear class.

EyewearDevice API:

/// Specialization of Device which is provided when Vuforia is running on a dedicated Eyewear device.
class Vuforia_API EyewearDevice : public Device
{
public:
enum ORIENTATION
{
ORIENTATION_UNDEFINED = 0,
ORIENTATION_PORTRAIT,
ORIENTATION_LANDSCAPE_LEFT,
ORIENTATION_LANDSCAPE_RIGHT
};
 
/// Returns the EyewearDevice class' type
static Type getClassType();
 
/// Returns true if the Eyewear device detected has a see-through display.
virtual bool isSeeThru() const = 0;
 
/// Returns true if the Eyewear device has a display for each eye (i.e. stereo), false otherwise.
virtual bool isDualDisplay() const = 0;
 
/// Switch between 2D (duplication/mono) and 3D (extended/stereo) modes on eyewear device.
/**
* \param enable set to true to switch to 3D (stereo) mode or false for 2D (mono) mode
* \return true if successful or false if the device doesn't support this operation.
*/
virtual bool setDisplayExtended(bool enable) = 0;
 
/// Returns true if the Eyewear device display is extended across each eye
virtual bool isDisplayExtended() const = 0;
 
/// Returns true if the Eyewear device dual display mode is only for OpenGL content.
/**
* Some Eyewear devices don't support stereo for 2D (typically Android widget)
* content. On these devices 2D content is rendered to each eye automatically
* without the need for the app to create a split screen view. On such devices
* this method will return true.
*/
virtual bool isDisplayExtendedGLOnly() const = 0;
 
/// Returns the correct screen orientation to use when rendering for the eyewear device.
virtual ORIENTATION getScreenOrientation() const = 0;
 
/// Turn predictive tracking on or off
/**
* Predictive tracking uses device sensors to predict user motion and reduce perceived latency.
* By default predictive tracking is enabled on devices that support this enhancement.
* \param enable set to true to enable predictive tracking or false to disable predictive tracking.
* \return true if successful or false if the device doesn't support this operation.
*/
virtual bool setPredictiveTracking(bool enable) = 0;
 
/// Returns true if predictive tracking is enabled
virtual bool isPredictiveTrackingEnabled() const = 0;
 
/// Get the calibration profile manager.
/**
* Note: Calibration profiles are only relevant to see-through Eyewear devices.
* \return A reference to the calibration profile manager.
*/
virtual EyewearCalibrationProfileManager& getCalibrationProfileManager() = 0;
 
/// Gets the calibrator used for creating custom user calibration experiences for see-thru eyewear.
/**
* \return A reference to the calibrator object
*/
virtual EyewearUserCalibrator& getUserCalibrator() = 0;
 
};


Note: When migrating from the Vuforia Engine 5.0 code using the Eyewear API, the following method name changes are required:

Eyewear method       New EyewearDevice method
isStereoCapable      isDualDisplay
isStereoEnabled      isDisplayExtended
setStereo            setDisplayExtended
isStereoGLOnly       isDisplayExtendedGLOnly
getProfileManager    getCalibrationProfileManager

VR specific Rendering API

Vuforia Engine 5.5 enables the development of immersive Virtual Reality experiences for mobile devices, such as phones and tablets, mounted in VR stereo viewers like Cardboard and Gear VR. These new APIs allow you to render a stereoscopic VR scene on a device screen using two new classes: RenderingPrimitives and Views. RenderingPrimitives configure viewport rendering and lens correction, while Views identify the viewport and render pass to which a given RenderingPrimitive applies.

These new rendering APIs are designed to enable sophisticated mixed reality AR/VR applications that are portable across a variety of device and VR viewer contexts.

RenderingPrimitives, ViewList and Views

The available Views are represented in the following new enum:

enum VIEW
{
    VIEW_SINGULAR,     ///< Identifier for the singular screen on a mobile phone or
                       ///  tablet, or the full display in a VR viewer
    VIEW_LEFTEYE,      ///< Identifier for the left display of an HMD, or the
                       ///  left side of the screen when docked in a VR viewer
    VIEW_RIGHTEYE,     ///< Identifier for the right display of an HMD, or the
                       ///  right side of the screen when docked in a VR viewer
    VIEW_POSTPROCESS,  ///< Identifier for the post-processing step of VR
                       ///  rendering, where the distorted scene is rendered to
                       ///  the screen
    VIEW_COUNT         ///< Max possible number of views
};

Developers set the Device mode and viewer; Vuforia Engine then calculates the views required for that mode, and the RenderingPrimitives needed for those views.

Usage

The following example shows how the ViewList and RenderingPrimitives can be used to enable the same rendering loop to be employed for mono AR, stereo AR, mono VR and stereo VR cases.

Vuforia::ViewList& viewList = renderingPrimitives->getRenderingViews();

// The 'postprocess' view is a special one that indicates that a
// distortion postprocess is required. If this is present, then we need
// to prepare an off-screen buffer to support the distortion
if (viewList.contains(Vuforia::VIEW_POSTPROCESS))
{
    Vuforia::Vec2I textureSize =
        renderingPrimitives->getDistortionTextureSize(Vuforia::VIEW_POSTPROCESS);
    // ... prepare off-screen buffer for rendering into ...
}

// Iterate over the ViewList
for (size_t viewIdx = 0; viewIdx < viewList.getNumViews(); viewIdx++)
{
    Vuforia::VIEW vw = viewList.getView(viewIdx);

    // Any post processing is a special case that will be completed after
    // the main render loop, so it does not imply any rendering here
    if (vw == Vuforia::VIEW_POSTPROCESS)
    {
        continue;
    }

    // Set up the viewport
    Vuforia::Vec4I viewport;
    if (distortForViewer)
    {
        // We're doing distortion via an off-screen buffer, so the viewport
        // is relative to that buffer
        viewport = renderingPrimitives->getDistortionTextureViewport(vw);
    }
    else
    {
        // We're writing directly to the screen, so the viewport is relative to
        // the screen
        viewport = renderingPrimitives->getViewport(vw);
    }
    // ... use OpenGL or Metal to set the viewport ...

    if (drawVideo)
    {
        Vuforia::Matrix34F vbProjection =
            renderingPrimitives->getVideoBackgroundProjectionMatrix(vw,
                Vuforia::COORDINATE_SYSTEM_CAMERA);
        const Vuforia::Mesh& vbMesh = renderingPrimitives->getVideoBackgroundMesh(vw);
        // ... render the video background using OpenGL or Metal ...
    }

    // Retrieve the projection matrix to use for the augmentation
    Vuforia::Matrix34F projectionMatrix =
        renderingPrimitives->getProjectionMatrix(vw, Vuforia::COORDINATE_SYSTEM_CAMERA);

    // Get the eye adjustment to apply based on the eye position
    Vuforia::Matrix34F eyeAdjustmentGL =
        renderingPrimitives->getEyeDisplayAdjustmentMatrix(vw);

    // ... iterate over trackables and render the augmentation ...
}

// As a final step, perform the viewer distortion if required
if (distortForViewer)
{
    Vuforia::Vec4I screenViewport =
        renderingPrimitives->getViewport(Vuforia::VIEW_POSTPROCESS);
    const Vuforia::Mesh& distoMesh =
        renderingPrimitives->getDistortionTextureMesh(Vuforia::VIEW_POSTPROCESS);
    // ... perform the viewer distortion ...
}

Impact on existing Vuforia Engine apps:

None of Vuforia Engine's existing rendering APIs have been deprecated in 5.5 as a result of these new APIs.

Use of RenderingPrimitives is strongly encouraged to enable apps to be easily ported across mobile phones, drop-in eyewear, and dedicated see-through eyewear.

Note: The drawVideoBackground() method now uses a different mechanism to compute viewport parameters. This change was implemented to enable support for both iOS Metal and OpenGL. Any existing Vuforia Engine apps that call drawVideoBackground() will need to set their own viewport before rendering an augmentation.
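As a sketch of what such an app might now do, the helper below (hypothetical, not part of the SDK) computes a centered viewport that fits the camera frame inside the screen without stretching:

```cpp
#include <array>
#include <cassert>

// Hypothetical helper: compute a centered viewport {x, y, width, height}
// that fits a camera frame of the given aspect ratio inside the screen
// without stretching (letterboxed/pillarboxed).
std::array<int, 4> centeredViewport(int screenW, int screenH, float camAspect)
{
    int w = screenW;
    int h = static_cast<int>(screenW / camAspect);
    if (h > screenH)
    {
        h = screenH;
        w = static_cast<int>(screenH * camAspect);
    }
    return { (screenW - w) / 2, (screenH - h) / 2, w, h };
}
```

The result can then be passed to glViewport() in OpenGL, or to the equivalent viewport call in Metal, before rendering the augmentation.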

 

Rotational Device Tracker and Head Tracking

Vuforia Engine now supports rotational head tracking to enable the development of immersive Virtual Reality (VR) experiences. This functionality allows the creation of apps that can track the rotation of a user's head, when using a VR viewer, and also supports handheld device tracking.

Rotational Tracking is enabled by the RotationalDeviceTracker class, which is a specialized Tracker type that reports the rotational pose of a handheld or head-mounted device. The Rotational Device Tracker can also be configured with a rotational pivot offset to adjust the pose values to coincide with the center of rotation of a user's head or arm.

NOTE: The Rotational Device Tracker is no longer supported from Vuforia Engine SDK 8.5. Instead, refer to the Positional Device Tracker article for 6DOF tracking on eyewear devices.


The RotationalDeviceTracker class:

/**
*  The RotationalDeviceTracker tracks a device in the world by relying on
*  sensor tracking. The RotationalDeviceTracker publishes device trackable
*  results. Device trackable results are in the world coordinate system
*  and use physical units (meters).
*  A rotational device tracker can use model correction to improve the
*  returned pose based on the usage context (e.g. on the head for head
*  tracking, holding the device in your hands for handheld tracking, etc.).
*  This tracker also supports a pose prediction mode to improve the quality
*  of the returned pose. You should only use this mode in a VR configuration!
*/

class Vuforia_API RotationalDeviceTracker : public DeviceTracker
{
public:

    /// Returns the Tracker class' type
    static Type getClassType();

    /// Reset the current pose.
    /**
    *  Resets the current pose heading in the world coordinate system.
    *  Useful if you want to reset the direction the device is pointing to
    *  without impacting the current pitch or roll angle (your horizon).
    */
    virtual bool recenter() = 0;

    /// Enable or disable pose prediction
    /**
    *  Enables pose prediction to improve the tracked position.
    *  Using this mode is recommended for VR experiences.
    */
    virtual bool setPosePrediction(bool mode) = 0;

    /// Get the current pose prediction mode
    /**
    *  By default prediction is off.
    */
    virtual bool getPosePrediction() const = 0;

    /// Enable usage of a model correction for the pose
    /**
    *  Specifies a correction mode for the returned pose.
    *  Correction modes are based on transform models, which define the
    *  usage context of the tracker.
    *  For example, if you use the device tracker for head tracking (VR),
    *  you can set a HeadTransformModel on the tracker and the pose will be
    *  adjusted accordingly. The rotational device tracker supports two
    *  transform models:
    *  - HeadTransformModel: for head tracking (VR, rotational AR experiences)
    *  - HandheldTransformModel: for handheld tracking.
    *  By default no transform model is used.
    *  Passing NULL as the argument disables the use of model correction.
    */
    virtual bool setModelCorrection(const TransformModel* transformationModel) = 0;

    /// Get the current correction model
    /**
    *  Returns the currently set transform model used for correction.
    *  By default no transform model is used, and NULL is returned.
    */
    virtual const TransformModel* getModelCorrection() const = 0;

    /// Return the default head transform model
    /**
    *  utility method to get the recommended Head model. 
    *  Unit is in meter.
    */
    virtual const HeadTransformModel* getDefaultHeadModel() const = 0;

    /// Returns the default handheld transform model
    /**
    *  Utility method to get the recommended handheld model.
    *  Units are in meters.
    */
    virtual const HandheldTransformModel* getDefaultHandheldModel() const = 0;
};
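As a sketch of how these methods fit together (assuming the usual Vuforia pattern of obtaining trackers through TrackerManager; the exact initialization calls may differ by SDK version), a rotational device tracker could be configured for VR head tracking like this:

```cpp
#include <Vuforia/TrackerManager.h>
#include <Vuforia/RotationalDeviceTracker.h>

// Sketch only: assumes the tracker has already been initialized elsewhere
// and is retrieved through TrackerManager, as with other Vuforia trackers.
void configureHeadTracking()
{
    Vuforia::TrackerManager& mgr = Vuforia::TrackerManager::getInstance();
    Vuforia::RotationalDeviceTracker* tracker =
        static_cast<Vuforia::RotationalDeviceTracker*>(
            mgr.getTracker(Vuforia::RotationalDeviceTracker::getClassType()));
    if (tracker == 0)
        return;

    // Smooth the pose for VR rendering; prediction is off by default.
    tracker->setPosePrediction(true);

    // Correct the rotational pose with the recommended neck pivot model.
    tracker->setModelCorrection(tracker->getDefaultHeadModel());

    // Later, e.g. when the user re-orients: reset the heading without
    // touching pitch or roll.
    // tracker->recenter();
}
```

This mirrors the order implied by the API above: enable prediction, then attach a transform model, and call recenter() only in response to a user action.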

DeviceTrackable class

This is a Trackable type specific to Device tracking.

/// DeviceTrackable class.
/**
* The DeviceTrackable defines the trackable used by DeviceTrackers.
*/
class Vuforia_API DeviceTrackable : public Trackable
{
public:
    /// Returns the Trackable class' type
    static Type getClassType();
};

DeviceTrackableResult class

This is a TrackableResult specific to DeviceTrackables.

/// DeviceTrackableResult class.
/**
* The DeviceTrackableResult defines the trackable results returned
* by DeviceTrackers.
*/
class Vuforia_API DeviceTrackableResult : public TrackableResult
{
public:
    /// Returns the TrackableResult class' type
    static Type getClassType();

    /// Returns the corresponding Trackable that this result represents
    virtual const DeviceTrackable& getTrackable() const = 0;
};

TransformModel class

The TransformModel defines which pivot offset model should be used by the RotationalDeviceTracker.

/// TransformModel class.
/**
* The TransformModel defines a domain-specific model
* that can be used by a DeviceTracker. The model defines
* a specific transformation representing a tracked scenario.
*/
class Vuforia_API TransformModel : private NonCopyable
{
public:
    enum TYPE {
        TRANSFORM_MODEL_HEAD,
        TRANSFORM_MODEL_HANDHELD,
        INVALID
    };

    /// Returns the TransformModel instance's type
    virtual TYPE getType() const = 0;

    /// Destructor
    virtual ~TransformModel();

private:
    TransformModel& operator=(const TransformModel& other);
};

HeadTransformModel class

This is a specialized TransformModel configured for rotational tracking about the neck.

/// HeadTransformModel class.
/**
* The HeadTransformModel defines a head model mainly intended for a
* 3DOF tracker (rotation only) used in a head tracking context.
* It supports a pivot model, representing the neck pivot point
* relative to the tracked pose, that can be used to correct
* the pose provided by the tracker.
* The pivot point (a 3D vector) is used to correct the current
* estimated rotation, taking the actual point of rotation into account.
* For a head model this corresponds to the neck pivot.
* The default value is based on average anthropometric
* measurements.
*/

class Vuforia_API HeadTransformModel : public TransformModel
{

public:
/// Returns the TransformModel instance's type
virtual TYPE getType() const;

/// Constructor.
HeadTransformModel();

/// Copy constructor.
HeadTransformModel(const HeadTransformModel& other);

/// Define a Head Transform Model with a pivot point
HeadTransformModel(const Vec3F& pivotPos);

/// Set the Pivot Point
virtual bool setPivotPoint(const Vec3F& pivot);

/// Get the Pivot Point
virtual Vec3F getPivotPoint() const;

/// Destructor
virtual ~HeadTransformModel();

protected:
Vec3F pivotPosition;
};
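To make the pivot correction concrete, here is a small self-contained sketch (plain C++, not SDK code; `Vec3`, `rotateY`, and `pivotTranslation` are hypothetical stand-ins defined here). A 3DOF tracker reports rotation only; the head model supplies a neck pivot offset, and the translation induced by rotating about that pivot can be recovered as R·offset − offset:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical stand-ins for illustration; not part of the Vuforia API.
struct Vec3 { float x, y, z; };

// Rotate v about the Y (up) axis by 'angle' radians.
static Vec3 rotateY(const Vec3& v, float angle)
{
    const float c = std::cos(angle), s = std::sin(angle);
    return { c * v.x + s * v.z, v.y, -s * v.x + c * v.z };
}

// Translation of a device sitting at 'offset' from the pivot (e.g. the eyes
// relative to the neck) after a pure rotation by 'yaw' about that pivot.
static Vec3 pivotTranslation(const Vec3& offset, float yaw)
{
    const Vec3 r = rotateY(offset, yaw);
    return { r.x - offset.x, r.y - offset.y, r.z - offset.z };
}
```

With a zero rotation the correction vanishes; turning the head half a circle moves a device 10 cm in front of the pivot by 20 cm along the z axis. This is only the geometric idea: once a transform model is set, the SDK applies the equivalent correction internally.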

HandheldTransformModel class

This is a specialization of the TransformModel configured to accommodate rotational tracking of a handheld device about a body-centered pivot point.

/// HandheldTransformModel class.
/**
*  The HandheldTransformModel defines a handheld model mainly intended
*  for a 3DOF tracker (rotation only) used in a mobile tracking context.
*  A mobile tracking context corresponds to a scenario where the user
*  moves a mobile device around their body, mainly executing a 3D motion
*  that can be fitted to a spherical model.
*  It supports a pivot model, representing the center of rotation when
*  moving the device around, relative to the tracked pose, that can be
*  used to correct the pose provided by the tracker.
*  The pivot point (a 3D vector) is used to correct the current
*  estimated rotation, taking the actual point of rotation into account.
*  For a handheld model this corresponds to the body pivot.
*  The default value is based on average anthropometric
*  measurements.
*/
class Vuforia_API HandheldTransformModel : public TransformModel
{
public:
    /// Returns the TransformModel instance's type
    virtual TYPE getType() const;
 
    /// Constructor.
    HandheldTransformModel();
 
    /// Copy constructor.
    HandheldTransformModel(const HandheldTransformModel& other);
 
    /// Define a Handheld Transform Model with a pivot point
    HandheldTransformModel(const Vec3F& pivotPos);
 
    /// Set the Pivot Point
    virtual bool setPivotPoint(const Vec3F& pivot);
 
    /// Get the Pivot Point
    virtual Vec3F getPivotPoint() const;

    /// Destructor
    virtual ~HandheldTransformModel();
 
protected:
    Vec3F pivotPosition;
};

Metal Support

Apple's Metal rendering API is now supported for iOS devices running iOS 8 and above when using Vuforia Engine 5.5 and above. See: Support for iOS Metal. The METAL initialization flag is defined in the Vuforia.h header file.

enum INIT_FLAGS {
    GL_11 = 1, ///< Enables OpenGL ES 1.1 rendering
    GL_20 = 2, ///< Enables OpenGL ES 2.0 rendering
    METAL = 4, ///< Enables Metal rendering, available on Apple platforms
};

In order to utilize the Metal API in your app, you will need to pass the METAL INIT_FLAGS value to the SDK using the Vuforia::setInitParameters method.

Vuforia::setInitParameters(Vuforia::METAL,"");

Note that Vuforia::setInitParameters() must be called before Vuforia::init() begins the initialization of Vuforia Engine. The correct call sequence for initializing your app to use Metal is demonstrated in the ImageTargetsMetal Advanced Topics sample project for iOS. Metal API support can be configured by assigning the value Vuforia::METAL to the mVuforiaInitFlags variable on line 144 of SampleApplicationSession.mm.

Vuforia::setInitParameters(mVuforiaInitFlags,""); 

mVuforiaInitFlags can be defined when calling initAR or later in initVuforiaInBackground in the sample code.
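Putting the pieces together, a minimal Metal initialization sequence might look like the following sketch. It assumes the standard asynchronous pattern used by the iOS samples, where Vuforia::init() is polled until it reports 100 percent progress (or a negative error value); error handling is trimmed for brevity:

```cpp
#include <Vuforia/Vuforia.h>

// Sketch: initialize Vuforia Engine with the Metal renderer.
bool initVuforiaWithMetal(const char* licenseKey)
{
    // Must be called before Vuforia::init().
    Vuforia::setInitParameters(Vuforia::METAL, licenseKey);

    // init() performs incremental work; it returns progress in percent,
    // or a negative value on error.
    int progress = 0;
    do {
        progress = Vuforia::init();
    } while (progress >= 0 && progress < 100);

    return progress == 100;
}
```

In the samples this loop typically runs on a background thread so the UI stays responsive during initialization.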

Rendering API changes

Vuforia Renderer APIs have been updated to use an abstract base class over the underlying rendering APIs (e.g. OpenGL ES and Metal).

class Vuforia_API Renderer : private NonCopyable
{
public:
    ...

    /// Marks the beginning of rendering for the current frame and returns the
    /// State object.
    /**
    *  Returns the latest available State object. This state object may hold
    *  predicted Trackable poses if predicted tracking is turned on and
    *  available on the device. Please see Eyewear::setPredictiveTracking()
    *  for more details. Must only be called from the render thread.
    *
    *  renderData is a pointer to 3D graphics rendering API-specific data,
    *  which is not required for all APIs (such as OpenGL ES).
    */
    virtual State begin(const RenderData* renderData = 0) = 0;

    /// Marks the beginning of rendering for the given frame.
    /**
    *  Use this to draw a specific camera frame, rather than the latest
    *  available one. Must only be called from the render thread.
    *
    *  renderData is a pointer to 3D graphics rendering API-specific data,
    *  which is not required for all APIs (such as OpenGL ES).
    */
    virtual void begin(State state, const RenderData* renderData = 0) = 0;

    /// Marks the end of rendering for the current frame.
    /**
    *  Must only be called from the render thread.
    *
    *  renderData is a pointer to 3D graphics rendering API-specific data,
    *  which is not required for all APIs (such as OpenGL ES).
    */
    virtual void end(const RenderData* renderData = 0) = 0;

    /// Updates the video background texture and optionally attaches it to a
    /// given GPU texture unit.
    /**
    *  textureUnit is a pointer to a 3D graphics rendering API-specific
    *  identifier (GL texture unit or Metal texture index). Pass NULL if you
    *  do not wish Vuforia to attach the updated texture.
    *
    *  This should only be called between begin() and end() calls.
    *  Must only be called from the render thread.
    */
    virtual bool updateVideoBackgroundTexture(const TextureUnit* textureUnit = 0) = 0;

    /// Passes texture data to Vuforia to use when updating the video background
    /**
    *  Use this in conjunction with updateVideoBackgroundTexture. Must only be
    *  called from the render thread.
    *
    *  textureData is a reference to 3D graphics rendering API-specific texture
    *  data, such as GLTextureData.
    */
    virtual bool setVideoBackgroundTexture(const TextureData& textureData) = 0;

    ...
};

Note: The bindVideoBackground method has been changed to updateVideoBackgroundTexture to more accurately represent the function's role.
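Under the new abstraction a typical render pass is API-agnostic at this level. The following sketch uses only the methods shown above, taking the OpenGL ES case where the renderData argument may be omitted (Renderer::getInstance() follows the usual Vuforia singleton pattern):

```cpp
#include <Vuforia/Renderer.h>
#include <Vuforia/State.h>

// Sketch of a per-frame render pass; must run on the render thread.
void renderFrame()
{
    Vuforia::Renderer& renderer = Vuforia::Renderer::getInstance();

    // Begin the frame and obtain the latest tracking state.
    Vuforia::State state = renderer.begin();

    // Let Vuforia update the video background texture; passing a non-NULL
    // TextureUnit would ask Vuforia to attach it to that unit as well.
    renderer.updateVideoBackgroundTexture();

    // ... draw the video background and augmentations using 'state' ...

    renderer.end();
}
```

With Metal, the same structure applies but a RenderData pointer carrying Metal-specific data would be passed to begin() and end().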

Changes to core APIs

CameraDevice changes

In the CameraDevice class in CameraDevice.h, the enum CAMERA has been changed to enum CAMERA_DIRECTION to more accurately reflect its purpose and to harmonize this enum's name with the Unity API.

enum CAMERA_DIRECTION
{
    CAMERA_DIRECTION_DEFAULT,      ///< Camera direction representing the default
                                   ///  camera on the current device. Used only for
                                   ///  camera initialization.
    CAMERA_DIRECTION_BACK,         ///< The camera is facing in the opposite direction
                                   ///  as the screen
    CAMERA_DIRECTION_FRONT,        ///< The camera is facing in the same direction as  
                                   ///  the screen
};

Consequently, CameraDevice initialization has been changed to use this new enum type.
Another change is the addition of a utility function to query the direction of an initialized CameraDevice:

virtual CAMERA_DIRECTION getCameraDirection() = 0;

These changes are now reflected in the CameraDevice class:

/// Implements access to the phone's built-in camera
class Vuforia_API CameraDevice : private NonCopyable
{
public:
    ...
    enum CAMERA_DIRECTION
    {
        CAMERA_DIRECTION_DEFAULT,  ///< Camera direction representing the default
                                   ///  camera on the current device. Used only for
                                   ///  camera initialization.
        CAMERA_DIRECTION_BACK,     ///< The camera is facing in the opposite direction
                                   ///  as the screen
        CAMERA_DIRECTION_FRONT,    ///< The camera is facing in the same direction as 
                                   ///  the screen
    };
 
    /// Initializes the camera.
    virtual bool init(CAMERA_DIRECTION camera = CAMERA_DIRECTION_DEFAULT) = 0;
 
    /// Returns the camera's direction
    virtual CAMERA_DIRECTION getCameraDirection() = 0;
    ...
};
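For example, using only the calls shown above (CameraDevice::getInstance() follows the usual Vuforia singleton pattern; this is a sketch, not sample code), an app can initialize the front camera and confirm its direction afterwards:

```cpp
#include <Vuforia/CameraDevice.h>

// Sketch: initialize the front-facing camera and verify its direction.
bool startFrontCamera()
{
    Vuforia::CameraDevice& camera = Vuforia::CameraDevice::getInstance();

    if (!camera.init(Vuforia::CameraDevice::CAMERA_DIRECTION_FRONT))
        return false;

    // New in 5.5: query the direction of the initialized camera.
    return camera.getCameraDirection()
        == Vuforia::CameraDevice::CAMERA_DIRECTION_FRONT;
}
```

Querying the direction is useful on devices where CAMERA_DIRECTION_DEFAULT was requested and the app needs to know which camera was actually selected.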

Const correctness

Several functions have been refactored into const functions to enable better compiler optimization and to indicate that they neither change internal SDK state nor allow such changes.

Revised functions:

/// In CameraDevice.h
virtual int getNumVideoModes() const = 0;
virtual VideoMode getVideoMode(int nIndex) const = 0;
virtual CAMERA_DIRECTION getCameraDirection() const = 0;

/// In EyewearUserCalibrator.h
virtual float getMinScaleHint() const = 0;
virtual float getMaxScaleHint() const = 0;
virtual bool isStereoStretched() const = 0;

/// In ImageTargetBuilder.h
virtual FRAME_QUALITY getFrameQuality() const = 0;

/// In TargetFinder.h
virtual int getInitState() const = 0;
virtual bool isRequesting() const = 0;

Trademark and legacy reference updates

References to Qualcomm have been revised throughout the SDK libraries:

  • The old name “QCAR” has been changed to “Vuforia” in:

    • the SDK’s namespace

    • macros

    • include directory paths

    • function names

  • The Qualcomm-specific copyright header in all include and license files has been updated to reflect the new PTC ownership while continuing to respect Qualcomm's copyright, in accordance with legal guidance