The Device Pose Observer provides tracking information on the position and orientation of the device in the real world. The pose is computed from camera frames of the environment and from sensor measurements. Use the Device Pose Observer in your application to track the device in relation to the real world. Device Tracking uses Vuforia Fusion to detect and use the platform’s native tracker where available; otherwise, it falls back to Vuforia’s own sensor-fusion technology.
Device Pose Observer
The Device Pose Observer tracks the device with respect to the environment in meters. It can be configured and created from the Vuforia Engine, and it reports the pose, the pose status, and additional status info for each observation. See Device Tracking API Overview for more information on the API. See also Device Tracking in Unity for instructions on enabling device tracking in a Unity project.
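The create-and-observe flow can be pictured as follows. This is an illustrative Python sketch, not the actual Vuforia Engine API (which is exposed in C and in Unity); all names here (`PoseStatus`, `DevicePoseObserver`, `on_frame`) are hypothetical stand-ins for the concepts described above.

```python
from enum import Enum

class PoseStatus(Enum):
    # Mirrors the kinds of statuses a pose observation can report
    NO_POSE = 0
    LIMITED = 1
    TRACKED = 2
    EXTENDED_TRACKED = 3

class DevicePoseObserver:
    """Hypothetical stand-in: reports the device pose per camera frame."""
    def __init__(self):
        self.status = PoseStatus.NO_POSE
        self.pose = None  # ((x, y, z) in meters, rotation quaternion)

    def on_frame(self, pose, status):
        # In the real engine the pose is computed from camera frames and
        # sensor measurements; here it is supplied directly for illustration.
        self.pose, self.status = pose, status

observer = DevicePoseObserver()
observer.on_frame(((0.0, 1.4, 0.0), (0.0, 0.0, 0.0, 1.0)), PoseStatus.TRACKED)
assert observer.status is PoseStatus.TRACKED
```

The key point is that consumers read both the pose and its status from each observation, never the pose alone.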
Device Pose and Vuforia Fusion
Device Tracking leverages Vuforia Fusion to detect the platform the application is running on. Depending on the platform, the Device Pose Observer can benefit from ARKit, ARCore, VISLAM, or SLAM to track the device’s position in relation to its environment. See Vuforia Fusion for more detailed information.
The Device Pose Observer also enables extended tracking for all Vuforia Target types and is required for Ground Plane and Area Target tracking. While other Observers track images or objects, the Device Pose Observer uses visual features of the environment captured by the camera and, in some cases, the built-in inertial measurement unit (IMU), to determine the six-degree-of-freedom pose of the device itself.
Device Tracking dependent Vuforia features
Some Vuforia target types are dependent on the device pose and cannot function without creating and initializing it. This is the case for Area Targets and Ground Plane:
Area Targets are tracked using the Area Target Observer, which only reports observations when there is an active Device Pose Observer. Area Targets are built from scanned spaces and are ideal for tracking environments ranging from small rooms to large spaces. For more information, please see Area Target API Overview.
The Ground Plane feature utilizes the Device Pose Observer to maintain a position and orientation in world space for the device and for anchors. The device pose information allows you to locate and track horizontal surfaces and to create anchor points on surfaces and in mid-air. Visit the Ground Plane User Guide for more information.
Model Targets can work without device poses, but enabling the Device Pose Observer is highly recommended, as it improves and stabilizes tracking of the object.
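The dependency described above can be pictured as a simple gate: a feature that requires the device pose yields no observations unless a Device Pose Observer is active. This is an illustrative Python sketch with hypothetical names, not the real Observer API.

```python
def area_target_observations(device_pose_active: bool, detections: list) -> list:
    """Area Target results are only reported while a Device Pose Observer
    is active; without one, nothing is observed at all."""
    return detections if device_pose_active else []

# No device pose observer: even a detected space yields no observations.
assert area_target_observations(False, ["scanned_room"]) == []
# With an active device pose observer, observations flow through.
assert area_target_observations(True, ["scanned_room"]) == ["scanned_room"]
```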
Continuous tracking allows users to resume an AR session after an interruption or pause by using stored poses from the Device Pose Observer. See Continued AR Experiences for details and platform-dependent behavior.
In some scenarios, the device may be statically mounted (e.g. on a tripod) instead of being handheld and in motion. In such scenarios, there is a risk that the Device Pose Observer will not be able to initialize due to lack of device motion.
Such a scenario could be an assembly line where devices are fixed on a stand while providing instructions to the worker, or a standing tablet that tracks toys or board games from a single position. To enable device tracking on stationary devices, you can set a static hint on the Device Pose Observer. See Device Tracking API Overview and Enable the Track Device Pose in Unity for details.
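The effect of the static hint can be sketched as follows. This is an illustrative Python mock with hypothetical names; the actual hint is set through the Vuforia Engine API or in Unity, as referenced above.

```python
class DevicePoseObserver:
    """Hypothetical stand-in; the 'static hint' mirrors the option
    described above for statically mounted devices."""
    def __init__(self, static_hint: bool = False):
        self.static_hint = static_hint

    def can_initialize(self, device_is_moving: bool) -> bool:
        # Without the hint, initialization relies on device motion;
        # with it, a stationary device (e.g. on a tripod) can also start.
        return device_is_moving or self.static_hint

# A mounted, motionless device fails to initialize without the hint...
assert not DevicePoseObserver().can_initialize(device_is_moving=False)
# ...but succeeds once the static hint is set.
assert DevicePoseObserver(static_hint=True).can_initialize(device_is_moving=False)
```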
Extended Tracking
Extended Tracking is the concept that a target’s pose information remains available even when the target is no longer in the field of view of the camera, is occluded, or cannot be tracked directly for other reasons. Extended Tracking utilizes the device pose to improve tracking performance and sustain tracking even when the target is no longer in view.
In practice this means that after you point your device away from the initial target, any augmentations maintain their positions with respect to the real world and remain consistent with the initial reference frame defined by the target. The more detailed and feature-rich the environment, the better Extended Tracking works. Area Targets and Ground Plane anchors always use Extended Tracking.
A depiction of how augmentations and tracking are sustained even when the camera view moves away from the Image Target.
While observing a target, users will likely find positions where tracking is limited or even lost. To account for the different states that target tracking can be in, different tracking statuses are reported in the state. Retrieve the pose status and status info from the state and improve the AR experience by, for example, notifying users to return to a better viewing position when tracking quality is degraded. See Status Poses and Status Info for detailed information.
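A typical pattern is to map the reported status to an optional user-facing hint. This is a hedged Python sketch: the status names echo those in this document, but the mapping function and messages are hypothetical, not part of the Vuforia API.

```python
def tracking_hint(status: str):
    """Map a reported pose status to an optional user-facing hint.
    Returns None when no hint is needed (e.g. while fully tracked)."""
    hints = {
        "LIMITED": "Tracking quality is degraded - move back to a "
                   "better viewing position.",
        "NO_POSE": "Point the camera at the environment to start tracking.",
    }
    return hints.get(status)

assert tracking_hint("TRACKED") is None          # no message while tracked
assert "better viewing position" in tracking_hint("LIMITED")
```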
Also, there may be situations where AR tracking is interrupted, for instance by a phone call during the experience, and the application is paused. To resume the experience, the application can relocalize itself if the duration of the pause and the device movement between pause and resume were not excessive. See Best Practices for Continued AR Experiences for more information.