Developing Vuforia Engine Apps for HoloLens

Overview

Vuforia Engine enhances the capability of HoloLens by allowing you to connect AR experiences to specific images and objects in the environment. You can use this capability to overlay step-by-step instructions on top of machinery or to add digital features to a physical product.

Enterprise developers can use VuMarks to uniquely identify each piece of machinery on a factory floor, right down to the serial number. Additionally, VuMarks can be scaled into the billions and designed to look just like a company logo. Using Vuforia Model Targets technology, HoloLens can robustly detect your product or machinery as a 3D object in the environment based on its CAD data. Augmented overlays can be authored directly against the CAD model so that your field workers see them right where you placed them.

Existing Vuforia Engine apps built for phones and tablets can be configured in Unity to run on HoloLens. You can even use Vuforia Engine to take your new HoloLens app to Windows 10 tablets such as the Surface Pro 4 and Surface Book.


The Role of Extended Tracking

Extended Tracking maintains tracking even when a target is no longer in view. It is automatically enabled for all targets when the Positional Device Tracker is enabled. For HoloLens applications built in Unity, the Positional Device Tracker is started automatically; in native applications, you need to start it yourself in each app.
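If you do need to start the tracker explicitly, for example when managing the Vuforia lifecycle yourself, a minimal Unity C# sketch might look like the following. The class names used here (VuforiaARController, TrackerManager, PositionalDeviceTracker) are taken from Vuforia 7.x-era Unity SDKs; verify them against the SDK version you are using.

    using UnityEngine;
    using Vuforia;

    // Illustrative only: on HoloLens the Unity integration starts the
    // Positional Device Tracker for you. This sketch shows what the
    // equivalent explicit call looks like against the Vuforia Unity API.
    public class DeviceTrackerStarter : MonoBehaviour
    {
        void Start()
        {
            // Wait until Vuforia Engine has started before touching trackers.
            VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
        }

        void OnVuforiaStarted()
        {
            // Fetch the Positional Device Tracker, initializing it if it does
            // not exist yet, then start it so Extended Tracking is available.
            var deviceTracker = TrackerManager.Instance.GetTracker<PositionalDeviceTracker>()
                                ?? TrackerManager.Instance.InitTracker<PositionalDeviceTracker>();
            if (deviceTracker != null)
            {
                deviceTracker.Start();
            }
        }
    }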

When you enable Extended Tracking on a target, you allow the pose of that target to be passed to the HoloLens Spatial Mapping engine. In this way, targets can exist in both the Vuforia Engine and HoloLens spatial coordinate systems, though not simultaneously.

Vuforia Engine automatically transforms the pose of a target into the HoloLens spatial coordinate system.  This allows HoloLens to take over tracking and to integrate any content augmenting the target into the spatial map of the target’s surroundings.

This process occurs between Vuforia Engine and HoloLens APIs in Unity. Since the process is handled automatically, it does not require any programming by the developer.

The following is a high-level description of the process:

  1. Vuforia Engine’s Tracker recognizes the target.
  2. Target tracking is then initialized.
  3. The position and rotation of the target are analyzed to provide a robust pose estimate for HoloLens.
  4. Vuforia Engine transforms the target's pose into the HoloLens spatial mapping coordinate space.
  5. HoloLens takes over tracking if the target is no longer in view. Whenever the target comes back into view, Vuforia Engine continues to track the images and objects accurately.

Targets that are detected, but no longer in view, are reported as EXTENDED_TRACKED. In these cases, the DefaultTrackableEventHandler script that is used on all targets continues to render augmentation content. The developer can control this behavior by implementing a custom trackable event handler script.
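As a rough illustration of such a custom handler, the sketch below follows the pattern of DefaultTrackableEventHandler but lets you choose how to react to the EXTENDED_TRACKED status. The interface and enum names (ITrackableEventHandler, TrackableBehaviour.Status.EXTENDED_TRACKED) come from Vuforia Unity SDK versions that still report EXTENDED_TRACKED as a distinct status; newer releases handle extended tracking differently, so check against your SDK.

    using UnityEngine;
    using Vuforia;

    // A custom trackable event handler, modeled on DefaultTrackableEventHandler,
    // that lets you decide whether augmentation content stays visible while a
    // target is only extended tracked (i.e. reported as EXTENDED_TRACKED).
    public class ExtendedTrackingEventHandler : MonoBehaviour, ITrackableEventHandler
    {
        // Set in the Inspector: keep content visible when the target leaves view?
        public bool renderWhileExtendedTracked = true;

        private TrackableBehaviour mTrackableBehaviour;

        void Start()
        {
            mTrackableBehaviour = GetComponent<TrackableBehaviour>();
            if (mTrackableBehaviour)
                mTrackableBehaviour.RegisterTrackableEventHandler(this);
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            // DETECTED / TRACKED: the target is in view, so show the content.
            // EXTENDED_TRACKED: the target has left the camera view; show or
            // hide the content depending on the flag above.
            bool show = newStatus == TrackableBehaviour.Status.DETECTED
                     || newStatus == TrackableBehaviour.Status.TRACKED
                     || (renderWhileExtendedTracked
                         && newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED);

            foreach (var rendererComponent in GetComponentsInChildren<Renderer>(true))
                rendererComponent.enabled = show;
            foreach (var colliderComponent in GetComponentsInChildren<Collider>(true))
                colliderComponent.enabled = show;
        }
    }

Attach a script like this to the target's GameObject in place of DefaultTrackableEventHandler to control whether content keeps rendering once HoloLens takes over tracking.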

Developing a Vuforia app for HoloLens

The best way to understand the structure of a Vuforia Engine HoloLens project is to install and build the Vuforia HoloLens sample project. The sample provides a complete HoloLens project that includes pre-configured deployable scenes and project settings. Running the project will give you a starting point and reference for your own Vuforia Engine HoloLens apps.