Working with the HoloLens Sample in Unity

This article demonstrates how to use the Vuforia HoloLens Sample project to author a Vuforia Engine scene for HoloLens and customize event handling to implement unique app behaviors. The sample project contains pre-configured Unity scenes and project settings that act as a starting point and reference for your own apps. For additional background information on working with HoloLens, refer to the Developing Vuforia Apps for HoloLens article.

The supported versions page lists the operating systems, tools, and device versions supported by Vuforia Engine.

Getting Started with Visual Studio and Unity

  1. Ensure that the minimum-supported Visual Studio and Unity versions are installed (refer to the Vuforia Engine Supported Versions page).
    Note: When installing Unity, make sure to install the IL2CPP Scripting Backend.  
  2. Configure the Unity Editor to use Visual Studio as the default IDE.
  3. Install Visual Studios tools for Unity
  4. Optional: Apply the recommended performance options to Unity Engine.

For more information on setting up your Windows 10 build environment, refer to the Developing for Windows 10 in Unity article. 

Getting Started with the HoloLens Sample

Importing the sample

  1. Select your Unity version:
  2. For Unity 2019.1 or earlier: install the latest version of Vuforia Engine.
    For Unity 2019.2 or later: no separate installation is needed; the sample automatically references the correct Vuforia Engine package available in Unity's Package Manager.
  3. Create a new Unity project.
  4. In the Window menu, select Asset Store.
  5. In the search bar, enter Vuforia HoloLens.
  6. Select Vuforia HoloLens Sample.
  7. Click Import.

Building and executing the sample

  1. In the File menu, select Build Settings.
  2. In the Platform section, select Universal Windows Platform and click Switch Platform.
  3. From the Target Device dropdown, select HoloLens.
  4. From the Build Type dropdown, select D3D Project.
  5. From the Target SDK Version dropdown, select Latest Installed.
  6. Click the Player Settings... button.
  7. Select the UWP icon and expand the XR Settings section.
  8. Ensure that Virtual Reality Supported is selected.
  9. Under Virtual Reality SDKs, ensure that Windows Mixed Reality is included in the list and that Enable Depth Buffer Sharing is selected.
  10. Ensure that the Vuforia Augmented Reality Supported checkbox is enabled.
  11. Expand the Publishing Settings section.
  12. Under Capabilities, ensure that InternetClient, WebCam, Microphone, and SpatialPerception are selected.
    Note: Select SpatialPerception only if you intend to use the Surface Observer API.
  13. Under Supported Device Families, ensure that Holographic is selected.
  14. Expand the Resolution and Presentation section.
  15. Disable Run in Background so that Vuforia Engine pauses when the app is put into the background and can access the camera again when the app is resumed.
  16. In the Default Orientation dropdown, ensure that Landscape Left is selected.
  17. Close the Project Settings window.
  18. In the Hierarchy window, select the scene and remove the Main Camera.
  19. In the Game Object menu, select Vuforia Engine > AR Camera.
  20. In the Build Settings window, click Build to generate a Visual Studio project.
  21. In the Windows Explorer dialog that appears, create a new folder to hold Unity's build output. Generally, we name the folder "App".
  22. Select the newly created folder and click Select Folder.
  23. Once Unity has finished building, a Windows Explorer window opens to the project root directory. Navigate into the newly created folder.
  24. Open the generated Visual Studio solution file located inside this folder. The project opens in Visual Studio.
  25. In the Solution Platforms dropdown, select x86.


For more information, refer to the Exporting and building a Unity Visual Studio solution tutorial.

Scene Elements and Configuration

The HoloLens sample uses the ARCamera, ImageTarget, and ModelTarget prefabs. These prefabs are located in the project's Vuforia folder along with the other assets and resources in this sample.

The sample scene hierarchy demonstrates how to set up a Vuforia Engine scene for HoloLens in Unity. You can easily substitute your own content in this scene to create a unique HoloLens app.


Scene Hierarchy

ARCamera GameObject

The ARCamera is the scene camera for Vuforia Engine apps in Unity. The ARCamera defines the properties of both of its child scene cameras, as well as the device camera and rendering behavior of the scene.

In the Vuforia Engine Behaviour script, the World Center Mode dropdown defines which object in the scene hierarchy serves as the world origin (0,0,0) of the scene's world space. On HoloLens, only the Device option is supported.

Starting with version 8.5, Vuforia Engine automatically detects if an app is running on HoloLens in Unity. It is no longer necessary to configure the Digital Eyewear settings in the Vuforia Configuration window.

The Voice Commands script defines various voice commands available in the sample. Voice commands use the HoloLens Keyword Recognizer.
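As a minimal sketch of this pattern (not the sample's actual Voice Commands script), a Unity component can register voice commands through UnityEngine.Windows.Speech.KeywordRecognizer; the keyword phrases and class name below are placeholders:

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Illustrative voice-command component. The keywords here are
// placeholders, not the commands defined in the sample.
public class SimpleVoiceCommands : MonoBehaviour
{
    private KeywordRecognizer recognizer;

    private void Start()
    {
        // KeywordRecognizer matches whole phrases from a fixed list.
        recognizer = new KeywordRecognizer(new[] { "show menu", "hide menu" });
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        Debug.Log("Recognized voice command: " + args.text);
    }

    private void OnDestroy()
    {
        // Release the underlying speech resources when the object is destroyed.
        if (recognizer != null)
            recognizer.Dispose();
    }
}
```

Attach a component like this to the ARCamera (as the sample does with its Voice Commands script) and branch on args.text to trigger app-specific behavior.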


ImageTarget GameObjects

The ImageTarget GameObject encapsulates the ImageTargetBehaviour and DefaultTrackableEventHandler scripts. These are the primary script components used to customize Image Targets in a HoloLens application.


For the Image Target Behaviour script, the Database dropdown selects the database that contains the Image Target assigned to this Image Target Behaviour.

The Image Target dropdown defines which target in the database to assign to this Image Target Behaviour.

The Default Trackable Event Handler script handles callbacks from the Image Target Behaviour that arise from changes in the state of the Image Target, such as when it is first detected and then tracked.

This script is used to enable and disable rendering and collision detection on digital content that is a child of the target. Extend this script's OnTrackingFound() and OnTrackingLost() methods to implement custom event handling for your app.

    protected virtual void OnTrackingFound()
    {
        // Passing "true" also returns components on inactive child objects.
        var rendererComponents = GetComponentsInChildren<Renderer>(true);
        var colliderComponents = GetComponentsInChildren<Collider>(true);
        var canvasComponents = GetComponentsInChildren<Canvas>(true);

        // Enable rendering:
        foreach (var component in rendererComponents)
            component.enabled = true;

        // Enable colliders:
        foreach (var component in colliderComponents)
            component.enabled = true;

        // Enable canvases:
        foreach (var component in canvasComponents)
            component.enabled = true;
    }


    protected virtual void OnTrackingLost()
    {
        var rendererComponents = GetComponentsInChildren<Renderer>(true);
        var colliderComponents = GetComponentsInChildren<Collider>(true);
        var canvasComponents = GetComponentsInChildren<Canvas>(true);

        // Disable rendering:
        foreach (var component in rendererComponents)
            component.enabled = false;

        // Disable colliders:
        foreach (var component in colliderComponents)
            component.enabled = false;

        // Disable canvases:
        foreach (var component in canvasComponents)
            component.enabled = false;
    }
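Because both methods are virtual, custom behavior can be layered on top of the defaults by subclassing the handler. The sketch below assumes the sample's DefaultTrackableEventHandler is in your project; the class name and AudioSource field are illustrative, not part of the sample:

```csharp
using UnityEngine;

// Illustrative subclass of the sample's DefaultTrackableEventHandler.
// Replace this component on the ImageTarget GameObject to use it.
public class MyTrackableEventHandler : DefaultTrackableEventHandler
{
    // Hypothetical: a sound to play when tracking begins.
    [SerializeField] private AudioSource foundSound = null;

    protected override void OnTrackingFound()
    {
        // Keep the default renderer/collider/canvas handling.
        base.OnTrackingFound();

        if (foundSound != null)
            foundSound.Play();

        Debug.Log(mTrackableBehaviour.TrackableName + " found");
    }

    protected override void OnTrackingLost()
    {
        base.OnTrackingLost();
        Debug.Log(mTrackableBehaviour.TrackableName + " lost");
    }
}
```

Calling the base implementation first preserves the show/hide behavior, so the override only needs to express what is unique to your app.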