Getting Started with Vuforia Engine for Windows 10 Development

This article explains how to start developing for Windows 10 with Vuforia Engine in both Unity and Visual Studio. Vuforia Engine supports the development of Universal Windows Platform (UWP) apps on select Intel- and ARM-based Windows 10 devices, including Microsoft Surface and HoloLens. The Engine SDK and Vuforia Samples use C# APIs for Unity development and C++ APIs for native development.

The Vuforia Engine Supported Versions page lists the operating systems and tools that are supported for developing apps with the Engine platform. Additional information about supported devices can be found here. You can also use the Direct3D 12 programmable pipeline to create real-time 3D graphics for games as well as scientific and desktop applications.

For information about event logs generated by Vuforia Engine, refer to the Capturing and Viewing Event Logs with ETW article.

Developing for Windows 10 in Unity

Before proceeding, familiarize yourself with the contents of the Getting Started with Vuforia Engine for Unity Development article. It contains information on installing Unity, activating Vuforia Engine within your project, and accessing Vuforia Engine features. 

When you build a UWP app in Unity, a Visual Studio project is generated and launched. You can then build and run the project from Visual Studio. For information on packaging your app for all UWP devices, refer to the Getting Started with Windows UWP Apps article.

The HoloLens and Core samples, available in the Developer Portal or the Unity Asset Store, lay out the structure and organization of a Vuforia Windows 10 Unity project. The samples are complete Unity projects, including pre-configured scenes that implement Image Targets, Model Targets, and VuMarks. 

To gain a better understanding of how to develop for Windows 10 in Unity, start by performing the steps in the Working with the HoloLens Sample in Unity article.

Developing for Windows 10 in Visual Studio

This section explains how to set up the Vuforia UWP SDK and the Vuforia samples in the latest supported Visual Studio version.

  1. Download the Vuforia UWP SDK.
  2. Unpack the Vuforia UWP SDK to a folder in your development environment.
  3. Download the Core Features Samples for UWP.
  4. Unpack the Vuforia Samples Core UWP archive into the samples folder located in the Vuforia UWP SDK folder.
  5. In the samples folder, open the VuforiaSamples UWP folder and double-click VuforiaSamples.sln.
    The sample project loads in Visual Studio.

Building and Executing the Sample

  1. Use the License Manager to create a license key. Refer to the Vuforia License Manager article for more information.
  2. Add the license key to your project. Refer to the How to Add a License Key to Your Vuforia Engine App article for instructions; a sketch of where the key enters the native initialization flow follows this list.
  3. In the Solution Platforms dropdown, select x86 or x64.
    NOTE: 64-bit builds on HoloLens 1 are not supported.
  4. Build and run the sample.
    NOTE: If Visual Studio does not recognize the include path for the sample project, perform the following steps:
    1. In the Solution Explorer window, right-click the VuforiaSamples project and select Properties.
    2. Expand the General menu.
    3. In the Include Directories field, add the path to the include directories.
  5. Package your app for all UWP devices by following the instructions in the Packaging Universal Windows apps for Windows 10 article.
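
For context, the license key ultimately feeds Vuforia Engine's asynchronous native initialization. The following is a minimal sketch of that flow, assuming the UWP variant of Vuforia::setInitParameters() takes just the key string (check Vuforia.h in your SDK version for the exact signature); the sample wires this up for you once the key is added.

#include <Vuforia/Vuforia.h>

bool InitVuforia()
{
    // Register the license key created in the License Manager.
    Vuforia::setInitParameters("-- YOUR LICENSE KEY --");

    // Vuforia::init() is incremental: it returns the completion percentage
    // (0..100), or a negative error code, so it is called until it finishes.
    int progress = 0;
    do
    {
        progress = Vuforia::init();
    } while (progress >= 0 && progress < 100);

    return progress == 100;
}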

Modifying the Sample

Obtaining the Trackable State

The State object contains references to all current TrackableResults. You can obtain it from the Vuforia::StateUpdater and pass it to the Vuforia::Renderer instance, as shown in the Render() method.

The following code is contained in ImageTargetsRenderer.cpp.


void ImageTargetsRenderer::Render()
{
    // Vuforia initialization and data loading is asynchronous.
    // Only start rendering after Vuforia init/loading is complete.
    if (!m_rendererInitialized || !m_vuforiaStarted)
    {
        return;
    }

    // Get the state from Vuforia and mark the beginning of a rendering section
    Vuforia::DXRenderData dxRenderData(m_deviceResources->GetD3DDevice());
    const Vuforia::State state = Vuforia::TrackerManager::getInstance().getStateUpdater().updateState();

    Vuforia::Renderer& vuforiaRenderer = Vuforia::Renderer::getInstance();
    vuforiaRenderer.begin(state, &dxRenderData);

    RenderScene(vuforiaRenderer, state);

    vuforiaRenderer.end();
}
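
Note that Renderer::begin() and Renderer::end() bracket the frame's Vuforia rendering section: the camera video background and any augmentations drawn in RenderScene() must happen between these two calls, and the DXRenderData object passes the D3D device to Vuforia for that frame.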

Querying Trackable Results

Once you have the State object, you can query the state of each TrackableResult to access its pose, determine its type, and obtain a reference to its associated Trackable instance.

The following code is contained in ImageTargetsRenderer.cpp.


void ImageTargetsRenderer::RenderScene(Vuforia::Renderer& renderer, const Vuforia::State& state)
{
    Concurrency::critical_section::scoped_lock lock(m_renderingPrimitivesLock);

    auto context = m_deviceResources->GetD3DDeviceContext();

    // Calculate the DX Projection matrix using the current Vuforia state
    auto projectionMatrix =
        Vuforia::Tool::convertPerspectiveProjection2GLMatrix(
            m_renderingPrimitives->getProjectionMatrix(
                Vuforia::VIEW_SINGULAR, state.getCameraCalibration()),
            m_near,
            m_far);

    XMFLOAT4X4 dxProjection;
    memcpy(dxProjection.m, projectionMatrix.data, sizeof(float) * 16);
    XMStoreFloat4x4(&dxProjection, XMMatrixTranspose(XMLoadFloat4x4(&dxProjection)));

    XMMATRIX xmProjectionMatrix = XMLoadFloat4x4(&dxProjection);

    // Render the camera video background
    m_videoBackground->Render(renderer, m_renderingPrimitives.get(), Vuforia::VIEW_SINGULAR, state);

    // Setup rendering pipeline for augmentation rendering
    context->RSSetState(m_augmentationRasterStateCullBack.Get()); // Typically when using the rear-facing camera

    context->OMSetDepthStencilState(m_augmentationDepthStencilState.Get(), 1);
    context->OMSetBlendState(m_augmentationBlendState.Get(), NULL, 0xffffffff);

    DirectX::XMFLOAT4X4 devicePoseMatrixDX;
    XMStoreFloat4x4(&devicePoseMatrixDX, XMMatrixIdentity());
    Vuforia::Matrix44F devicePoseMatrix = SampleAppMathUtils::Matrix44FIdentity();

    // Get the device pose
    if (state.getDeviceTrackableResult() != nullptr &&
        state.getDeviceTrackableResult()->getStatus() != Vuforia::TrackableResult::NO_POSE)
    {
        Vuforia::Matrix44F modelMatrix =
            Vuforia::Tool::convertPose2GLMatrix(state.getDeviceTrackableResult()->getPose());

        devicePoseMatrix =
            SampleAppMathUtils::Matrix44FTranspose(SampleAppMathUtils::Matrix44FInverse(modelMatrix));

        SampleAppMathUtils::convertPoseFromGLtoDX(devicePoseMatrix, devicePoseMatrixDX);
    }

    ProcessTrackableResults(state, devicePoseMatrixDX, xmProjectionMatrix);
}
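
RenderScene() ends by delegating to ProcessTrackableResults(), which the sample implements in the same file. A condensed sketch of that method is shown below as a continuation of ImageTargetsRenderer.cpp. It assumes the iterable state.getTrackableResults() container from recent Vuforia State APIs, and m_teapotTexture is a placeholder member name; refer to the full sample for the exact matrix composition.

void ImageTargetsRenderer::ProcessTrackableResults(
    const Vuforia::State& state,
    const DirectX::XMFLOAT4X4& devicePoseMatrixDX,
    const DirectX::XMMATRIX& projectionMatrix)
{
    for (const auto* result : state.getTrackableResults())
    {
        // Only render on Image Targets that currently have a valid pose.
        if (!result->isOfType(Vuforia::ImageTargetResult::getClassType()) ||
            result->getStatus() == Vuforia::TrackableResult::NO_POSE)
        {
            continue;
        }

        // Convert the tracked pose from Vuforia's GL-style convention to a
        // DirectX matrix, mirroring the device-pose handling above.
        Vuforia::Matrix44F poseGL = SampleAppMathUtils::Matrix44FTranspose(
            Vuforia::Tool::convertPose2GLMatrix(result->getPose()));

        DirectX::XMFLOAT4X4 poseDX;
        SampleAppMathUtils::convertPoseFromGLtoDX(poseGL, poseDX);

        // Combine the target pose with the device pose and draw the teapot.
        // m_teapotTexture is a hypothetical member standing in for however
        // your project selects a texture for this target.
        DirectX::XMMATRIX poseMatrix =
            XMLoadFloat4x4(&poseDX) * XMLoadFloat4x4(&devicePoseMatrixDX);

        RenderTeapot(poseMatrix, projectionMatrix, m_teapotTexture);
    }
}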

Rendering Content

The pose obtained from this result can then be used to render content onto the ImageTarget in the camera view.

The following code is contained in ImageTargetsRenderer.cpp.

void ImageTargetsRenderer::RenderTeapot(
    const XMMATRIX& poseMatrix,
    const XMMATRIX& projectionMatrix,
    const std::shared_ptr<SampleCommon::Texture> texture)
{
    auto context = m_deviceResources->GetD3DDeviceContext();

    auto scale = XMMatrixScaling(TEAPOT_SCALE, TEAPOT_SCALE, TEAPOT_SCALE);
    auto modelMatrix = XMMatrixIdentity() * scale;

    // Set the model matrix (the 'model' part of the 'model-view' matrix)
    XMStoreFloat4x4(&m_augmentationConstantBufferData.model, modelMatrix);

    // Set the pose matrix (the 'view' part of the 'model-view' matrix)
    XMStoreFloat4x4(&m_augmentationConstantBufferData.view, poseMatrix);

    // Set the projection matrix
    XMStoreFloat4x4(&m_augmentationConstantBufferData.projection, projectionMatrix);

    // Prepare the constant buffer to send it to the graphics device.
    context->UpdateSubresource1(
        m_augmentationConstantBuffer.Get(),
        0,
        NULL,
        &m_augmentationConstantBufferData,
        0,
        0,
        0);

    // Each vertex is one instance of the TexturedVertex struct.
    UINT stride = sizeof(SampleCommon::TexturedVertex);
    UINT offset = 0;
    context->IASetVertexBuffers(
        0,
        1,
        m_teapotMesh->GetVertexBuffer().GetAddressOf(),
        &stride,
        &offset);

    context->IASetIndexBuffer(
        m_teapotMesh->GetIndexBuffer().Get(),
        DXGI_FORMAT_R16_UINT, // Each index is one 16-bit unsigned integer (short).
        0);

    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    context->IASetInputLayout(m_augmentationInputLayout.Get());

    // Attach our vertex shader.
    context->VSSetShader(m_augmentationVertexShader.Get(), nullptr, 0);

    // Send the constant buffer to the graphics device.
    context->VSSetConstantBuffers1(
        0,
        1,
        m_augmentationConstantBuffer.GetAddressOf(),
        nullptr,
        nullptr);

    // Attach our pixel shader.
    context->PSSetShader(m_augmentationPixelShader.Get(), nullptr, 0);

    context->PSSetSamplers(0, 1, texture->GetD3DSamplerState().GetAddressOf());

    context->PSSetShaderResources(0, 1, texture->GetD3DTextureView().GetAddressOf());

    // Draw the objects.
    context->DrawIndexed(m_teapotMesh->GetIndexCount(), 0, 0);
}
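
Note the split in RenderTeapot(): the pose matrix obtained from the TrackableResult supplies the 'view' half of the model-view transform, while the model matrix only applies the teapot's scale. Any additional placement of content relative to the target (offsets, rotations) would go into the model matrix.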

Related Pages

Working with the HoloLens Sample in Unity

Getting Started with Vuforia Engine for Unity Development

UWP SDK

Vuforia UWP SDK

Vuforia UWP Sample