Getting Started with Vuforia for Windows 10 Development

Vuforia 6 introduces support for developing Windows UWP apps on select Intel-based Windows 10 devices, including the Microsoft Surface and HoloLens. A new UWP version of the Vuforia SDK is available for both native and Unity development, along with sample projects demonstrating the use of VuMarks and Image Targets in C++ (native) and C# (Unity).

Both the SDK and the samples use the Vuforia C++ API for UWP native development and the C# API for Unity development.

Release Artifacts

Native Development

  • Vuforia UWP SDK for Windows 10
  • Vuforia UWP Native Samples for Windows 10
    • Image Targets
    • VuMark

Unity Development

  • Vuforia Unity Extension for Windows 10
  • Vuforia Unity Samples for Windows 10
    • Image Targets
    • VuMark

Samples

You can find the following samples in the Core Features archive for UWP:

  • Image Targets
  • VuMark

Supported tools

  • Visual Studio 2015 (Update 2 recommended)
  • Direct3D11

You can find additional information about the OS, tool, and device versions supported by Vuforia here.

Supported devices

Additional information about supported devices can be found here.

Note: Vuforia supports x86 UWP builds only. ARM UWP builds are not supported.


Developing for Windows 10 in Unity

Supported Versions:

  • Visual Studio 2015 (Update 2 recommended)
  • Unity 5.4.0f3

Installation and Configuration

You’ll need to install the versions of Visual Studio and Unity listed above, and then configure Unity to use Visual Studio as its preferred IDE and compiler. You’ll also need to install Visual Studio Tools for Unity; see the articles below for a link to the installer.
  See:
Getting Started with Visual Studio Tools for Unity - MSDN
Unity - Manual: Visual Studio C# Integration

When installing Unity, be sure to install the Windows Store platform and the .NET scripting backend. Windows Store components can also be installed later, from the Build Settings dialog when the Windows Store platform is selected.

Getting Started

Start with one of the Vuforia Unity samples for UWP to understand the structure and organization of a Vuforia Windows 10 Unity project. The Vuforia UWP extension and samples for Unity are nearly identical to those offered for Android and iOS; the only difference is that Windows Store builds from Unity use a platform-specific set of compiled Vuforia plugin libraries.

The samples are complete Unity projects that include the Vuforia Unity Extension for Windows 10 as well as a pre-configured scene that implements Image Targets or VuMarks. You can easily build them to evaluate each feature by selecting Windows Store as the target platform and pressing the Build button in Build Settings. See Building and executing the sample below.

When you build a Windows Store app, Unity generates a Visual Studio project and launches the Visual Studio IDE. You can then build and run the project from Visual Studio.

Installing the Unity Extension
See: Getting Started with Vuforia for Unity Development

Installing the Unity Samples

Build Settings

To build the sample, add all of the scenes in the /Scenes folder to Scenes In Build in the File > Build Settings dialog. Select Windows Store as the Build Platform.

Note: You may need to install additional Unity Windows Store components if you did not select them when you originally installed the Unity Editor; the dialog will direct you to the necessary installer.

Selecting Build for Windows Store generates a native project directory and solution for that platform. You will then need to build an executable from these resources in Visual Studio.

Building and executing the sample
  • Set your platform build target to Windows Store in File > Build Settings.
  • Add your scene(s) to Scenes in Build.
  • Define a unique Product Name to serve as the name of the app when installed on a device.
  • Press the Build button to generate a Visual Studio project.
  • Build the executable from Visual Studio and install it on your device.

Visual Studio Build Configuration

Be sure to set your build target to x86.

To package your app for all UWP devices, See: Packaging Universal Windows apps for Windows 10

Working with ImageTargets in Visual Studio 2015

This article will guide you through the steps for setting up the Vuforia UWP SDK and ImageTargets sample in Visual Studio 2015.

Installing the SDK and Sample

  • Unpack the Vuforia UWP SDK to a suitable location in your development environment.
  • Go to the samples directory in the SDK root folder.
  • Unpack the ImageTargets sample into this folder.
  • Double-click the *.sln file in the ImageTargets sample folder to load the sample project in Visual Studio 2015.


Note: The VuMark sample provides a VuMark.sln; an ImageTarget.sln is provided with the ImageTargets sample.
Running the sample

  • Obtain a license key from developer.vuforia.com and add it to your project. See: Vuforia License Manager
  • Add this key to the InitAR() method in AppSession.cpp.
  • You can now build and run the sample.


Note: If Visual Studio does not recognize the include path for the sample project, add it via the Properties dialog for the ImageTarget project (right-click the project name in Solution Explorer) and add the Vuforia folder to the include path.
Be sure to set your build target to x86; note that 64-bit builds are not supported.

To package your app for all UWP devices, See: https://msdn.microsoft.com/en-us/library/hh454036.aspx

Modifying the Sample

Obtaining the Trackable State

The State object contains references to all current TrackableResults. You can obtain it from the Vuforia Renderer instance, as shown in the Render() method in ImageTargetRenderer.cpp.

In ImageTargetRenderer.cpp:

// Renders one frame using the vertex and pixel shaders.
void ImageTargetsRenderer::Render()
{
    // Vuforia initialization and data loading is asynchronous.
    // Only starts rendering after Vuforia init/loading is complete.
    if (!m_rendererInitialized || !m_vuforiaStarted)
    {
        return;
    }

    // Get the state from Vuforia and mark the beginning of a rendering section
    Vuforia::DXRenderData dxRenderData(m_deviceResources->GetD3DDevice());
    Vuforia::Renderer &vuforiaRenderer = Vuforia::Renderer::getInstance();
    Vuforia::State state = vuforiaRenderer.begin(&dxRenderData);

    // TODO: set culling depending on camera direction

    RenderScene(vuforiaRenderer, state);

    Vuforia::Renderer::getInstance().end();
}
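Every call to begin() must be balanced by a matching end(), even when the render path bails out early. One way to make that bracket robust, sketched here with a stand-in renderer type rather than the real Vuforia::Renderer, is a small RAII scope guard:

```cpp
#include <cassert>

// Stand-in for Vuforia::Renderer; only the begin()/end() contract matters here.
struct FakeRenderer {
    int depth = 0;            // > 0 while inside a begin()/end() bracket
    void begin() { ++depth; }
    void end() { --depth; }
};

// RAII guard: begin() on construction, end() on destruction, so the bracket
// closes on every exit path, including early returns and exceptions.
class RenderScope {
public:
    explicit RenderScope(FakeRenderer &r) : m_renderer(r) { m_renderer.begin(); }
    ~RenderScope() { m_renderer.end(); }
    RenderScope(const RenderScope &) = delete;
    RenderScope &operator=(const RenderScope &) = delete;

private:
    FakeRenderer &m_renderer;
};
```

With a guard like this, a render method could declare one RenderScope at the top of the rendering section and draw; the end() call is no longer something an early return can skip.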

Querying Trackable Results

Once you have the State object, you can query the state of each TrackableResult to access its pose, determine its type, and obtain a reference to its associated Trackable instance.

In ImageTargetRenderer.cpp:

void ImageTargetsRenderer::RenderScene(Vuforia::Renderer &renderer, Vuforia::State &state)
{
    Concurrency::critical_section::scoped_lock lock(m_renderingPrimitivesLock);

    auto context = m_deviceResources->GetD3DDeviceContext();

    XMMATRIX xmProjection = XMLoadFloat4x4(&m_projection);

    // Set state for video background rendering
    context->RSSetState(m_videoRasterState.Get());
    context->OMSetDepthStencilState(m_videoDepthStencilState.Get(), 1);
    context->OMSetBlendState(m_videoBlendState.Get(), NULL, 0xffffffff);

    // Draw the video background:
    renderer.drawVideoBackground();

    // Set state for augmentation rendering
    if (Vuforia::Renderer::getInstance().getVideoBackgroundConfig().mReflection == Vuforia::VIDEO_BACKGROUND_REFLECTION_ON)
        context->RSSetState(m_augmentationRasterState.Get()); // Front camera
    else
        context->RSSetState(m_augmentationRasterState.Get()); // Back camera

    context->OMSetDepthStencilState(m_augmentationDepthStencilState.Get(), 1);
    context->OMSetBlendState(m_augmentationBlendState.Get(), NULL, 0xffffffff);

    for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++)
    {
        // Get the trackable:
        const Vuforia::TrackableResult *result = state.getTrackableResult(tIdx);
        const Vuforia::Trackable &trackable = result->getTrackable();
        const char* trackableName = trackable.getName();

        // Set up the modelview matrix
        auto poseGL = Vuforia::Tool::convertPose2GLMatrix(result->getPose());
        XMFLOAT4X4 poseDX;
        memcpy(poseDX.m, poseGL.data, sizeof(float) * 16);
        XMStoreFloat4x4(&poseDX, XMMatrixTranspose(XMLoadFloat4x4(&poseDX)));
        XMMATRIX xmPose = XMLoadFloat4x4(&poseDX);

        std::shared_ptr<SampleCommon::Texture> texture = GetAugmentationTexture(trackableName);
        if (!texture->IsInitialized()) {
            texture->Init();
        }

        if (m_extTracking) {
            RenderTower(xmPose, xmProjection, texture);
        }
        else {
            RenderTeapot(xmPose, xmProjection, texture);
        }
    }
}
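The one subtle step above is the pose conversion: Vuforia::Tool::convertPose2GLMatrix() yields an OpenGL-style column-major matrix, while DirectXMath uses row-major storage, which is why the sample follows the memcpy with XMMatrixTranspose. The same conversion can be illustrated without any Vuforia or DirectXMath dependencies:

```cpp
#include <cassert>

// Transpose a 4x4 matrix stored as 16 contiguous floats: element (r, c) of
// the input becomes element (c, r) of the output. Reinterpreting column-major
// data as row-major implicitly transposes it, so one explicit transpose
// recovers the original matrix in row-major storage -- the net effect of the
// sample's memcpy + XMMatrixTranspose pair.
void transpose4x4(const float in[16], float out[16])
{
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            out[r * 4 + c] = in[c * 4 + r];
}
```

For an identity pose translated by (5, 6, 7), column-major storage keeps the translation at indices 12..14; after the transpose, row-major storage carries it at indices 3, 7, and 11 (the last element of each of the first three rows).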

Rendering Content

The pose obtained from this result can then be used to render content onto the ImageTarget in the camera view.

In ImageTargetRenderer.cpp:

void ImageTargetsRenderer::RenderTeapot(
    const XMMATRIX &poseMatrix,
    const XMMATRIX &projectionMatrix,
    const std::shared_ptr<SampleCommon::Texture> texture
    )
{
    auto context = m_deviceResources->GetD3DDeviceContext();
 
    auto scale = XMMatrixScaling(TEAPOT_SCALE, TEAPOT_SCALE, TEAPOT_SCALE);
    auto modelMatrix = XMMatrixIdentity() * scale;

    // Set the model matrix (the 'model' part of the 'model-view' matrix)
    XMStoreFloat4x4(&m_constantBufferData.model, modelMatrix);

    // Set the pose matrix (the 'view' part of the 'model-view' matrix)
    XMStoreFloat4x4(&m_constantBufferData.view, poseMatrix);

    // Set the projection matrix
    XMStoreFloat4x4(&m_constantBufferData.projection, projectionMatrix);

    // Prepare the constant buffer to send it to the graphics device.
    context->UpdateSubresource1(
        m_constantBuffer.Get(),
        0,
        NULL,
        &m_constantBufferData,
        0,
        0,
        0
        );

    // Each vertex is one instance of the TexturedVertex struct.
    UINT stride = sizeof(SampleCommon::TexturedVertex);
    UINT offset = 0;
    context->IASetVertexBuffers(
        0,
        1,
        m_teapotMesh->GetVertexBuffer(),
        &stride,
        &offset
        );

    context->IASetIndexBuffer(
        m_teapotMesh->GetIndexBuffer(),
        DXGI_FORMAT_R16_UINT, // Each index is one 16-bit unsigned integer (short).
        0
        );

    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    context->IASetInputLayout(m_inputLayout.Get());

    // Attach our vertex shader.
    context->VSSetShader(m_vertexShader.Get(), nullptr, 0);

    // Send the constant buffer to the graphics device.
    context->VSSetConstantBuffers1(
        0,
        1,
        m_constantBuffer.GetAddressOf(),
        nullptr,
        nullptr
        );

    // Attach our pixel shader.
    context->PSSetShader(m_pixelShader.Get(), nullptr, 0);

    context->PSSetSamplers(0, 1, texture->GetSamplerState());
    context->PSSetShaderResources(0, 1, texture->GetTextureView());

    // Draw the objects.
    context->DrawIndexed(m_teapotMesh->GetIndexCount(), 0, 0);
}
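The constant buffer above hands the model, view, and projection matrices to the vertex shader separately, and the shader composes them per vertex. The arithmetic of that composition can be checked with bare arrays; this sketch uses column-vector convention (translation in the last column) for readability, and made-up scale and translation values rather than the sample's TEAPOT_SCALE and a real pose:

```cpp
#include <cassert>

// Multiply two 4x4 row-major matrices: out = a * b.
void mul4x4(const float a[16], const float b[16], float out[16])
{
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c) {
            float s = 0.0f;
            for (int k = 0; k < 4; ++k)
                s += a[r * 4 + k] * b[k * 4 + c];
            out[r * 4 + c] = s;
        }
}

// Transform a homogeneous point by a 4x4 row-major matrix
// (column-vector convention): out = m * v.
void apply4x4(const float m[16], const float v[4], float out[4])
{
    for (int r = 0; r < 4; ++r) {
        out[r] = 0.0f;
        for (int c = 0; c < 4; ++c)
            out[r] += m[r * 4 + c] * v[c];
    }
}
```

Composing a uniform scale-by-2 model matrix with a pose that translates by (10, 0, 0) and applying the result to the vertex (1, 1, 1) scales it to (2, 2, 2) and then moves it to (12, 2, 2), mirroring how the shader applies the model matrix first and the view (pose) second.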