Getting Started with Vuforia Engine for Windows 10 Development

Vuforia Engine 6 introduced support for developing Universal Windows Platform (UWP) apps on select Intel-based Windows 10 devices, including Microsoft Surface and HoloLens. The Vuforia Engine is available for both native and Unity UWP development, along with UWP sample projects that demonstrate the use of VuMarks and Image Targets in C++ (native) and C# (Unity).

For information about event logs generated by Vuforia Engine, refer to the Capturing and Viewing Event Logs with ETW article. 

Both the Engine SDK and the samples use the Vuforia Engine C++ API for native UWP development and the C# API for Unity development.

Native Development

  • Vuforia UWP SDK for Windows 10
  • Vuforia UWP Native Samples for Windows 10
    • Image Targets
    • VuMark

Unity Development

Supported Tools

  • Visual Studio 2015 (Update 2 recommended)
  • Direct3D 11

Additional information about the OS, tool, and device versions supported by Vuforia Engine is available here.

Supported Devices

The Vuforia Engine Supported Versions page lists the supported operating system, tool, and device versions for developing apps with the Vuforia Engine platform; additional information about supported devices can be found here.

Developing for Windows 10 in Unity

Getting Started with Vuforia Engine for Unity Development

Supported Versions

  • Visual Studio 2015 (Update 2 or later recommended)
  • Unity 2017.2 (or later)

Installation and Configuration

You’ll need to install the above versions of Visual Studio and Unity, and then configure Unity to use Visual Studio as its preferred IDE and compiler. You’ll also need to install Visual Studio Tools for Unity; see the articles below for a link to the installer.

See:
  • Getting Started with Visual Studio Tools for Unity - MSDN
  • Unity - Manual: Visual Studio C# Integration

When installing Unity, be sure to install Universal Windows Platform support and the .NET Scripting Backend. Universal Windows Platform components can also be installed afterwards, from the Build Settings dialog when Universal Windows Platform is selected.

Building and Executing the Sample

Start with one of the Vuforia HoloLens Samples to understand the structure and organization of a Vuforia Windows 10 Unity project.  

The samples are complete Unity projects that include pre-configured scenes implementing Image Targets or VuMarks. You can easily build these to evaluate each feature by selecting Universal Windows Platform as the target platform and then pressing the Build button in Build Settings.

When you build a Universal Windows Platform app, Unity generates a Visual Studio project and launches the Visual Studio IDE. You can then build and run the project from Visual Studio.

To build the sample, add all of the scenes in the /Scenes folder to Scenes In Build in the File > Build Settings dialog. Select Universal Windows Platform as the Build Platform.

Note: You may need to install additional Unity Universal Windows Platform components if you did not install them when you originally installed the Unity Editor – the dialog will direct you to the necessary installer.

Selecting Build for Universal Windows Platform generates a native project directory and solution for that platform. You will then need to build an executable from these resources in Visual Studio.

  • Set your platform build target to Universal Windows Platform in File > Build Settings.
  • Add your scene(s) to Scenes In Build.
  • Define a unique Product Name to serve as the name of the app when installed on a device.
  • Press the Build button to generate a Visual Studio project.
  • Build the executable from Visual Studio and install it on your device.

Be sure to set your build target to x86.

To package your app for all UWP devices, see: Packaging Universal Windows apps for Windows 10


Working with ImageTargets in Visual Studio 2015

This article will guide you through the steps for setting up the Vuforia UWP SDK and ImageTargets sample in Visual Studio 2015.

Installing the SDK and Sample

  1. Unpack the Vuforia UWP SDK at a suitable location in your development environment.
  2. Go to the samples directory in the SDK root folder.
  3. Unpack the ImageTargets sample into this folder.
  4. Double-click the *.sln file in the ImageTargets sample folder to load the sample project in Visual Studio 2015.

 

Note: This image shows the VuMark.sln from the VuMark sample; an ImageTarget.sln is provided with the ImageTargets sample.

Running the Sample

  1. Use the License Manager to create a license key. Refer to the Vuforia License Manager article for more information. 
  2. Add the license key to your project. Refer to the How to Add a License Key to Your Vuforia Engine App article for more information. 
  3. Build and run the sample. 

Note: If Visual Studio does not recognize the include path for the sample project, add it via the Properties dialog for the ImageTargets project: right-click the project name in Solution Explorer and add the Vuforia folder to the include path.

Be sure to set your build target to x86; note that 64-bit builds are not supported.

To package your app for all UWP devices, see: https://msdn.microsoft.com/en-us/library/hh454036.aspx

Modifying the Sample

Obtaining the Trackable State

The State object contains references to all current TrackableResults. You can obtain it from the Vuforia Renderer instance, as shown in the Render() method in ImageTargetRenderer.cpp.

In ImageTargetRenderer.cpp:

    // Renders one frame using the vertex and pixel shaders.
    void ImageTargetsRenderer::Render()
    {
        // Vuforia initialization and data loading is asynchronous.
        // Only starts rendering after Vuforia init/loading is complete.
        if (!m_rendererInitialized || !m_vuforiaStarted)
        {
            return;
        }
    
        // Get the state from Vuforia and mark the beginning of a rendering section
        Vuforia::DXRenderData dxRenderData(m_deviceResources->GetD3DDevice());
        Vuforia::Renderer &vuforiaRenderer = Vuforia::Renderer::getInstance();
        Vuforia::State state = vuforiaRenderer.begin(&dxRenderData);
    
        // TODO: set culling depending on camera direction
    
        RenderScene(vuforiaRenderer, state);
    
        Vuforia::Renderer::getInstance().end();
    }
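One detail worth emphasizing in the Render() method above: every call to Renderer::begin() must be balanced by a matching end(), even when the function returns early. A small RAII guard can enforce that pairing automatically. The sketch below is plain C++ and not part of the Vuforia API; ScopedRenderPass and runOnePass() are hypothetical names used only for illustration.

```cpp
#include <cassert>
#include <functional>
#include <utility>

// Hypothetical RAII guard: runs the "begin" action on construction and the
// "end" action on destruction, so the pair stays balanced even if the
// enclosing render function returns early or throws.
class ScopedRenderPass {
public:
    ScopedRenderPass(std::function<void()> begin, std::function<void()> end)
        : m_end(std::move(end))
    {
        begin();
    }
    ~ScopedRenderPass() { m_end(); }

    // Non-copyable: exactly one end() per begin().
    ScopedRenderPass(const ScopedRenderPass &) = delete;
    ScopedRenderPass &operator=(const ScopedRenderPass &) = delete;

private:
    std::function<void()> m_end;
};

// Exercise the guard: returns {begin count, end count} observed after the
// scope closes. In a real renderer the lambdas would wrap
// vuforiaRenderer.begin(&dxRenderData) and Vuforia::Renderer::getInstance().end().
inline std::pair<int, int> runOnePass()
{
    int begins = 0, ends = 0;
    {
        ScopedRenderPass pass([&] { ++begins; }, [&] { ++ends; });
        // ... draw the video background and augmentations here ...
    } // end() fires as the guard leaves scope
    return {begins, ends};
}
```

With such a guard in place, any return path added inside the rendering section would still close it correctly.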

Querying Trackable Results

Once you have the State object, you can query the state of each TrackableResult to access its pose, determine its type, and obtain a reference to its associated Trackable instance.

In ImageTargetRenderer.cpp:

    void ImageTargetsRenderer::RenderScene(Vuforia::Renderer &renderer, 
    Vuforia::State &state) 
    {
        Concurrency::critical_section::scoped_lock lock(m_renderingPrimitivesLock);
        auto context = m_deviceResources->GetD3DDeviceContext();
    
        XMMATRIX xmProjection = XMLoadFloat4x4(&m_projection);
    
        // Set state for video background rendering
        context->RSSetState(m_videoRasterState.Get());
        context->OMSetDepthStencilState(m_videoDepthStencilState.Get(), 1);
        context->OMSetBlendState(m_videoBlendState.Get(), NULL, 0xffffffff);
    
        // Draw the video background:
        renderer.drawVideoBackground();
    
        // Set state for augmentation rendering
        if (Vuforia::Renderer::getInstance().getVideoBackgroundConfig().mReflection ==
            Vuforia::VIDEO_BACKGROUND_REFLECTION_ON)
        {
            context->RSSetState(m_augmentationRasterState.Get()); // Back camera
        }
    
        context->OMSetDepthStencilState(m_augmentationDepthStencilState.Get(), 1);
        context->OMSetBlendState(m_augmentationBlendState.Get(), NULL, 0xffffffff);
    
        for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++)
        {
            // Get the trackable:
            const Vuforia::TrackableResult *result = state.getTrackableResult(tIdx);
            const Vuforia::Trackable &trackable = result->getTrackable();
            const char* trackableName = trackable.getName();
    
            // Set up the modelview matrix
            auto poseGL = Vuforia::Tool::convertPose2GLMatrix(result->getPose());
            XMFLOAT4X4 poseDX;
            memcpy(poseDX.m, poseGL.data, sizeof(float) * 16);
            XMStoreFloat4x4(&poseDX, XMMatrixTranspose(XMLoadFloat4x4(&poseDX)));
            XMMATRIX xmPose = XMLoadFloat4x4(&poseDX);
    
            std::shared_ptr<SampleCommon::Texture> texture = GetAugmentationTexture(trackableName);
            if (!texture->IsInitialized()) {
                texture->Init();
            }
    
            if (m_extTracking) {
                RenderTower(xmPose, xmProjection, texture);
            }
            else {
                RenderTeapot(xmPose, xmProjection, texture);
            }
        }
    }
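One step in the loop above is easy to miss: Tool::convertPose2GLMatrix() returns the pose in column-major (OpenGL-style) memory layout, while DirectXMath works with row-major data, which is why the sample transposes the matrix with XMMatrixTranspose() after the memcpy. The standalone sketch below uses plain arrays with no Vuforia or DirectX types; transpose4x4 is a name invented here for illustration.

```cpp
#include <cassert>

// Transpose a 4x4 matrix stored as 16 contiguous floats. This converts
// between column-major (OpenGL convention) and row-major (DirectX
// convention) memory layouts.
inline void transpose4x4(const float in[16], float out[16])
{
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            out[row * 4 + col] = in[col * 4 + row];
}
```

In a column-major pose the translation occupies elements 12–14; after transposing, it lands in elements 3, 7, and 11, where a row-major consumer expects it.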

Rendering Content

The pose obtained from this result can then be used to render content onto the ImageTarget in the camera view.

In ImageTargetRenderer.cpp:

    void ImageTargetsRenderer::RenderTeapot(
        const XMMATRIX &poseMatrix,
        const XMMATRIX &projectionMatrix,
        const std::shared_ptr<SampleCommon::Texture> texture
        )
    {
        auto context = m_deviceResources->GetD3DDeviceContext();
     
        auto scale = XMMatrixScaling(TEAPOT_SCALE, TEAPOT_SCALE, TEAPOT_SCALE);
        auto modelMatrix = XMMatrixIdentity() * scale;
    
        // Set the model matrix (the 'model' part of the 'model-view' matrix)
        XMStoreFloat4x4(&m_constantBufferData.model, modelMatrix);
    
        // Set the pose matrix (the 'view' part of the 'model-view' matrix)
        XMStoreFloat4x4(&m_constantBufferData.view, poseMatrix);
    
        // Set the projection matrix
        XMStoreFloat4x4(&m_constantBufferData.projection, projectionMatrix);
    
        // Prepare the constant buffer to send it to the graphics device.
        context->UpdateSubresource1(
            m_constantBuffer.Get(),
            0,
            NULL,
            &m_constantBufferData,
            0,
            0,
            0
            );
    
        // Each vertex is one instance of the TexturedVertex struct.
        UINT stride = sizeof(SampleCommon::TexturedVertex);
        UINT offset = 0;
        context->IASetVertexBuffers(
            0,
            1,
            m_teapotMesh->GetVertexBuffer(),
            &stride,
            &offset
            );
    
        context->IASetIndexBuffer(
            m_teapotMesh->GetIndexBuffer(),
            DXGI_FORMAT_R16_UINT, // Each index is one 16-bit unsigned integer (short).
            0
            );
    
        context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    
        context->IASetInputLayout(m_inputLayout.Get());
    
        // Attach our vertex shader.
        context->VSSetShader(m_vertexShader.Get(), nullptr, 0);
    
        // Send the constant buffer to the graphics device.
        context->VSSetConstantBuffers1(
            0,
            1,
            m_constantBuffer.GetAddressOf(),
            nullptr,
            nullptr
            );
    
        // Attach our pixel shader.
        context->PSSetShader(m_pixelShader.Get(), nullptr, 0);
    
        context->PSSetSamplers(0, 1, texture->GetSamplerState());
        context->PSSetShaderResources(0, 1, texture->GetTextureView());
    
        // Draw the objects.
        context->DrawIndexed(m_teapotMesh->GetIndexCount(), 0, 0);
    }
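The UpdateSubresource1() call above pushes the model, view, and projection matrices to the GPU in a single constant buffer. Direct3D 11 requires constant buffer sizes to be multiples of 16 bytes, and three 4x4 float matrices satisfy this naturally. The sketch below mirrors that layout with plain structs; Float4x4 stands in for DirectX::XMFLOAT4X4, and the struct definition is reconstructed from the calls in the sample rather than copied from the SDK.

```cpp
#include <cassert>

// Plain 4x4 float matrix: same size and layout as DirectX::XMFLOAT4X4
// (16 contiguous floats).
struct Float4x4 {
    float m[4][4];
};

// Per-object constant buffer holding the three matrices that
// RenderTeapot() stores with XMStoreFloat4x4: model, view (the tracked
// pose), and projection.
struct ModelViewProjectionConstantBuffer {
    Float4x4 model;
    Float4x4 view;
    Float4x4 projection;
};

// Direct3D 11 rejects constant buffers whose size is not a multiple of
// 16 bytes; verify the layout at compile time.
static_assert(sizeof(ModelViewProjectionConstantBuffer) % 16 == 0,
              "constant buffer size must be a multiple of 16 bytes");
```

If you add fields to such a struct (a light color, a time value), pad it back up to a 16-byte multiple before creating the buffer.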