How to Use Object Recognition in an Android App

In this article, you will learn how to integrate Object Recognition into your native Android app. We will use the Core Android sample app as a foundation for the project structure.

Before you proceed, please make sure to have read Getting Started with the Android Native SDK.

Use the Vuforia Core Android Sample project structure as a template for your own app. The setup consists of three steps:

  1. Get a Vuforia Engine license key and include it in your app.
  2. Replace the Image Target database with a device database containing one or more Object Target datasets.
  3. Configure the Renderer to display a box or model over the Object Target. 

Get a license key and add it to your app

Please follow the Vuforia License Manager guide to create a license, and then follow the guide on How to Add a License Key to Your Vuforia App to properly set up your native app.

Load and activate Device Databases containing Object Targets

The Object Target is loaded and tracked by the ObjectTracker. Once the tracker is initialized, you can load the device database for your Object Targets, which is stored as a pair of .dat and .xml files. Copy both dataset files into the assets folder of the sample project structure.
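The sample's loadAndActivateDataSet() helper, which is called in loadTrackerData() below, handles this loading step. If you need to adapt it, a minimal sketch using the classic Vuforia Engine DataSet API might look like the following (the ObjectTracker, DataSet, and storage-type names assume the pre-10.x native API; this will not compile without the Vuforia SDK headers):

```cpp
#include <Vuforia/TrackerManager.h>
#include <Vuforia/ObjectTracker.h>
#include <Vuforia/DataSet.h>

Vuforia::DataSet* AppController::loadAndActivateDataSet(std::string path)
{
    // Get the ObjectTracker, which also tracks Object Targets
    auto* objectTracker = static_cast<Vuforia::ObjectTracker*>(
        Vuforia::TrackerManager::getInstance().getTracker(
            Vuforia::ObjectTracker::getClassType()));
    if (objectTracker == nullptr)
        return nullptr;

    Vuforia::DataSet* dataSet = objectTracker->createDataSet();
    if (dataSet == nullptr)
        return nullptr;

    // Load the .xml/.dat pair from the app's assets and activate it
    if (!dataSet->load(path.c_str(), Vuforia::STORAGE_APPRESOURCE)
        || !objectTracker->activateDataSet(dataSet))
    {
        objectTracker->destroyDataSet(dataSet);
        return nullptr;
    }
    return dataSet;
}
```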

In AppController.cpp, include the following headers:

#include <Vuforia/ObjectTarget.h>
#include <Vuforia/ObjectTargetResult.h>

Replace IMAGE_TARGET_ID with OBJECT_TARGET_ID in the AppController.h file:

int mTarget = OBJECT_TARGET_ID;

In AppController.cpp, you can now load and activate the Object Target dataset instead of the existing Image Target dataset in loadTrackerData(), and adjust the function calls in getImageTargetResult() to locate Object Targets:

bool AppController::loadTrackerData()
{
    if (mTarget == OBJECT_TARGET_ID)
    {
        mCurrentDataSet = loadAndActivateDataSet("VuforiaMars_Object_OT.xml");
        if (mCurrentDataSet == nullptr)
        {
            mShowErrorCallback("Error loading dataset for Object Target");
            return false;
        }
    }

    return true;
}


bool AppController::getImageTargetResult(Vuforia::Matrix44F& projectionMatrix,
                                         Vuforia::Matrix44F& modelViewMatrix,
                                         Vuforia::Matrix44F& scaledModelViewMatrix)
{
    const auto& trackableResultList = mVuforiaState.getTrackableResults();
    for (const auto* result : trackableResultList)
    {
        if (result->isOfType(Vuforia::ObjectTargetResult::getClassType()) && mTarget == OBJECT_TARGET_ID)
        {
            const auto* otResult = static_cast<const Vuforia::ObjectTargetResult*>(result);
            const Vuforia::ObjectTarget& target = otResult->getTrackable();

            // Build the view matrix from the device pose
            Vuforia::Matrix44F viewMatrix = Vuforia::Tool::convertPose2GLMatrix(
                mVuforiaState.getDeviceTrackableResult()->getPose());
            viewMatrix = MathUtils::Matrix44FTranspose(MathUtils::Matrix44FInverse(viewMatrix));

            // Get the projection matrix
            projectionMatrix = Vuforia::Tool::convertPerspectiveProjection2GLMatrix(
                mCurrentRenderingPrimitives->getProjectionMatrix(Vuforia::VIEW_SINGULAR,
                                                                 mVuforiaState.getCameraCalibration()),
                NEAR_PLANE, FAR_PLANE);

            // Get the object pose and populate modelViewMatrix
            modelViewMatrix = Vuforia::Tool::convertPose2GLMatrix(result->getPose());
            MathUtils::multiplyMatrix(viewMatrix, modelViewMatrix, modelViewMatrix);

            // Calculate a scaled modelViewMatrix for rendering a unit bounding box
            auto targetSize = target.getSize();
            scaledModelViewMatrix = MathUtils::Matrix44FScale(targetSize, modelViewMatrix);

            return true;
        }
    }

    return false;
}

Configure the Renderer 

To change the augmentation content, edit GLESRenderer.cpp in the function renderImageTarget(). Either replace the Astronaut model with one of your own, or add the following to render a semi-transparent cube:

// Draw Cube
glUniform4f(mUniformColorColorHandle, 1.0, 0.0, 0.0, 0.1);
glDrawElements(GL_TRIANGLES, NUM_CUBE_INDEX, GL_UNSIGNED_SHORT, (const GLvoid*)&cubeIndices[0]);

It may be necessary to realign the center of the augmented content with the Object Target, because the origin of the target's coordinate system is at the Vuforia Object Scanner target image rather than at the object's center. To change the placement of your content, use translateMatrix():

MathUtils::translateMatrix({ -0.03f, 0, -0.02f }, adjustedModelViewMatrix); // Move to center

Learn More

ObjectTracker API Overview

How to Use Object Recognition in Unity