Ground Plane Native Android Sample User Guide

The Vuforia Ground Plane sample for Android shows how to configure and initialize a Ground Plane app using the Vuforia Java API, and how to manage the UX and lifecycle of a Ground Plane experience.

This sample demonstrates recommended practices for managing Ground Plane app resources and the user experience, and includes assets that you can reuse in your own app UI.

Index

The Ground Plane Native iOS Sample User Guide can be found here.

Project Organization

Importing the sample

  • Download the Vuforia Engine for Android
  • Unzip into your development environment
  • Download the Ground Plane sample for Android
  • Unzip into the Samples folder of your Vuforia Engine Android directory

NOTE: Ground Plane is supported with Vuforia Engine 7.0.36 and later. It is recommended that you use the Vuforia Engine version corresponding to each Ground Plane sample release to ensure API consistency. Ground Plane is only compatible with devices supported by Platform Enablers (ARKit/ARCore) or devices that have been specifically calibrated by Vuforia Engine. See Ground Plane Supported Devices for a list of officially supported devices.

Project Structure

The Ground Plane sample libraries are found in the CoreSamples and SampleApplication folders. To customize and extend the sample, you may need to modify the following files.

GroundPlane 

Defines

  • App States and Activity
    • Start / Stop
    • Pause / Resume
    • Hit testing
    • UI Handling
  • Sample Control
  • Smart Terrain Tracker lifecycle

SampleApplicationSession

Defines

  • App Lifecycle

GroundPlaneRenderer

Defines

  • Ground Plane Sample Application Lifecycle
  • Content Video Background Rendering
  • OpenGL ES Management
  • Anchor positioning of the reticles

NOTE: SampleApplicationSession is part of the Vuforia Sample Applications Framework and is not specific to the Ground Plane feature. You'll encounter this library in all of Vuforia Engine's native Android samples.
 

Vuforia Android sample file structure

Sample Lifecycle

  1. Initialize Vuforia Engine
  2. Start the AR Session
  3. Monitor and Respond to State Changes
  4. Deinitialize Vuforia Engine

Initialize Vuforia Engine (GroundPlane.java, line 119)

vuforiaAppSession = new SampleApplicationSession(this);

vuforiaAppSession 
        .initAR(this, ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

Start the AR session once initialization is complete (GroundPlane.java, line 423)

@Override
public void onInitARDone(SampleApplicationException exception)
{
    // ...
    vuforiaAppSession.startAR();
    // ...
}

Initialize trackers for Ground Plane (GroundPlane.java, line 553)

public boolean doInitTrackers()
    {
        // Initialize the Positional Device and Smart Terrain Trackers
        TrackerManager trackerManager = TrackerManager.getInstance();

        DeviceTracker deviceTracker = (PositionalDeviceTracker)
                trackerManager.initTracker(PositionalDeviceTracker.getClassType());

        Tracker smartTerrain = trackerManager.initTracker(SmartTerrain.getClassType());

        boolean trackersInitialized = true;

        if (deviceTracker != null)
        {
            Log.i(LOGTAG, "Successfully initialized Device Tracker");
        }
        else
        {
            Log.e(LOGTAG, "Failed to initialize Device Tracker");
            trackersInitialized = false;
        }

        if (smartTerrain != null)
        {
            Log.i(LOGTAG, "Successfully initialized Smart Terrain");
        }
        else
        {
            Log.e(LOGTAG, "Failed to initialize Smart Terrain");
            trackersInitialized = false;
        }

        if(!trackersInitialized)
        {
            showInitializationErrorMessage(getString(R.string.INIT_ERROR_TRACKERS_NOT_INITIALIZED), true);
        }

        mTrackersSuccessfullyInitialized = trackersInitialized;

        return trackersInitialized;
    }

Check the Trackable Results obtained from the renderFrame() function in the GroundPlaneRenderer file. These will provide poses for the device and any Anchors. You can use these results to register digital content.

Check trackable results (GroundPlaneRenderer.java, line 424)

if (state.getTrackableResults().empty())
    {
        Log.i(LOGTAG, "No trackables");
    }
    else
    {
        Matrix34F devicePoseTemp = new Matrix34F();
        boolean furnitureAnchorExists = false;
        TrackableResultList trackableResultList = state.getTrackableResults();
        // Determine if target is currently being tracked
        setIsTargetCurrentlyTracked(trackableResultList);
        // Iterate through trackable results and render any augmentations
        for (TrackableResult result : trackableResultList)
        {
            Matrix44F modelViewMatrix = Tool.convertPose2GLMatrix(result.getPose());
            Matrix44F projMatrix = new Matrix44F();
            projMatrix.setData(projectionMatrix);
            // ... (handling of the device pose result elided)
            // Look for an anchor pose so that we can place the model there
            else if (result.isOfType(AnchorResult.getClassType()))
            {
                mIsAnchorResultAvailable = true;
                if (result.getTrackable().getName().equals(HIT_TEST_ANCHOR_NAME))
                {
                    renderAstronaut = true;
                    mHitTestPoseMatrix = modelViewMatrix;
                }
                if (result.getTrackable().getName().equals(MID_AIR_ANCHOR_NAME))
                {
                    renderDrone = true;
                    mMidAirPoseMatrix = modelViewMatrix;
                }
                if (result.getTrackable().getName().equals(FURNITURE_ANCHOR_NAME))
                {
                    furnitureAnchorExists = true;
                    if(!mRepositionFurniture)
                    {
                        renderFurniture = true;
                    }
                    mIsFurniturePlaced = true;
                    if (mIsModelTranslating)
                    {
                        updateFurnitureMatrix(state, translateCoords);
                    }
                    else
                    {
                        mFurniturePoseMatrix = modelViewMatrix;
                    }
                }
            }
        }
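
The elided branch in the loop above handles the device pose result. A sketch of that branch, assuming the DeviceTrackableResult type from the Vuforia Java API and the SampleMath helpers from the samples framework:

```java
// Sketch only: handle the device pose result elided from the loop above.
// DeviceTrackableResult is assumed from the Vuforia Java API; SampleMath
// comes from the SampleApplication utilities.
if (result.isOfType(DeviceTrackableResult.getClassType()))
{
    // The device pose is the camera's pose in world space; invert it
    // to obtain the view matrix used to render world-anchored content
    mDevicePoseMatrix = SampleMath.Matrix44FInverse(
            SampleMath.Matrix44FTranspose(modelViewMatrix));
    mIsDeviceResultAvailable = true;
}
```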

Render the anchor content (GroundPlaneRenderer.java, line 623)

// If we have the device and anchor results, we translate and scale the models
// so they are positioned at the center of the reticle.
// For the drone, we also want to align it vertically since it is in mid-air
      if (mIsDeviceResultAvailable && mIsAnchorResultAvailable)
      {
          if (renderFurniture)
          {
              float[] chairMV = mFurniturePoseMatrix.getData();
              float[] chairMVP = new float[16];
              Matrix.translateM(chairMV, 0, 0f, 0f, 0);
              Matrix.rotateM(chairMV, 0, mProductRotation + mNewProductRotation, 0.0f, 1.0f, 0.0f);
              Matrix.scaleM(chairMV, 0, mProductScale, mProductScale, mProductScale);
              Matrix.multiplyMM(chairMV, 0, mDevicePoseMatrix.getData(), 0, chairMV, 0);
              Matrix.multiplyMM(chairMVP, 0, projectionMatrix, 0, chairMV, 0);
              float shadowScale = mProductScale * 1.0f;
              float gesturesScale = mProductScale * 2.0f;
              float[] shadowMV = mFurniturePoseMatrix.getData();
              float[] gesturesMV = mFurniturePoseMatrix.getData();
              Matrix.rotateM(shadowMV, 0, -90, 1.0f, 0.0f, 0.0f);
              Matrix.scaleM(shadowMV, 0, shadowScale, shadowScale, shadowScale);
              Matrix.rotateM(gesturesMV, 0, -90, 1.0f, 0.0f, 0.0f);
              Matrix.scaleM(gesturesMV, 0, gesturesScale, gesturesScale, gesturesScale);
              GLES20.glDisable(GLES20.GL_DEPTH_TEST);
              Matrix44F projMatrix = new Matrix44F();
              projMatrix.setData(projectionMatrix);
              Matrix44F shadowMVMatrix = new Matrix44F();
              shadowMVMatrix.setData(shadowMV);
              Matrix44F gesturesMVMatrix = new Matrix44F();
              gesturesMVMatrix.setData(gesturesMV);
              // Renders the shadow that will be placed underneath the furniture
              renderPlaneTexturedWithProjectionMatrix(projMatrix, shadowMVMatrix, mTextures.get(SHADOW_TEXTURE_INDEX).mTextureID[0], true, false);
              // If any gestures are being performed on the furniture,
              // render the corresponding texture
              if(mProductPlacementState != PRODUCT_PLACEMENT_STATE_IDLE)
              {
                  int gestureTexture = isModelRotating() ? ROTATE_TEXTURE_INDEX : TRANSLATE_TEXTURE_INDEX;
                  renderPlaneTexturedWithProjectionMatrix(projMatrix, gesturesMVMatrix, mTextures.get(gestureTexture).mTextureID[0], false, false);
              }
              GLES20.glEnable(GLES20.GL_DEPTH_TEST);
              mFurniture.setRenderingColorCorrection(mColorCorrection, mIntensityCorrection);
              mFurniture.render(chairMV, chairMVP);
          }
          if (renderAstronaut)
          {
              float[] astronautMV = mHitTestPoseMatrix.getData();
              Matrix.translateM(astronautMV, 0, -0.30f, 0, 0);
              Matrix.scaleM(astronautMV, 0, 10f, 10f, 10f);
              renderModelV3D(mAstronaut, astronautMV,
                      projectionMatrix, SAMPLE_APP_INTERACTIVE_MODE);
          }
          if (renderDrone)
          {
              float[] droneMV = mMidAirPoseMatrix.getData();
              Matrix.translateM(droneMV, 0, -0.75f, -0.375f, -0.75f);
              Matrix.scaleM(droneMV, 0, 10f, 10f, 10f);
              renderModelV3D(mDrone, droneMV,
                      projectionMatrix, SAMPLE_APP_MIDAIR_MODE);
          }
      }
  }

For reticle rendering, see line 695 in GroundPlaneRenderer.java.

Deinitialize Vuforia Engine for Android and the trackers (GroundPlane.java, line 669)

@Override
public boolean doDeinitTrackers()
{
    TrackerManager trackerManager = TrackerManager.getInstance();

    if (trackerManager.deinitTracker(PositionalDeviceTracker.getClassType()))
    {
        Log.i(LOGTAG, "Successfully deinit Device Tracker");
    }
    else
    {
        Log.e(LOGTAG, "Failed to deinit Device Tracker");
        return false;
    }

    if (trackerManager.deinitTracker(SmartTerrain.getClassType()))
    {
        Log.i(LOGTAG, "Successfully deinit Smart Terrain");
    }
    else
    {
        Log.e(LOGTAG, "Failed to deinit Smart Terrain");
        return false;
    }

    return true;
}
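
doDeinitTrackers() is invoked as part of the session teardown. In the samples framework this is typically driven from the Activity's onDestroy(); a sketch, assuming SampleApplicationSession.stopAR() as used across the native samples:

```java
// Sketch only: tear down the AR session when the Activity is destroyed.
// stopAR() stops and deinitializes Vuforia Engine, invoking
// doDeinitTrackers() along the way.
@Override
protected void onDestroy()
{
    super.onDestroy();
    try
    {
        vuforiaAppSession.stopAR();
    }
    catch (SampleApplicationException e)
    {
        Log.e(LOGTAG, e.getString());
    }
}
```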

Anchor Creation, Event Handling, and UI Customization

Generating Anchor Points by Hit Testing (GroundPlaneRenderer.java, line 912)

private boolean performHitTest(State state, float normalTouchPointX, float normalTouchPointY,
                               boolean createAnchor)
{
    Log.i(LOGTAG, "Perform hit test with normalized touch point ("
            + normalTouchPointX + ", " + normalTouchPointY + ")");

    TrackerManager trackerManager = TrackerManager.getInstance();
    PositionalDeviceTracker deviceTracker = (PositionalDeviceTracker) trackerManager.getTracker(PositionalDeviceTracker.getClassType());
    SmartTerrain smartTerrain = (SmartTerrain) trackerManager.getTracker(SmartTerrain.getClassType());

    if (deviceTracker == null || smartTerrain == null)
    {
        Log.e(LOGTAG, "Failed to perform hit test, trackers not initialized");
        return false;
    }

    Vec2F hitTestPoint = new Vec2F(normalTouchPointX, normalTouchPointY);
    int hitTestHint = SmartTerrain.HITTEST_HINT.HITTEST_HINT_NONE; // hit test hint is currently unused

    // A hit test is performed for a given State at normalized screen coordinates.
    // The deviceHeight is a developer provided assumption as explained in the
    // definition of DEFAULT_HEIGHT_ABOVE_GROUND.
    HitTestResultList hitTestResults = smartTerrain.hitTest(hitTestPoint, hitTestHint, state, DEFAULT_HEIGHT_ABOVE_GROUND);

    if (!hitTestResults.empty())
    {
        // Use first HitTestResult
        final HitTestResult hitTestResult = hitTestResults.at(0);

        if (createAnchor)
        {
            createSurfaceAnchor(hitTestResult);
        }

        mReticlePose = Tool.convertPose2GLMatrix(hitTestResult.getPose());
        mIsAnchorResultAvailable = true;
        return true;
    }
    else
    {
        Log.i(LOGTAG, "Hit test returned no results");
        return false;
    }
}
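
performHitTest() expects the touch point in normalized screen coordinates in the range [0, 1]. A small helper for converting a raw touch position in view pixels might look like this (the helper class and method names are hypothetical, not part of the sample):

```java
// Hypothetical helper: convert a touch position in view pixels to the
// normalized [0, 1] coordinates expected by SmartTerrain.hitTest()
public class TouchNormalizer
{
    public static float[] normalize(float touchX, float touchY,
                                    float viewWidth, float viewHeight)
    {
        return new float[] { touchX / viewWidth, touchY / viewHeight };
    }
}
```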

Creating Anchor Points in Mid-Air (GroundPlaneRenderer.java, line 1015)

private void createMidAirAnchor(Matrix34F anchorPoseMatrix)
{
    Log.i(LOGTAG, "Create Mid Air Anchor");

    TrackerManager trackerManager = TrackerManager.getInstance();
    PositionalDeviceTracker deviceTracker = (PositionalDeviceTracker)
            trackerManager.getTracker(PositionalDeviceTracker.getClassType());

    if (mMidAirAnchor != null)
    {
        Log.i(LOGTAG, "Destroying hit test anchor with name " + MID_AIR_ANCHOR_NAME);
        boolean result = deviceTracker.destroyAnchor(mMidAirAnchor);
        Log.i(LOGTAG, "Hit test anchor " + (result ? "successfully destroyed" : "failed to destroy"));
    }

    mMidAirAnchor = deviceTracker.createAnchor(MID_AIR_ANCHOR_NAME, anchorPoseMatrix);

    if (mMidAirAnchor != null)
    {
        Log.i(LOGTAG, "Successfully created hit test anchor with name " + mMidAirAnchor.getName());
    }
    else
    {
        Log.e(LOGTAG, "Failed to create mid air anchor");
    }
    AnchorList anchors = deviceTracker.getAnchors();
    Log.i(LOGTAG, "Number of anchors: " + anchors.size());
}
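
The createSurfaceAnchor() helper invoked from performHitTest() is not reproduced in this guide. A minimal sketch, modeled on createMidAirAnchor() above and assuming the PositionalDeviceTracker.createAnchor(String, HitTestResult) overload of the Vuforia Java API (the mHitTestAnchor field name is an assumption):

```java
// Sketch only: create a surface anchor from a hit test result.
// mHitTestAnchor is a hypothetical field, analogous to mMidAirAnchor.
private void createSurfaceAnchor(HitTestResult hitTestResult)
{
    TrackerManager trackerManager = TrackerManager.getInstance();
    PositionalDeviceTracker deviceTracker = (PositionalDeviceTracker)
            trackerManager.getTracker(PositionalDeviceTracker.getClassType());

    // Destroy any previous surface anchor before placing a new one
    if (mHitTestAnchor != null)
    {
        deviceTracker.destroyAnchor(mHitTestAnchor);
    }

    // Create an anchor at the pose returned by the hit test
    mHitTestAnchor = deviceTracker.createAnchor(HIT_TEST_ANCHOR_NAME, hitTestResult);

    if (mHitTestAnchor == null)
    {
        Log.e(LOGTAG, "Failed to create hit test anchor");
    }
}
```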

Loading Textures and Reticles (GroundPlane.java, line 263)

Reticle textures are defined in the loadTextures() method and loaded from the /Assets/ folder:

// Load specific textures from the APK, which we will later use for rendering.
private void loadTextures()
{
    mTextures.add(Texture.loadTextureFromApk("astronaut.png", getAssets()));
    mTextures.add(Texture.loadTextureFromApk("drone.png", getAssets()));
    mTextures.add(Texture.loadTextureFromApk("GroundPlane/reticle_interactive_2d.png", getAssets()));
    mTextures.add(Texture.loadTextureFromApk("GroundPlane/reticle_midair.png", getAssets()));
    mTextures.add(Texture.loadTextureFromApk("GroundPlane/reticle_interactive_3d.png", getAssets()));
    mTextures.add(Texture.loadTextureFromApk("GroundPlane/shadow.png", getAssets()));
    mTextures.add(Texture.loadTextureFromApk("GroundPlane/reticle_translate.png", getAssets()));
    mTextures.add(Texture.loadTextureFromApk("GroundPlane/reticle_rotate.png", getAssets()));
}

The sample reticle is rendered in both an orthographic state (reticle_interactive_2d.png or reticle_midair.png) and a perspective state (reticle_interactive_3d.png). Orthographic rendering is used for mid-air anchoring, and also when surface finding has been initialized but no viable surface is in view. Perspective rendering is used once a surface is found, to indicate the surface's viability and position.

Illumination Values

It is possible to use the Vuforia::Illumination class to obtain a scene’s illumination values and use them to render a more realistic augmentation taking into account elements of the environment’s lighting. For more information, refer to the Using Vuforia Fusion Illumination article.
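
Under the assumption that the Illumination accessors described in that article (State.getIllumination(), getAmbientIntensity(), getAmbientColorTemperature()) are available in your Engine version, reading the values inside renderFrame() might look like this sketch:

```java
// Sketch only: read scene illumination values from the current State.
// Availability depends on the platform enabler (ARKit/ARCore); consult the
// API reference for the exact semantics in your Engine version.
Illumination illumination = state.getIllumination();
if (illumination != null)
{
    // Ambient intensity (lumens) and color temperature (Kelvin),
    // when the underlying platform provides them
    float ambientIntensity = illumination.getAmbientIntensity();
    float ambientColorTemperature = illumination.getAmbientColorTemperature();

    // These values can drive the color/intensity correction the sample
    // applies via mFurniture.setRenderingColorCorrection(...)
}
```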

 

Rendering and Customizing Reticles and Indicators (GroundPlaneRenderer.java, line 769)

private void renderReticleWithProjectionMatrix(Matrix44F projectionMatrix, boolean isReticle2D)
{
    int textureIndex = (mCurrentMode == SAMPLE_APP_INTERACTIVE_MODE || mCurrentMode == SAMPLE_APP_FURNITURE_MODE)
            ? PLANE_2D_RETICLE_TEXTURE_INDEX : MIDAIR_RETICLE_TEXTURE_INDEX;

    if (mCurrentMode == SAMPLE_APP_INTERACTIVE_MODE && !isReticle2D)
    {
        textureIndex = PLANE_3D_RETICLE_TEXTURE_INDEX;
    }

    Matrix44F reticleMV = new Matrix44F();
    reticleMV.setData(mReticlePose.getData());

    // We rotate the reticle so it sits on the plane where we intend to render it, instead of intersecting the plane
    float[] reticleTransform = reticleMV.getData();
    Matrix.rotateM(reticleTransform, 0, 90, -1, 0, 0);

    reticleMV.setData(reticleTransform);

    renderPlaneTexturedWithProjectionMatrix(projectionMatrix, reticleMV, mTextures.get(textureIndex).mTextureID[0], false, isReticle2D);
}
