Ground Plane Native Sample User Guide

The Vuforia Ground Plane sample for iOS shows how to configure and initialize a Ground Plane app using the Vuforia C++ API, and how to manage the UX and lifecycle of a Ground Plane experience.

This sample also demonstrates recommended practices for managing Ground Plane app resources and user experience, and includes assets you can use in your own app's UI.


Project Organization

Importing the sample

  1. Download the Vuforia Engine for iOS
  2. Unpack into your development environment
  3. Download the Ground Plane sample for iOS
  4. Unpack into the Samples folder of your Vuforia Engine for iOS directory

 

Note: Ground Plane is supported with Vuforia 7.0.36 and later. It is recommended that you use the Vuforia Engine version corresponding to each Ground Plane sample release to ensure API consistency. Ground Plane is only compatible with devices supported by Platform Enablers (ARKit/ARCore) or devices that have been specifically calibrated by Vuforia. See Ground Plane Supported Devices for a list of officially supported devices.

Project Structure

The Ground Plane sample libraries are found in the Features and SampleApplication folders. To customize and extend the sample, you may need to modify the following libraries.

GroundPlaneEAGLView

Defines:

  • App Lifecycle
  • Content, UI, and Video Background Rendering
  • OpenGL ES Management

GroundPlaneViewController

Defines:

  • App States and Modes
    • Start / Stop
    • Pause / Resume
    • Setting the hit testing and anchor positioning UI
  • Sample Control
    • Smart Terrain Tracker lifecycle

SampleApplicationSession

Defines:

  • Ground Plane Sample Application Lifecycle

 

Note: Sample Application Session is part of the Vuforia Samples Applications Framework and not specific to the Ground Plane feature. You'll encounter this library in all of Vuforia's native iOS samples.


Sample Lifecycle

  1. Initialize Vuforia
  2. Start the AR Session
  3. Monitor and Respond to State Changes
  4. Deinitialize Vuforia

 

Register for app lifecycle notifications and initialize Vuforia in loadView (GroundPlaneViewController.mm, line 104)

// Use iOS notifications to pause/resume the AR session when the application
// enters (or returns from) the background
[[NSNotificationCenter defaultCenter]
	addObserver:self
	selector:@selector(pauseAR)
	name:UIApplicationWillResignActiveNotification
	object:nil];

[[NSNotificationCenter defaultCenter]
	addObserver:self
	selector:@selector(resumeAR)
	name:UIApplicationDidBecomeActiveNotification
	object:nil];

// initialize AR
[vapp initAR:Vuforia::GL_20 orientation:[[UIApplication sharedApplication] statusBarOrientation]];

// show loading animation while AR is being initialized
[self showLoadingAnimation];

 

Start the AR session once initialization is complete (SampleApplicationSession.mm, line 415)

- (bool) startAR:(Vuforia::CameraDevice::CAMERA_DIRECTION)camera error:(NSError **)error {
	CGSize ARViewBoundsSize = [self getCurrentARViewBoundsSize];
    
	// Start the camera.  This causes Vuforia to locate our EAGLView in the view
	// hierarchy, start a render thread, and then call renderFrameVuforia on the
	// view periodically
	if (! [self startCamera: camera viewWidth:ARViewBoundsSize.width andHeight:ARViewBoundsSize.height error:error]) {
		return NO;
	}
	self.cameraIsActive = YES;
	self.cameraIsStarted = YES;

	return YES;
}

 

Once the camera is started, renderFrameVuforia is called to set up the GL viewport rendering.

renderFrameVuforia then calls GroundPlaneEAGLView::renderFrameWithState (line 277), passing it the Vuforia::State and the projection matrix for the camera, together with the viewport rendering mode (mono vs. stereo).

Initialize the trackers for Ground Plane (GroundPlaneViewController.mm, line 447)

// Initialize the application trackers
- (bool) doInitTrackers {
	// Initialize positional device and smart terrain trackers
	Vuforia::TrackerManager& trackerManager = Vuforia::TrackerManager::getInstance();

	Vuforia::DeviceTracker* deviceTracker = static_cast<Vuforia::DeviceTracker*>
		(trackerManager.initTracker(Vuforia::PositionalDeviceTracker::getClassType()));
    
	if (deviceTracker == nullptr)
	{
		NSLog(@"Failed to initialize DeviceTracker.");
		return false;
	}
    
	Vuforia::Tracker* smartTerrain = trackerManager.initTracker(Vuforia::SmartTerrain::getClassType());
	if (smartTerrain == nullptr)
	{
		NSLog(@"Failed to initialize SmartTerrain.");
		return false;
	}

	return true;
}

Check the trackable results obtained from the tracker state. These provide poses for the device and for any anchors; you can use these results to register digital content. (GroundPlaneEAGLView.mm, line 339)

 

if (state.getNumTrackableResults() == 0) {
    SampleApplicationUtils::checkGlError("Render Frame, no trackables");
    NSLog(@"No trackables");
}
else {
    Vuforia::Matrix34F devicePoseTemp;
    BOOL renderAstronaut = NO;
    BOOL renderDrone = NO;

    for (int i = 0; i < state.getNumTrackableResults(); ++i) {
        // Get the trackable result
        const Vuforia::TrackableResult* result = state.getTrackableResult(i);

        Vuforia::Matrix44F modelViewMatrix = Vuforia::Tool::convertPose2GLMatrix(result->getPose());
        if (result->isOfType(Vuforia::DeviceTrackableResult::getClassType())) {
            devicePoseTemp = result->getPose();
            mDevicePoseMatrix = SampleApplicationUtils::Matrix44FTranspose(SampleApplicationUtils::Matrix44FInverse(modelViewMatrix));
            mIsDeviceResultAvailable = YES;
        } else if (result->isOfType(Vuforia::AnchorResult::getClassType())) {
            mIsAnchorResultAvailable = YES;
            mAnchorResultsCount++;

            if (!strcmp(result->getTrackable().getName(), HIT_TEST_ANCHOR_NAME))
            {
                renderAstronaut = YES;
                mHitTestPoseMatrix = modelViewMatrix;
            }

            if (!strcmp(result->getTrackable().getName(), MID_AIR_ANCHOR_NAME))
            {
                renderDrone = YES;
                mMidAirPoseMatrix = modelViewMatrix;
            }
        }
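In the loop above, the device pose matrix is recovered by inverting (and transposing) the device's model-view matrix. For a rigid transform [R|t], the inverse is [Rᵀ|-Rᵀt], which is cheaper and more numerically stable than a general 4x4 inverse. Below is a minimal, self-contained sketch of that idea; `invertRigid` is a hypothetical helper, not the actual SampleApplicationUtils implementation:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<float, 16>; // column-major, OpenGL-style

// Invert a rigid-body transform [R|t]: the inverse is [R^T | -R^T t].
Mat4 invertRigid(const Mat4& m)
{
    Mat4 inv{};
    // Transpose the 3x3 rotation block.
    for (int row = 0; row < 3; ++row)
        for (int col = 0; col < 3; ++col)
            inv[col * 4 + row] = m[row * 4 + col];
    // New translation is -R^T * t, where t = (m[12], m[13], m[14]).
    for (int row = 0; row < 3; ++row)
    {
        float s = 0.0f;
        for (int col = 0; col < 3; ++col)
            s += inv[col * 4 + row] * m[12 + col];
        inv[12 + row] = -s;
    }
    inv[15] = 1.0f;
    return inv;
}
```

This mirrors what the chained Matrix44FInverse/Matrix44FTranspose calls achieve for camera poses, which are always rigid transforms.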

Now render the result (GroundPlaneEAGLView.mm, Line 481)

 

        // If we have both a device result and an anchor result, translate and scale the models
        // so they are positioned at the center of the reticle. The drone is additionally offset
        // vertically, since it is anchored in mid-air
        if(mIsDeviceResultAvailable && mIsAnchorResultAvailable)
        {
            if (renderAstronaut)
            {
                Vuforia::Matrix44F astronautMV = mHitTestPoseMatrix;
                SampleApplicationUtils::translatePoseMatrix(-0.375f, 0, 0, astronautMV);
                SampleApplicationUtils::scalePoseMatrix(10.0f, 10.0f, 10.0f, astronautMV);
                
                [self renderModelV3D:mAstronaut withPoseMatrix:astronautMV projectionMatrix:projectionMatrix andTextureId:0];
            }
            
            if(renderDrone)
            {
                Vuforia::Matrix44F droneMV = mMidAirPoseMatrix;
                SampleApplicationUtils::translatePoseMatrix(-0.75f, -0.375f, 0, droneMV);
                SampleApplicationUtils::scalePoseMatrix(10.0f, 10.0f, 10.0f, droneMV);
                
                [self renderModelV3D:mDrone withPoseMatrix:droneMV projectionMatrix:projectionMatrix andTextureId:1];
            }
            if(renderFurniture)
            {
                float furnitureMVPMatrix[16];
                
                Vuforia::Matrix44F furnitureMV = mFurniturePoseMatrix;
                Vuforia::Matrix44F shadowMV = mFurniturePoseMatrix;
                Vuforia::Matrix44F gesturesIndicatorMV = mFurniturePoseMatrix;
                
                SampleApplicationUtils::rotatePoseMatrix(mProductRotation, 0, 1.0f, 0, furnitureMV);
                SampleApplicationUtils::scalePoseMatrix(mProductScale, mProductScale, mProductScale, furnitureMV);
                Vuforia::Matrix44F poseMatrix = SampleApplicationUtils::Matrix44FIdentity();
                SampleApplicationUtils::multiplyMatrix(mDevicePoseMatrix, furnitureMV, poseMatrix);
                
                SampleApplicationUtils::multiplyMatrix(&projectionMatrix.data[0], &poseMatrix.data[0], &furnitureMVPMatrix[0]);
       
                float shadowScale = mProductScale * .5f;
                SampleApplicationUtils::rotatePoseMatrix(-90, 1.0, 0.0f, 0, shadowMV);
                SampleApplicationUtils::scalePoseMatrix(shadowScale, shadowScale, shadowScale, shadowMV);

                SampleApplicationUtils::rotatePoseMatrix(-90, 1.0, 0.0f, 0, gesturesIndicatorMV);
                SampleApplicationUtils::scalePoseMatrix(mProductScale, mProductScale, mProductScale, gesturesIndicatorMV);
                
                // Disable depth test so the shadow does not occlude the furniture
                glDisable(GL_DEPTH_TEST);
                [self renderPlaneTexturedWithProjectionMatrix:projectionMatrix MV:shadowMV textureHandle:SHADOW_TEXTURE_INDEX substractColor:YES is2DRender:NO];
                
                if(mProductPlacementState != ProductPlacementStates::IDLE)
                {
                    [self renderPlaneTexturedWithProjectionMatrix:projectionMatrix MV:gesturesIndicatorMV
                                                    textureHandle:(mProductPlacementState == ProductPlacementStates::TRANSLATING) ? TRANSLATE_TEXTURE_INDEX : ROTATE_TEXTURE_INDEX
                                                   substractColor:NO
                                                       is2DRender:NO];
                }

                glEnable(GL_DEPTH_TEST);
                
                float lightColor[] = {mAmbientLightIntensity, mAmbientLightIntensity, mAmbientLightIntensity, 1.0f};
                [mFurniture setLightColor:lightColor];
                [mFurniture renderWithModelView:&furnitureMV.data[0] modelViewProjMatrix:&furnitureMVPMatrix[0]];
            }
        }
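The translate/scale helpers used above post-multiply the pose matrix: calling translatePoseMatrix and then scalePoseMatrix yields M·T·S, so the scale is applied in the model's local frame and the translation offset is expressed in the anchor's units. A minimal column-major sketch of that convention follows; these are hypothetical stand-ins for, not the actual, SampleApplicationUtils helpers:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<float, 16>; // column-major, OpenGL-style

// result = a * b
Mat4 multiply(const Mat4& a, const Mat4& b)
{
    Mat4 r{};
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                r[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
    return r;
}

// Post-multiply m by a translation, i.e. m = m * T(x, y, z)
void translatePoseMatrix(float x, float y, float z, Mat4& m)
{
    Mat4 t = { 1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  x, y, z, 1 };
    m = multiply(m, t);
}

// Post-multiply m by a scale, i.e. m = m * S(x, y, z)
void scalePoseMatrix(float x, float y, float z, Mat4& m)
{
    Mat4 s = { x, 0, 0, 0,  0, y, 0, 0,  0, 0, z, 0,  0, 0, 0, 1 };
    m = multiply(m, s);
}
```

With this convention, the astronaut's `translatePoseMatrix(-0.375f, 0, 0, ...)` followed by `scalePoseMatrix(10.0f, ...)` shifts the model in anchor units first and then enlarges it, keeping it centered on the reticle.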

 

Vuforia is deinitialized when viewWillDisappear is called (GroundPlaneViewController.mm, Line 174)

 

- (void)viewWillDisappear:(BOOL)animated
{
    // on iOS 7, viewWillDisappear may be called when the menu is shown
    // but we don't want to stop the AR view in that case
    if (self.showingMenu) {
        return;
    }
    
    [vapp stopAR:nil];
    
    // Be a good OpenGL ES citizen: now that Vuforia is paused and the render
    // thread is not executing, inform the root view controller that the
    // EAGLView should finish any OpenGL ES commands
    [self finishOpenGLESCommands];
    
    GroundPlaneAppDelegate *appDelegate = (GroundPlaneAppDelegate*)[[UIApplication sharedApplication] delegate];
    appDelegate.glResourceHandler = nil;
    
    [super viewWillDisappear:animated];
}

This is the correct way to stop and deinitialize the Vuforia Engine on iOS (SampleApplicationSession.mm, line 430).

 

// Stop Vuforia camera
- (bool)stopAR:(NSError **)error {
    // Stop the camera
    if (self.cameraIsActive) {
        // Stop and deinit the camera
        Vuforia::CameraDevice& cameraDevice = Vuforia::CameraDevice::getInstance();
        cameraDevice.stop();
        cameraDevice.deinit();
        self.cameraIsActive = NO;
    }
    self.cameraIsStarted = NO;

    // ask the application to stop the trackers
    if(! [self.delegate doStopTrackers]) {
        [self NSErrorWithCode:E_STOPPING_TRACKERS error:error];
        return NO;
    }
    
    // ask the application to unload the data associated to the trackers
    if(! [self.delegate doUnloadTrackersData]) {
        [self NSErrorWithCode:E_UNLOADING_TRACKERS_DATA error:error];
        return NO;
    }
    
    // ask the application to deinit the trackers
    if(! [self.delegate doDeinitTrackers]) {
        [self NSErrorWithCode:E_DEINIT_TRACKERS error:error];
        return NO;
    }
    
    // Pause and deinitialize Vuforia
    Vuforia::onPause();
    Vuforia::deinit();
    
    return YES;
}

Deinitialize the trackers used by Ground Plane (GroundPlaneViewController.mm, line 578)

- (bool) doDeinitTrackers {
    Vuforia::TrackerManager& trackerManager = Vuforia::TrackerManager::getInstance();
    trackerManager.deinitTracker(Vuforia::SmartTerrain::getClassType());
    trackerManager.deinitTracker(Vuforia::PositionalDeviceTracker::getClassType());
    return YES;
}

Anchor Creation, Event Handling and UI Customization

 

Generating Anchor Points by Hit Testing 

See: GroundPlaneEAGLView.mm, line 771, in performHitTestWithNormalizedTouchPointX

    Vuforia::Vec2F hitTestPoint(normalizedTouchPointX, normalizedTouchPointY);
    Vuforia::SmartTerrain::HITTEST_HINT hitTestHint = Vuforia::SmartTerrain::HITTEST_HINT_NONE; // hit test hint is currently unused

    // A hit test is performed for a given State at normalized screen coordinates.
    // The deviceHeight is a developer-provided assumption, as explained in the
    // definition of DEFAULT_HEIGHT_ABOVE_GROUND.
    smartTerrain->hitTest(state, hitTestPoint, deviceHeightInMeters, hitTestHint);

    if (smartTerrain->getHitTestResultCount() > 0)
    {
        // Use first HitTestResult
        const Vuforia::HitTestResult* hitTestResult = smartTerrain->getHitTestResult(0);
        
        if (createAnchor)
        {
            if(mCurrentMode == SAMPLE_APP_INTERACTIVE_MODE)
            {
                // Destroy previous hit test anchor if needed
                if (mHitTestAnchor != nullptr)
                {
                    NSLog(@"Destroying hit test anchor with name '%s'", HIT_TEST_ANCHOR_NAME);
                    bool result = deviceTracker->destroyAnchor(mHitTestAnchor);
                    NSLog(@"%s hit test anchor", (result ? "Successfully destroyed" : "Failed to destroy"));
                }
                
                mHitTestAnchor = deviceTracker->createAnchor(HIT_TEST_ANCHOR_NAME, *hitTestResult);
                if (mHitTestAnchor != nullptr)
                {
                    NSLog(@"Successfully created hit test anchor with name '%s'", mHitTestAnchor->getName());
                    
                    if(mAnchorResultsCount == 0)
                        [self.uiUpdater setMidAirModeEnabled:YES];
                }
                else
                {
                    NSLog(@"Failed to create hit test anchor");
                }
            }
            else if(mCurrentMode == SAMPLE_APP_FURNITURE_MODE)
            {
                // Destroy previous hit test anchor if needed
                if (mFurnitureAnchor != nullptr)
                {
                    NSLog(@"Destroying hit test anchor with name '%s'", FURNITURE_ANCHOR_NAME);
                    bool result = deviceTracker->destroyAnchor(mFurnitureAnchor);
                    NSLog(@"%s hit test anchor", (result ? "Successfully destroyed" : "Failed to destroy"));
                }
                
                mFurnitureAnchor = deviceTracker->createAnchor(FURNITURE_ANCHOR_NAME, *hitTestResult);
                if (mFurnitureAnchor != nullptr)
                {
                    NSLog(@"Successfully created hit test anchor with name '%s'", mFurnitureAnchor->getName());
                    
                    if(mAnchorResultsCount == 0)
                        [self.uiUpdater setMidAirModeEnabled:YES];
                    
                    [mFurniture setTransparency:1.0f];
                }
                else
                {
                    NSLog(@"Failed to create hit test anchor");
                }

                mIsFurnitureBeingDragged = NO;
            }
        }
        
        if(mCurrentMode == SAMPLE_APP_FURNITURE_MODE)
            mFurnitureTranslationPoseMatrix = Vuforia::Tool::convertPose2GLMatrix(hitTestResult->getPose());

        mReticlePose = Vuforia::Tool::convertPose2GLMatrix(hitTestResult->getPose());
        mIsAnchorResultAvailable = YES;
        return YES;
    }
    else
    {
        NSLog(@"Hit test returned no results");
        return NO;
    }
}

Creating Anchor Points in Mid-Air

See: GroundPlaneEAGLView.mm, line 733

- (void) createMidAirAnchorWithPose:(Vuforia::Matrix34F&) anchorPoseMatrix
{
    NSLog(@"createMidAirAnchor");
    
    Vuforia::TrackerManager& trackerManager = Vuforia::TrackerManager::getInstance();
    Vuforia::PositionalDeviceTracker* deviceTracker = static_cast<Vuforia::PositionalDeviceTracker*>(trackerManager.getTracker(Vuforia::PositionalDeviceTracker::getClassType()));

    if (mMidAirAnchor != nullptr)
    {
        NSLog(@"Destroying mid-air anchor with name '%s'", MID_AIR_ANCHOR_NAME);
        bool result = deviceTracker->destroyAnchor(mMidAirAnchor);
        NSLog(@"%s mid-air anchor", (result ? "Successfully destroyed" : "Failed to destroy"));
    }
    
    mMidAirAnchor = deviceTracker->createAnchor(MID_AIR_ANCHOR_NAME, anchorPoseMatrix);
    
    if (mMidAirAnchor != nullptr)
    {
        NSLog(@"Successfully created mid-air anchor with name '%s'", mMidAirAnchor->getName());
    }
    else
    {
        NSLog(@"Failed to create mid-air anchor");
    }
}

Rendering and Customizing Reticles and Indicators

See: GroundPlaneEAGLView.mm, line 60

UI textures are defined in the textureFilenames array and loaded from the /Resources/Assets folder:

    const char* textureFilenames[] = {
        "astronaut.png",
        "drone.png",
        "interactive-reticle.png",
        "midair-reticle.png",
        "interactive-reticle-3d.png",
        "shadow.png",
        "translate-reticle.png",
        "rotate-reticle.png",
    };
    
    const char* MID_AIR_ANCHOR_NAME = "midAirAnchor";
    const char* HIT_TEST_ANCHOR_NAME = "hitTestAnchor";
    const char* FURNITURE_ANCHOR_NAME = "furnitureAnchor";

The sample reticle is rendered in both an orthographic (interactive-reticle.png or midair-reticle.png) and a perspective (interactive-reticle-3d.png) state. The orthographic rendering is used for mid-air anchoring, and also when surface finding has been initialized but no viable surface is in view. Perspective rendering is used once a surface is found, to indicate the surface's viability and position.

See: GroundPlaneEAGLView.mm, line 612

- (void)renderReticleWithProjectionMatrix:(Vuforia::Matrix44F&)projectionMatrix isReticle2D:(BOOL)isReticle2D
{
    
    const int PLANE_2D_RETICLE_TEXTURE_INDEX = 2;
    const int MIDAIR_RETICLE_TEXTURE_INDEX = 3;
    const int PLANE_3D_RETICLE_TEXTURE_INDEX = 4;
    
    unsigned int textureIndex = mCurrentMode == SAMPLE_APP_INTERACTIVE_MODE ? PLANE_2D_RETICLE_TEXTURE_INDEX : MIDAIR_RETICLE_TEXTURE_INDEX;
    
    if(mCurrentMode == SAMPLE_APP_INTERACTIVE_MODE && !isReticle2D)
        textureIndex = PLANE_3D_RETICLE_TEXTURE_INDEX;
    
    Vuforia::Matrix44F reticleMV = mReticlePose;
    SampleApplicationUtils::scalePoseMatrix(0.25f, 0.25f, 0.25f, reticleMV);

    // Rotate the reticle so it sits on the target plane instead of intersecting it
    SampleApplicationUtils::rotatePoseMatrix(90, -1, 0, 0, reticleMV);

    [self renderPlaneTexturedWithProjectionMatrix:projectionMatrix MV:reticleMV textureHandle:augmentationTexture[textureIndex].textureID substractColor:NO is2DRender:isReticle2D];
}

Accessing Light Estimation

Light estimation values are provided by the underlying platform technologies, such as ARKit. The sample demonstrates how to incorporate these values in the renderFrameWithState method (line 444).

        auto illumination = state.getIllumination();
        if (illumination != nullptr)
        {
            float ambientIntensity = illumination->getAmbientIntensity();
            if (ambientIntensity != Vuforia::Illumination::AMBIENT_INTENSITY_UNAVAILABLE)
            {
                // We set the model lighting treating the sample's working range as 200 - 1500 lumens.
                // Low brightness is considered here to be 200 lumens (roughly a dimmed light bulb) and
                // full brightness 1500 lumens (a bright standard bulb); brighter sources can report higher values
                mAmbientLightIntensity = fmin(1.0f, .1f + (ambientIntensity - 200.0f) / 1300.0f);
            }
        }