How To Use Rendering Primitives


The RenderingPrimitives and Renderer classes are the two main components providing AR rendering support in the Vuforia API.

The Renderer class supports updating the video background texture and uses graphics resources directly (e.g. textures, rendering calls).

The RenderingPrimitives class, on the other hand, is renderer-agnostic (CPU data only) and provides all the logic you need for video-based monocular AR rendering.

RenderingPrimitives is built on the concept of rendering building blocks (the "primitives" in RenderingPrimitives) that you can easily combine to render your scene. These primitives cover common rendering concepts:

  • Viewport for your augmentation,
  • Projection Matrix for your augmentation,
  • Primitives object for video background rendering,
  • Projection Matrix for video background,
  • Etc.

The RenderingPrimitives are built using information from the tracking state as well as device information (screen orientation, device type). You can think of RenderingPrimitives as the rendering-side equivalent of the tracking state: a rendering state.


Rendering Primitives Usage for Mobile AR

Mobile AR rendering is technically equivalent to monocular video see-through AR rendering. This implies two major steps: rendering the video in the background, and rendering the augmentation as an overlay.

In your rendering application, the following workflow can be used to achieve this:

  1. Set up your viewport (via RenderingPrimitives)
  2. Video background rendering: orthographic rendering of a quad mesh with a video texture
    1. Update the video background texture from the tracking state (via Renderer)
    2. Retrieve the video background rendering primitives, such as the video background mesh and the video background projection matrix (via RenderingPrimitives)
    3. Render your video background mesh
  3. Augmentation rendering:
    1. Retrieve the projection matrix (via RenderingPrimitives)
    2. Retrieve the device pose matrix and invert it to get the View matrix
    3. For each detected trackable, compute the Model matrix
    4. Combine the Projection, View, and Model matrices to render the trackable
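The video background projection matrix retrieved in step 2 exists mainly to compensate for the mismatch between the camera frame's aspect ratio and the screen's. The following sketch shows the underlying "aspect fill" computation under simplified assumptions; computeVideoBackgroundScale is a hypothetical helper for illustration, not part of the Vuforia API:

```cpp
#include <algorithm>

// Hypothetical helper (not a Vuforia API): computes the X/Y scale for a
// video-background quad spanning [-1, 1] in normalized device coordinates,
// so that the video fills the screen while preserving its aspect ratio.
struct VideoBackgroundScale { float sx; float sy; };

inline VideoBackgroundScale computeVideoBackgroundScale(float screenW, float screenH,
                                                        float videoW, float videoH)
{
    // Scale the video uniformly until it covers the whole screen ("aspect fill").
    float scale = std::max(screenW / videoW, screenH / videoH);
    // Express the scaled video extents relative to the screen; the axis with
    // the larger extent overflows (> 1) and is cropped by the viewport.
    return { (videoW * scale) / screenW, (videoH * scale) / screenH };
}
```

For example, a 1280x720 landscape video on a 1080x1920 portrait screen yields sy = 1 and sx ≈ 3.16: the video fills the screen vertically and is cropped left and right.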

For all these steps, you can use the Vuforia::Tool class and the MathUtils class (from the sample apps) for additional mathematical transformations.
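Steps 3.1-3.4 reduce to matrix algebra: View = inverse(devicePose), then MVP = Projection * View * Model. The self-contained sketch below illustrates that math; Mat4, multiply and invertRigid are illustrative stand-ins for the Vuforia::Tool / MathUtils helpers, and use row-major storage for readability (the GL matrices returned by Vuforia are column-major):

```cpp
#include <array>

// Minimal 4x4 row-major matrix type, standing in for Vuforia::Matrix44F.
using Mat4 = std::array<float, 16>;

// r = a * b
inline Mat4 multiply(const Mat4& a, const Mat4& b)
{
    Mat4 r{};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                r[row * 4 + col] += a[row * 4 + k] * b[k * 4 + col];
    return r;
}

// Inverts a rigid transform [R | t] as [R^T | -R^T t]. This is exact for
// device/camera poses and much cheaper than a general 4x4 inverse, and is
// how the view matrix is obtained from the device pose in step 3.2.
inline Mat4 invertRigid(const Mat4& m)
{
    Mat4 r{0,0,0,0, 0,0,0,0, 0,0,0,0, 0,0,0,1};
    for (int row = 0; row < 3; ++row)
        for (int col = 0; col < 3; ++col)
            r[row * 4 + col] = m[col * 4 + row];   // rotation: R^T
    for (int row = 0; row < 3; ++row) {
        float v = 0.0f;
        for (int k = 0; k < 3; ++k)
            v += m[k * 4 + row] * m[k * 4 + 3];    // (R^T t)[row]
        r[row * 4 + 3] = -v;                       // translation: -R^T t
    }
    return r;
}
```

With these helpers, step 3.4 is modelViewProjection = multiply(projection, multiply(invertRigid(devicePose), modelMatrix)).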

Example code for the viewport setup and frame rendering

// 1. Set up the viewport
// We're writing directly to the screen, so the viewport is relative to the screen
Vuforia::Vec4I viewport = renderingPrimitives->getViewport(Vuforia::VIEW_SINGULAR);
// [Graphics API specific code to set up the viewport using OpenGL, Metal or DirectX]

// 2. Video background rendering
// 2.a Update the video background texture
// [..]
// 2.b Retrieve the video background rendering primitives
if (drawVideo)
{
    Vuforia::Matrix44F vbProjectionMatrix = Vuforia::Tool::convert2GLMatrix(
        renderingPrimitives->getVideoBackgroundProjectionMatrix(Vuforia::VIEW_SINGULAR));
    const Vuforia::Mesh& vbMesh = renderingPrimitives->getVideoBackgroundMesh(Vuforia::VIEW_SINGULAR);

    // 2.c Render the video background mesh
    // [Graphics API specific code to render the video background mesh using OpenGL, Metal or DirectX]
}
// 3. Augmentation rendering

// 3.1 Retrieve the projection matrix to use for the augmentation
Vuforia::Matrix44F projectionMatrix = Vuforia::Tool::convertPerspectiveProjection2GLMatrix(
    renderingPrimitives->getProjectionMatrix(Vuforia::VIEW_SINGULAR, state.getCameraCalibration()),
    NEAR_PLANE, FAR_PLANE);

// 3.2 Compute the View matrix
Vuforia::Matrix44F viewMatrix = MathUtils::Matrix44FIdentity();
// Read the device pose from the state and create the view matrix in the world coordinate system
if (state.getDeviceTrackableResult() != nullptr
    && state.getDeviceTrackableResult()->getStatus() != Vuforia::TrackableResult::NO_POSE)
{
    Vuforia::Matrix44F deviceMatrix = Vuforia::Tool::convertPose2GLMatrix(state.getDeviceTrackableResult()->getPose());
    // [invert deviceMatrix to get the view matrix]
    // viewMatrix = ...
}

// 3.3 Compute the Model matrix for each trackable
const auto& trackableResultList = state.getTrackableResults();
for (const auto& trackableResult : trackableResultList)
{
    if (trackableResult->isOfType(Vuforia::ImageTargetResult::getClassType()))
    {
        const Vuforia::ImageTargetResult* itResult = static_cast<const Vuforia::ImageTargetResult*>(trackableResult);
        const Vuforia::ImageTarget& it = itResult->getTrackable();
        Vuforia::Matrix44F modelMatrix = Vuforia::Tool::convertPose2GLMatrix(trackableResult->getPose());
        Vuforia::Matrix44F modelViewMatrix;
        MathUtils::multiplyMatrix(viewMatrix, modelMatrix, modelViewMatrix);

        // 3.4 Render your model using the combination of projectionMatrix and modelViewMatrix
        // [Graphics API specific code to render the augmentation using OpenGL, Metal or DirectX]
    }
}


Using the Rendering Primitives Sample Classes

The SampleAppRenderer class encapsulates the RenderingPrimitives usage, allowing a smooth migration from older native apps to the new implementation. Changing from the deprecated APIs to the new APIs requires just a few steps.

SampleAppRendererControl. This interface is implemented by the sample Renderer class. The SampleAppRenderer class calls its renderFrame method to render the scene. The method takes two parameters: the state, used to get the trackable results, and the projection matrix for the current view.

SampleAppRenderer. The class stores a reference to the rendering primitives to avoid updating it every cycle. The value is updated only when a configuration change occurs, such as a rotation or the app going to the background:

private RenderingPrimitives mRenderingPrimitives = null;

This is used to call renderFrame (State, projectionMatrix) from the Renderer:

private SampleAppRendererControl mRenderingInterface = null; 

These are the near and far planes. The values are initialized in the constructor but can be changed using setNearFarPlanes(near, far):

private float mNearPlane = -1.0f; 
private float mFarPlane = -1.0f; 
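These near/far values, together with the camera intrinsics from the camera calibration, determine the projection matrix. The sketch below shows one common way to build a GL-style projection from pinhole intrinsics; it is an assumed, simplified version of what Tool.convertPerspectiveProjection2GLMatrix does internally (the exact axis and principal-point conventions Vuforia uses may differ), written in C++ for consistency with the native example above:

```cpp
#include <array>

// Builds a column-major, OpenGL-convention projection matrix from pinhole
// intrinsics (fx, fy, cx, cy in pixels) and the near/far clip planes.
// Assumed conventions: camera looks down -Z, clip-space z in [-1, 1].
inline std::array<float, 16> projectionFromIntrinsics(
    float fx, float fy, float cx, float cy,
    float width, float height, float nearPlane, float farPlane)
{
    std::array<float, 16> p{};
    p[0]  = 2.0f * fx / width;                                 // x focal scale
    p[5]  = 2.0f * fy / height;                                // y focal scale
    p[8]  = 1.0f - 2.0f * cx / width;                          // principal-point x offset
    p[9]  = 2.0f * cy / height - 1.0f;                         // principal-point y offset
    p[10] = -(farPlane + nearPlane) / (farPlane - nearPlane);  // depth mapping
    p[11] = -1.0f;                                             // perspective divide by -z
    p[14] = -2.0f * farPlane * nearPlane / (farPlane - nearPlane);
    return p;
}
```

A point on the near plane (z = -near) then maps to NDC depth -1 and one on the far plane to +1, which is why the choice of near/far values affects depth-buffer precision.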

Texture used for the video background rendering:

private GLTextureUnit videoBackgroundTex = null; 

Sets the near and far planes and keeps a reference to the activity and the interface to call renderFrame when required:

public SampleAppRenderer(SampleAppRendererControl renderingInterface,  
                         Activity activity, float nearPlane, float farPlane) 
    Device device = Device.getInstance(); 

On surface created, initialize the shaders for the video background rendering:

public void onSurfaceCreated() 

Called when a screen size change occurs to update the configuration, orientation and dimensions. The rendering primitives are updated to get the correct values for the new configuration: 

public void onConfigurationChanged(boolean isARActive) 

    if (!mIsRenderingInit) 
        mIsRenderingInit = true; 

Gets the state to pass it to the renderFrame method:

// Get our current state
State state = TrackerManager.getInstance().getStateUpdater().updateState(); 

Gets the viewport values and the projection matrix, then calls renderFrame, passing the current state and the resulting projection matrix. Renders the video background if the current view is not the postprocess one:

Vec4I viewport = mRenderingPrimitives.getViewport(VIEW.VIEW_SINGULAR); 
float projectionMatrix[] = Tool.convertPerspectiveProjection2GLMatrix( 
            mRenderingPrimitives.getProjectionMatrix(VIEW.VIEW_SINGULAR, state.getCameraCalibration()), 
            mNearPlane, mFarPlane).getData(); 
mRenderingInterface.renderFrame(state, projectionMatrix); 

Binds the video background texture provided from the SDK to render it:

int vbVideoTextureUnit = 0; 


if (!mRenderer.updateVideoBackgroundTexture(videoBackgroundTex)) 

Gets the projection matrix specifically to render the video background:

float[] vbProjectionMatrix = Tool.convert2GLMatrix( 
    mRenderingPrimitives.getVideoBackgroundProjectionMatrix(VIEW.VIEW_SINGULAR)).getData(); 

Gets the mesh that uses the previously bound texture to render the video background:

Mesh vbMesh = mRenderingPrimitives.getVideoBackgroundMesh(VIEW.VIEW_SINGULAR); 

After setting the shader, renders the video background, applying the projection matrix returned for that specific view to render it correctly with the right aspect ratio:

// Then, we issue the render call

                      vbMesh.getNumTriangles() * 3,  
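The element count passed to the draw call is getNumTriangles() * 3, i.e. three indices per triangle. As an illustration, a video-background mesh could be as simple as the following screen-filling quad (QuadMesh is a stand-in for the actual Vuforia Mesh class, whose real geometry comes from the SDK):

```cpp
#include <cstdint>
#include <vector>

// Illustrative stand-in for a video-background mesh: a quad spanning
// [-1, 1] in X/Y, split into two triangles. The index buffer drives the
// draw call, so its element count is numTriangles() * 3.
struct QuadMesh {
    std::vector<float> positions{ -1,-1,0,  1,-1,0,  1,1,0,  -1,1,0 }; // x,y,z per vertex
    std::vector<float> texCoords{ 0,0,  1,0,  1,1,  0,1 };             // u,v per vertex
    std::vector<std::uint16_t> indices{ 0,1,2,  2,3,0 };               // two triangles

    int numTriangles() const { return static_cast<int>(indices.size()) / 3; }
};
```

An indexed draw then consumes numTriangles() * 3 = 6 elements, e.g. glDrawElements(GL_TRIANGLES, mesh.numTriangles() * 3, GL_UNSIGNED_SHORT, mesh.indices.data()) in OpenGL.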


Migrate Native Rendering to Use Rendering Primitives

The following steps document the code changes necessary to migrate a project to the RenderingPrimitives API via the SampleAppRenderer class.

  1. Copy SampleAppRenderer and SampleAppRendererControl to the SampleApplication folder.
  2. Add the following imports to the Renderer class:
import com.vuforia.Device; 
import com.vuforia.samples.SampleApplication.SampleAppRenderer; 
import com.vuforia.samples.SampleApplication.SampleAppRendererControl;
  3. Implement SampleAppRendererControl in the Renderer class:
public class MultiTargetRenderer implements GLSurfaceView.Renderer, SampleAppRendererControl 
  4. Add a SampleAppRenderer instance:
SampleAppRenderer mSampleAppRenderer;
  5. At the end of onSurfaceCreated, initialize the instance:
// SampleAppRenderer used to encapsulate the use of RenderingPrimitives setting 
// the device mode AR/VR and stereo mode 
mSampleAppRenderer = new SampleAppRenderer(this, vuforiaAppSession.getVideoMode(), .01f, 100f); 
  6. At the end of onSurfaceChanged, call onConfigurationChanged from the SampleAppRenderer:
// RenderingPrimitives to be updated when some rendering change is done 
  7. Replace the renderFrame call with mSampleAppRenderer.render() in the onDrawFrame method:
// Call our function to render content: 

// Call our function to render content from SampleAppRenderer class 
  8. Change the renderFrame method so that it takes a State and a projectionMatrix:
private void renderFrame() 

// The render function called from SampleAppRenderer using the RenderingPrimitives views. 
// The state is owned by SampleAppRenderer, which controls its lifecycle. 
// The state should not be cached outside this method. 

public void renderFrame(State state, float[] projectionMatrix) 
  9. Change Renderer.getInstance().drawVideoBackground() to mSampleAppRenderer.renderVideoBackground() and remove State state = Renderer.getInstance().begin():
// Clear color and depth buffer 

// Get the state from Vuforia and mark the beginning of a rendering section 
State state = Renderer.getInstance().begin(); 

// Explicitly render the Video Background 

// Renders the video background, replacing Renderer.drawVideoBackground() 
  10. Remove the viewport usage:
// Set the viewport 
int[] viewport = vuforiaAppSession.getViewport(); 
GLES20.glViewport(viewport[0], viewport[1], viewport[2], viewport[3]);
  11. Use the provided projection matrix instead of getting it from vuforiaAppSession:
                  0, vuforiaAppSession.getProjectionMatrix().getData(),  
                  0, modelViewMatrix, 0); 

Matrix.multiplyMM(modelViewProjection, 0, projectionMatrix, 0, modelViewMatrix, 0); 
  12. (The remaining steps apply to SampleApplicationSession.) Remove mScreenWidth, mScreenHeight, mProjectionMatrix, mViewport and mIsPortrait and all their usages.
  13. Remove updateActivityOrientation(), setProjectionMatrix(), getProjectionMatrix(), getViewport(), configureVideoBackground() and storeScreenDimensions().