Using the Android Stereo Rendering Sample

The Java Stereo Rendering sample for Android Digital Eyewear demonstrates how to implement stereo display rendering for digital eyewear applications. It is based on the Image Targets sample, extending it with rendering logic and a runtime configuration appropriate to optical and video see-through devices. These changes are identified and explained below.

Overview

Developing Vuforia apps for digital eyewear devices is very similar to developing for a phone or tablet. The key differences apply to the rendering of the scene.

1. Render in stereo. Stereo rendering uses a separate camera projection matrix for each eye and two render passes, one per eye, in the renderFrame() method.

2. Check whether the app is running on an optical see-through device by calling Eyewear.isSeeThru() and, if so, disable video background rendering.

3. Use background texture access to render the video background on video see-through (occluded) devices. This technique renders the video background via textures in the scene, one per eye; a sketch of one possible implementation appears after the renderFrame() listing below.

Running the samples

Before you build and run on the Android device

1. Get a License Key and add it to your app: How To add a License Key to your Vuforia App
2. Ensure that you have Android Studio installed.
3. Ensure that the latest Android SDK is installed, taking note of its location.
4. Connect the device to your PC/Mac and verify that running adb devices lists it as connected.
5. Follow the steps in Setting up the Android Development Environment to configure Android Studio for Vuforia.

Building and executing the sample

1. Unpack the Vuforia SDK to a sub-directory within your Android Studio workspace.
2. Unpack the DigitalEyewearImageTargets sample into the samples directory of the SDK.
3. Create a directory libs under the sample root and copy build\java\vuforia\Vuforia.jar from the Vuforia SDK for Digital Eyewear into that directory.
4. Import the DigitalEyewearImageTargets project into Android Studio.
5. You should now be able to build and run the sample as usual. Refer to the Android Studio documentation for more details.

Using the Eyewear API

To understand the capabilities of a supported eyewear device, first get the singleton instance of the API and then check whether a supported device is detected. You can then determine whether the device has a stereo or mono display.

Eyewear eyewear = Eyewear.getInstance(); // obtain the eyewear singleton
boolean isEyewear = eyewear.isSupportedDeviceDetected(); // check for device support
 
if(isEyewear)
{
    // confirm that the device is stereo capable and that stereo rendering is enabled on the device
    boolean isStereoCapable = eyewear.isStereoCapable();
 
    if(isStereoCapable)
    {
        // Force the glasses into stereo mode
        if(!eyewear.isStereoEnabled())
        {
            eyewear.setStereo(true);
        }
    }
}
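
Note that setStereo() returns a boolean indicating whether the mode change succeeded. The sample's checkEyewearStereo() method, shown at the end of this article, checks this return value and re-reads the orthographic projection matrix after a successful switch, because that matrix changes with the display mode.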

When the Epson BT-200 or ODG R-7 is in 3D mode, it presents the left half of the screen to the left eye and the right half of the screen to the right eye. This creates a stereo effect for the user. When drawing augmented content in this context, you need to create two viewports and render the content once for each eye.

The following code snippet from renderFrame() shows how this is achieved:

Eyewear eyewear = Eyewear.getInstance();
boolean isEyewear = eyewear.isSupportedDeviceDetected();
int numEyes = eyewear.isStereoEnabled() ? 2 : 1; // two render passes when stereo is enabled

// Render once for each eye
for (int eyeIdx = 0; eyeIdx < numEyes; eyeIdx++)
{
    Matrix44F projectionMatrix;
 
    if(isEyewear)
    {
        if (numEyes < 2)
        {
            GLES20.glViewport(0, 0, width, height);
            projectionMatrix = eyewear.getProjectionMatrix(Eyewear.EYEID.EYEID_MONOCULAR);
        }
        else
        {
            // Explicitly add a viewport filling half the screen, so that separate
            // images can be output for left and right eyes
            if(eyeIdx == 0) // left eye
            {
                GLES20.glViewport(0, 0, width / 2, height);
                projectionMatrix =
                    eyewear.getProjectionMatrix(Eyewear.EYEID.EYEID_LEFT);
            }
            else // right eye
            {
                GLES20.glViewport(width / 2, 0, width / 2, height);
                projectionMatrix =
                    eyewear.getProjectionMatrix(Eyewear.EYEID.EYEID_RIGHT);
            }
        }
    }
    else
    {
        // This is a standard mobile device, so use the supplied mProjectionMatrix
        // and the default (full-screen) viewport
        projectionMatrix = vuforiaAppSession.getProjectionMatrix();
    }
 
    Matrix.multiplyMM(modelViewProjection, 0, projectionMatrix.getData(), 0,
                      modelViewMatrix, 0);
   
    //
    // PUT YOUR RENDERING CODE HERE
    //
}
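
In this snippet, width and height are the dimensions of the OpenGL surface, which the renderer receives in onSurfaceChanged() (see Configuring stereo viewports below).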


The Vuforia SDK can also use personalized user calibration data to adjust the projection matrix used for each eye.
See:
How To Use Digital Eyewear Calibration Profiles
Vuforia Calibration App
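
When a user calibration profile is active, the matrices returned by getProjectionMatrix() already reflect that user's calibration. The following is a minimal sketch of switching profiles and picking up the adjusted matrices; the setActiveProfile(int) call and the userProfileId value are illustrative assumptions, as the profile-management API varies between SDK releases (see the articles above for the authoritative API).

Eyewear eyewear = Eyewear.getInstance();

// Illustrative: select a previously saved user calibration profile.
// The exact profile-management API differs between SDK releases.
int userProfileId = 1;
if (eyewear.setActiveProfile(userProfileId))
{
    // Re-read the per-eye projection matrices so rendering picks up
    // the personalized calibration
    Matrix44F leftProjection  = eyewear.getProjectionMatrix(Eyewear.EYEID.EYEID_LEFT);
    Matrix44F rightProjection = eyewear.getProjectionMatrix(Eyewear.EYEID.EYEID_RIGHT);
}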

Setting up stereo rendering using the Vuforia Java API

This section addresses the configuration of stereo background rendering and stereo camera projections in the Stereo Rendering sample. You can extend this sample to develop a stereo rendering project of your own, or copy the following code into an existing project.


Video Configuration in SampleApplicationSession

    // Configures the video mode and sets offsets for the camera's image
    private void configureVideoBackground()
    {
        CameraDevice cameraDevice = CameraDevice.getInstance();
        VideoMode vm = cameraDevice.getVideoMode(CameraDevice.MODE.MODE_DEFAULT);
        
        VideoBackgroundConfig config = new VideoBackgroundConfig();

        // see-through devices do not draw the video background
        if (Eyewear.getInstance().isDeviceDetected() &&
               Eyewear.getInstance().isSeeThru())
        {
            config.setEnabled(false);
        } else
        {
            config.setEnabled(true);
        }

        config.setPosition(new Vec2I(0, 0));
        
        int xSize = 0, ySize = 0;
        if (mIsPortrait)
        {
            xSize = (int) (vm.getHeight() * (mScreenHeight / (float) vm
                .getWidth()));
            ySize = mScreenHeight;
            
            if (xSize < mScreenWidth)
            {
                xSize = mScreenWidth;
                ySize = (int) (mScreenWidth * (vm.getWidth() / (float) vm
                    .getHeight()));
            }
        } else
        {
            xSize = mScreenWidth;
            ySize = (int) (vm.getHeight() * (mScreenWidth / (float) vm
                .getWidth()));
            
            if (ySize < mScreenHeight)
            {
                xSize = (int) (mScreenHeight * (vm.getWidth() / (float) vm
                    .getHeight()));
                ySize = mScreenHeight;
            }
        }
        
        config.setSize(new Vec2I(xSize, ySize));
        
        Log.i(LOGTAG, "Configure Video Background : Video (" + vm.getWidth()
            + " , " + vm.getHeight() + "), Screen (" + mScreenWidth + " , "
            + mScreenHeight + "), mSize (" + xSize + " , " + ySize + ")");
        
        Renderer.getInstance().setVideoBackgroundConfig(config);
        
    }
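
In the sample, configureVideoBackground() is called when the camera is started and again when the device configuration (for example, the orientation) changes, so that the background always matches the current screen dimensions.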

Setting a recommended frame rate for each digital eyewear device and render context

private boolean configureRenderingFrameRate()
    {             
        // In this example we select the default preset hint for the best mobile AR experience.
        // See the website documentation for more information on the rendering hint modes
        // relevant to your AR experience.
        int myRenderingOptions = Renderer.FPSHINT_FLAGS.FPSHINT_DEFAULT_FLAGS;	
     
        // Optical see-through devices don't render video background
        if (Eyewear.getInstance().isDeviceDetected() &&
            Eyewear.getInstance().isSeeThru())
        {
            myRenderingOptions = Renderer.FPSHINT_FLAGS.FPSHINT_NO_VIDEOBACKGROUND;
        }
        
        // Retrieve the recommended rendering frame rate based on the currently configured/enabled
        // Vuforia features and the selected application hint
        int vuforiaRecommendedFPS =  Renderer.getInstance().getRecommendedFps(myRenderingOptions);
     
        // Use the recommended fps value computed by the sdk

        if (!Renderer.getInstance().setTargetFps(vuforiaRecommendedFPS))
        {
            Log.e(LOGTAG,"Failed to set rendering frame rate to: " + vuforiaRecommendedFPS + " fps");   
            return false;
        }
        else
        {
            Log.i(LOGTAG,"Configured frame rate set to recommended frame rate: " + vuforiaRecommendedFPS + " fps");        
        }   
        return true;
    }
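
The recommendation is computed from the Vuforia features that are currently configured and enabled, so this method should be called after the camera has been initialized and started; calling it earlier can yield a value that does not reflect the final configuration.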

Initializing the Stereo Rendering Activity in StereoRendering.java

// Called when the activity first starts or the user navigates back to an
    // activity.
    @Override
    protected void onCreate(Bundle savedInstanceState)
    {
        Log.d(LOGTAG, "onCreate");
        super.onCreate(savedInstanceState);
        
        vuforiaAppSession = new SampleApplicationSession(this);
        
        startLoadingAnimation();
        mDatasetStrings.add("StonesAndChips.xml");
        
        vuforiaAppSession
            .initAR(this, ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        
        mGestureDetector = new GestureDetector(this, new GestureListener());
        
        // Load any sample specific textures:
        mTextures = new Vector<Texture>();
        loadTextures();
    }
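
Note that the session is initialized with SCREEN_ORIENTATION_LANDSCAPE: the sample locks the activity to landscape because the displays of the supported eyewear devices are landscape-oriented.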


Rendering configuration

// Initializes AR application components.
    private void initApplicationAR()
    {
        // Create OpenGL ES view:
        int depthSize = 16;
        int stencilSize = 0;
        boolean translucent = Vuforia.requiresAlpha();

        mGlView = new SampleApplicationGLView(this);
        mGlView.init(translucent, depthSize, stencilSize);
        
        mRenderer = new StereoRenderingRenderer(this, vuforiaAppSession);
        mRenderer.setTextures(mTextures);
        mGlView.setRenderer(mRenderer);
        
    }

Configuring stereo viewports

// Called when the surface changed size.
    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height)
    {
        Log.d(LOGTAG, "GLRenderer.onSurfaceChanged width=" + width + " height=" + height);

        DisplayMetrics metrics = new DisplayMetrics();
        Eyewear eyewear = Eyewear.getInstance();

        mActivity.getWindowManager().getDefaultDisplay().getMetrics(metrics);
        Vec2I backgroundSize = Renderer.getInstance().getVideoBackgroundConfig().getSize();
        Vec2I backgroundPos = Renderer.getInstance().getVideoBackgroundConfig().getPosition();

        // If this device is not supported, or it is occluded (that is, we show a video background), then we adopt the
        // standard Vuforia viewport calculation by which we adjust the viewport to match the video aspect ratio.
        if (!eyewear.isDeviceDetected() || !eyewear.isSeeThru())
        {
            viewportPosX = ((metrics.widthPixels - backgroundSize.getData()[0]) / 2) + backgroundPos.getData()[0];
            viewportPosY = ((metrics.heightPixels - backgroundSize.getData()[1]) / 2) + backgroundPos.getData()[1];
            viewportSizeX = backgroundSize.getData()[0];
            viewportSizeY = backgroundSize.getData()[1];
        }
        // This is a supported see-through device, so the viewport needs to match the OpenGL surface size. The device
        // calibration relies on this assumption.
        else
        {
            viewportPosX = 0;
            viewportPosY = 0;
            viewportSizeX= width;
            viewportSizeY = height;
        }

        // Call Vuforia function to handle render surface size changes:
        vuforiaAppSession.onSurfaceChanged(width, height);
    }


Binding textures and shaders and obtaining an orthographic matrix for digital eyewear

// Function for initializing the renderer.
    private void initRendering()
    {
        mTeapot = new Teapot();
        mRenderer = Renderer.getInstance();
        
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, Vuforia.requiresAlpha() ? 0.0f : 1.0f);
        
        for (Texture t : mTextures)
        {
            GLES20.glGenTextures(1, t.mTextureID, 0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, t.mTextureID[0]);
            GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, t.mWidth, t.mHeight, 0, GLES20.GL_RGBA,
                GLES20.GL_UNSIGNED_BYTE, t.mData);
        }
        
        shaderProgramID = SampleUtils.createProgramFromShaderSrc(CubeShaders.CUBE_MESH_VERTEX_SHADER,
            CubeShaders.CUBE_MESH_FRAGMENT_SHADER);
        
        vertexHandle = GLES20.glGetAttribLocation(shaderProgramID, "vertexPosition");
        normalHandle = GLES20.glGetAttribLocation(shaderProgramID, "vertexNormal");
        textureCoordHandle = GLES20.glGetAttribLocation(shaderProgramID, "vertexTexCoord");
        mvpMatrixHandle = GLES20.glGetUniformLocation(shaderProgramID,"modelViewProjectionMatrix");
        texSampler2DHandle = GLES20.glGetUniformLocation(shaderProgramID, "texSampler2D");

        vbShaderProgramID = SampleUtils.createProgramFromShaderSrc(BackgroundShader.VB_VERTEX_SHADER,
                BackgroundShader.VB_FRAGMENT_SHADER);

        if (vbShaderProgramID > 0)
        {
            // Activate shader:
            GLES20.glUseProgram(vbShaderProgramID);

            // Retrieve handle for the vertex position shader attribute variable:
            vbVertexPositionHandle = GLES20.glGetAttribLocation(vbShaderProgramID, "vertexPosition");

            // Retrieve handle for the texture coordinate shader attribute variable:
            vbVertexTexCoordHandle = GLES20.glGetAttribLocation(vbShaderProgramID, "vertexTexCoord");

            // Retrieve handle for the texture sampler shader uniform variable:
            vbTexSampler2DHandle = GLES20.glGetUniformLocation(vbShaderProgramID, "texSampler2D");

            // Retrieve handle for the projection matrix shader uniform variable:
            vbProjectionMatrixHandle = GLES20.glGetUniformLocation(vbShaderProgramID, "projectionMatrix");

            // Set the orthographic matrix
            vbOrthoProjMatrix = Eyewear.getInstance().getOrthographicProjectionMatrix();

            // Stop using the program
            GLES20.glUseProgram(0);
        }
        
        // Hide the Loading Dialog
        mActivity.loadingDialogHandler.sendEmptyMessage(LoadingDialogHandler.HIDE_LOADING_DIALOG);
        
    }
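
The vb-prefixed handles and the orthographic projection matrix initialized here are used to draw the camera image as the video background; a sketch of that drawing code appears after the renderFrame() listing below.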

Render Frame

private void renderFrame()
    {
        Eyewear eyewear = Eyewear.getInstance();
        checkEyewearStereo(eyewear);
        int numEyes = 1;
        if (eyewear.isStereoEnabled())
        {
            numEyes = 2;
        }

        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        
        State state = mRenderer.begin();

        GLES20.glEnable(GLES20.GL_DEPTH_TEST);
        
        // Handle face culling; we need to detect whether reflection is in use
        // to determine the culling direction
        GLES20.glEnable(GLES20.GL_CULL_FACE);
        GLES20.glCullFace(GLES20.GL_BACK);

        if (Renderer.getInstance().getVideoBackgroundConfig().getReflection() == VIDEO_BACKGROUND_REFLECTION.VIDEO_BACKGROUND_REFLECTION_ON)
        {
            GLES20.glFrontFace(GLES20.GL_CCW); // Back camera
        }
            
        // Render once for each eye
        for (int eyeIdx = 0; eyeIdx < numEyes; eyeIdx++)
        {
            Matrix44F projectionMatrix;

            int eyeViewportPosX = viewportPosX;
            int eyeViewportPosY = viewportPosY;
            int eyeViewportSizeX = viewportSizeX;
            int eyeViewportSizeY = viewportSizeY;

            if (eyewear.isDeviceDetected())
            {
                if (numEyes < 2)
                {
                    projectionMatrix = Eyewear.getInstance().getProjectionMatrix(EYEID.EYEID_MONOCULAR);
                }
                else
                {
                    // Setup the viewport filling half the screen
                    // Position viewport for left or right eye
                    if (eyeIdx == 0) // left eye
                    {
                        eyeViewportSizeX /= 2;
                        projectionMatrix = Eyewear.getInstance().getProjectionMatrix(EYEID.EYEID_LEFT);
                    }
                    else // right eye
                    {
                        eyeViewportPosX = eyeViewportSizeX / 2;
                        eyeViewportSizeX /= 2;
                        projectionMatrix = Eyewear.getInstance().getProjectionMatrix(EYEID.EYEID_RIGHT);
                    }
                }
            }
            else
            {
                // This is a standard mobile device, so use the supplied mProjectionMatrix
                projectionMatrix = vuforiaAppSession.getProjectionMatrix();
            }

            // Set the viewport
            GLES20.glViewport(eyeViewportPosX, eyeViewportPosY, eyeViewportSizeX, eyeViewportSizeY);

            // Don't draw video background on see-thru eyewear
            if (!eyewear.isSeeThru())
            {
                renderVideoBackground(0);
            }
            
            // did we find any trackables this frame?
            for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++)
            {
                TrackableResult result = state.getTrackableResult(tIdx);
                Trackable trackable = result.getTrackable();
                printUserData(trackable);
                Matrix44F modelViewMatrix_Vuforia = Tool.convertPose2GLMatrix(result.getPose());
                float[] modelViewMatrix = modelViewMatrix_Vuforia.getData();
                
                int textureIndex = trackable.getName().equalsIgnoreCase("stones") ? 0 : 1;
                
                // deal with the modelview and projection matrices
                float[] modelViewProjection = new float[16];
                
                Matrix.translateM(modelViewMatrix, 0, 0.0f, 0.0f, OBJECT_SCALE_FLOAT);
                Matrix.scaleM(modelViewMatrix, 0, OBJECT_SCALE_FLOAT, OBJECT_SCALE_FLOAT, OBJECT_SCALE_FLOAT);

                Matrix.multiplyMM(modelViewProjection, 0, projectionMatrix.getData(), 0, modelViewMatrix, 0);
                
                // activate the shader program and bind the vertex/normal/tex coords
                GLES20.glUseProgram(shaderProgramID);
                
                GLES20.glVertexAttribPointer(vertexHandle, 3, GLES20.GL_FLOAT, false, 0, mTeapot.getVertices());
                GLES20.glVertexAttribPointer(normalHandle, 3, GLES20.GL_FLOAT, false, 0, mTeapot.getNormals());
                GLES20.glVertexAttribPointer(textureCoordHandle, 2, GLES20.GL_FLOAT, false, 0, mTeapot.getTexCoords());
                
                GLES20.glEnableVertexAttribArray(vertexHandle);
                GLES20.glEnableVertexAttribArray(normalHandle);
                GLES20.glEnableVertexAttribArray(textureCoordHandle);
                
                // activate texture 0, bind it, and pass to shader
                GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextures.get(textureIndex).mTextureID[0]);
                GLES20.glUniform1i(texSampler2DHandle, 0);
                
                // pass the model view matrix to the shader
                GLES20.glUniformMatrix4fv(mvpMatrixHandle, 1, false, modelViewProjection, 0);
                
                // finally draw the teapot
                GLES20.glDrawElements(GLES20.GL_TRIANGLES, mTeapot.getNumObjectIndex(), GLES20.GL_UNSIGNED_SHORT,
                    mTeapot.getIndices());
                
                // disable the enabled arrays
                GLES20.glDisableVertexAttribArray(vertexHandle);
                GLES20.glDisableVertexAttribArray(normalHandle);
                GLES20.glDisableVertexAttribArray(textureCoordHandle);
            }

            SampleUtils.checkGLError("Render Frame");
        }
        
        GLES20.glDisable(GLES20.GL_DEPTH_TEST);
        
        mRenderer.end();
    }
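
The renderVideoBackground() helper called above is not shown in this article. The following is a minimal sketch of one possible implementation, assuming the Vuforia 4.x/5.x background-texture-access API: Renderer.setVideoBackgroundTextureID() registered once during initialization against a texture such as the hypothetical mVideoBackgroundTexID field below, and Renderer.updateVideoBackgroundTexture() called per frame. The quad geometry and buffer helpers are illustrative assumptions, not the sample's exact implementation.

    // Illustrative texture ID: generated with glGenTextures() during initialization
    // and registered once via Renderer.getInstance().setVideoBackgroundTextureID().
    // Requires java.nio.{ByteBuffer, ByteOrder, FloatBuffer, ShortBuffer}.
    private int mVideoBackgroundTexID;

    private void renderVideoBackground(int eyeId)
    {
        // eyeId is unused here; the same full-viewport quad is drawn into
        // whichever eye viewport is currently set

        // Ask Vuforia to copy the latest camera frame into the registered texture
        if (!Renderer.getInstance().updateVideoBackgroundTexture())
        {
            return;
        }

        // Full-viewport quad; assumes the orthographic matrix maps [-1, 1]
        float[] quadVertices  = { -1, -1,   1, -1,   1, 1,   -1, 1 }; // x, y
        float[] quadTexCoords = {  0,  1,   1,  1,   1, 0,    0, 0 }; // u, v
        short[] quadIndices   = { 0, 1, 2, 2, 3, 0 };

        FloatBuffer vertices  = fillFloatBuffer(quadVertices);
        FloatBuffer texCoords = fillFloatBuffer(quadTexCoords);
        ShortBuffer indices   = fillShortBuffer(quadIndices);

        // The background is drawn without depth testing, behind the augmentation
        GLES20.glDisable(GLES20.GL_DEPTH_TEST);
        GLES20.glUseProgram(vbShaderProgramID);

        GLES20.glVertexAttribPointer(vbVertexPositionHandle, 2, GLES20.GL_FLOAT, false, 0, vertices);
        GLES20.glVertexAttribPointer(vbVertexTexCoordHandle, 2, GLES20.GL_FLOAT, false, 0, texCoords);
        GLES20.glEnableVertexAttribArray(vbVertexPositionHandle);
        GLES20.glEnableVertexAttribArray(vbVertexTexCoordHandle);

        // Bind the camera texture and the per-device orthographic projection
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mVideoBackgroundTexID);
        GLES20.glUniform1i(vbTexSampler2DHandle, 0);
        GLES20.glUniformMatrix4fv(vbProjectionMatrixHandle, 1, false, vbOrthoProjMatrix.getData(), 0);

        GLES20.glDrawElements(GLES20.GL_TRIANGLES, quadIndices.length, GLES20.GL_UNSIGNED_SHORT, indices);

        GLES20.glDisableVertexAttribArray(vbVertexPositionHandle);
        GLES20.glDisableVertexAttribArray(vbVertexTexCoordHandle);
        GLES20.glUseProgram(0);
        GLES20.glEnable(GLES20.GL_DEPTH_TEST);
    }

    // Wraps a float array in a direct, native-order buffer as required by GLES20
    private static FloatBuffer fillFloatBuffer(float[] array)
    {
        ByteBuffer bb = ByteBuffer.allocateDirect(4 * array.length);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer fb = bb.asFloatBuffer();
        fb.put(array).rewind();
        return fb;
    }

    // Same wrapping for the quad's short index array
    private static ShortBuffer fillShortBuffer(short[] array)
    {
        ByteBuffer bb = ByteBuffer.allocateDirect(2 * array.length);
        bb.order(ByteOrder.nativeOrder());
        ShortBuffer sb = bb.asShortBuffer();
        sb.put(array).rewind();
        return sb;
    }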

Checking for digital eyewear

private void checkEyewearStereo(Eyewear eyewear)
    {
        if (eyewear.isDeviceDetected() && eyewear.isStereoCapable())
        {
            mIsEyewear = true;

            // Change the glasses into stereo mode
            if (!eyewear.isStereoEnabled())
            {
                if (eyewear.setStereo(true))
                {
                    // Re-acquire the orthographic projection matrix, which will
                    // have changed now that we are in stereo
                    vbOrthoProjMatrix = Eyewear.getInstance().getOrthographicProjectionMatrix();
                }
                else
                {
                    Log.e(LOGTAG, "Error setting device to stereo mode");
                }
            }
        }
        else
        {
            if (mIsEyewear)
            {
                mIsEyewear = false;
                // Re-acquire the orthographic projection matrix which may have changed
                vbOrthoProjMatrix = Eyewear.getInstance().getOrthographicProjectionMatrix();
            }
        }
    }