Working with the Camera in Unity

Adjust the camera to deliver in-focus camera frames to Vuforia Engine. Use the Image class returned by the CameraDevice to access the pixel data of the camera frames or to use them as a texture.

Set Focus and Camera Modes

The camera can be adjusted between different performance and focus modes. Please see Device Performance Optimization for the available modes.

The FocusMode is set via the CameraDevice class. We recommend using the continuous autofocus mode for most scenarios. The CameraMode can be adjusted to favor performance or quality. In normal use cases, continuous autofocus can be set with:

bool focusModeSet = VuforiaBehaviour.Instance.CameraDevice.SetFocusMode(FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
if (!focusModeSet)
{
    Debug.Log("Failed to set focus mode.");
}
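
Similarly, SetCameraMode() selects between the available camera modes. The snippet below is a minimal sketch that requests the quality-optimized mode; MODE_OPTIMIZE_QUALITY is assumed here as an example value of the CameraMode enumeration used further below, see the Vuforia Namespace Reference for the full list:

// Request the quality-optimized camera mode (trades performance for image quality)
bool cameraModeSet = VuforiaBehaviour.Instance.CameraDevice.SetCameraMode(CameraMode.MODE_OPTIMIZE_QUALITY);
if (!cameraModeSet)
{
    Debug.Log("Failed to set camera mode.");
}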

You can configure the autofocus with a script that inherits from MonoBehaviour. The script below registers callbacks with the VuforiaApplication that set a focus mode once Vuforia Engine has started and again whenever the application is resumed.

void Start()
{
    VuforiaApplication.Instance.OnVuforiaStarted += OnVuforiaStarted;
    VuforiaApplication.Instance.OnVuforiaPaused += OnPaused;
}

private void OnVuforiaStarted()
{
    VuforiaBehaviour.Instance.CameraDevice.SetFocusMode(
        FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
    VuforiaBehaviour.Instance.CameraDevice.SetCameraMode(Vuforia.CameraMode.MODE_DEFAULT);
}

private void OnPaused(bool paused)
{
    if (!paused) // Resumed
    {
        // Set again autofocus mode when app is resumed
        VuforiaBehaviour.Instance.CameraDevice.SetFocusMode(
            FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
    }
}

Access the Camera Image

Use the Image class to access camera images in the desired image format. The images can be used as an OpenGL texture or applied as a texture to a Unity material. Register the desired image format using the Vuforia.PixelFormat enumeration:

VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(PixelFormat.RGB888, true);

NOTE: The Vuforia Namespace Reference page lists the available pixel formats.

Call this method after Vuforia Engine has been initialized and started; to do so, it is recommended to register an OnVuforiaStarted callback in the Start() method of your MonoBehaviour script:

private void OnVuforiaStarted()
{
    // Vuforia has started, now register the camera image format
    PixelFormat pixelFormat = PixelFormat.RGB888;
    bool success = VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(pixelFormat, true);

    if (success)
    {
        Debug.Log("Successfully registered pixel format " + pixelFormat.ToString());
    }
    else
    {
        Debug.LogError(
            "Failed to register pixel format " + pixelFormat.ToString() +
            "\n the format may be unsupported by your device;" +
            "\n consider using a different pixel format.");
    }
}

We can extend this script to retrieve the camera images after the format has been set. Camera images should be retrieved during the OnStateUpdated callback; that way you can ensure that you retrieve the latest camera image matching the current frame.

Also, make sure that the camera image is not null, since it can take a few frames for the image to become available after registering for an image format.

Unregister the camera image format whenever the application is paused and register it again when the application is resumed.

Attach the script below to an empty GameObject in your AR scene to convert the images to a texture that can be applied to a material:

using UnityEngine;
using Vuforia;

public class CameraImageAccess : MonoBehaviour
{
    #region PRIVATE_MEMBERS

#if UNITY_EDITOR
    PixelFormat mPixelFormat = PixelFormat.RGBA8888; // Editor passes in a RGBA8888 texture instead of RGB888
#else
    PixelFormat mPixelFormat = PixelFormat.RGB888; // Use RGB888 for mobile
#endif
    private bool mAccessCameraImage = true;
    private bool mFormatRegistered = false;
    private Texture2D texture;
    #endregion // PRIVATE_MEMBERS

    #region MONOBEHAVIOUR_METHODS
    void Start()
    {
        // Register Vuforia life-cycle callbacks:
        VuforiaApplication.Instance.OnVuforiaStarted += RegisterFormat;
        VuforiaApplication.Instance.OnVuforiaPaused += OnPause;
        VuforiaBehaviour.Instance.World.OnStateUpdated += OnVuforiaUpdated;
    }
    
    void OnDestroy()
    {
        // Unregister Vuforia life-cycle callbacks:
        VuforiaApplication.Instance.OnVuforiaStarted -= RegisterFormat;
        VuforiaApplication.Instance.OnVuforiaPaused -= OnPause;
        VuforiaBehaviour.Instance.World.OnStateUpdated -= OnVuforiaUpdated;
    }
    #endregion // MONOBEHAVIOUR_METHODS

    #region PRIVATE_METHODS
    /// <summary>
    /// Called each time the Vuforia state is updated
    /// </summary>
    void OnVuforiaUpdated()
    {
        if (mFormatRegistered && mAccessCameraImage)
        {
            Image image = VuforiaBehaviour.Instance.CameraDevice.GetCameraImage(mPixelFormat);

            // The image can be null for a few frames after the format is registered
            if (image == null)
                return;

            // Lazily create the texture that receives the camera pixels
            // (the texture format should match the registered pixel format)
            if (texture == null)
                texture = new Texture2D(image.Width, image.Height,
                    mPixelFormat == PixelFormat.RGBA8888 ? TextureFormat.RGBA32 : TextureFormat.RGB24, false);

            image.CopyBufferToTexture(texture);

            Debug.Log(
                "\nImage Format: " + image.PixelFormat +
                "\nImage Size: " + image.Width + " x " + image.Height +
                "\nBuffer Size: " + image.BufferWidth + " x " + image.BufferHeight +
                "\nImage Stride: " + image.Stride + "\n"
            );
            // The texture can then be applied to a material, for example
        }
    }
    /// <summary>
    /// Called when app is paused / resumed
    /// </summary>
    void OnPause(bool paused)
    {
        if (paused)
        {
            Debug.Log("App was paused");
            UnregisterFormat();
        }
        else
        {
            Debug.Log("App was resumed");
            RegisterFormat();
        }
    }
    /// <summary>
    /// Register the camera pixel format
    /// </summary>
    void RegisterFormat()
    {
        // Vuforia has started, now register camera image format
        bool success = VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(mPixelFormat, true);
        if (success)
        {
            Debug.Log("Successfully registered pixel format " + mPixelFormat.ToString());
            mFormatRegistered = true;
        }
        else
        {
            Debug.LogError(
                "Failed to register pixel format " + mPixelFormat.ToString() +
                "\n the format may be unsupported by your device;" +
                "\n consider using a different pixel format.");
            mFormatRegistered = false;
        }
    }
    /// <summary>
    /// Unregister the camera pixel format (e.g. call this when app is paused)
    /// </summary>
    void UnregisterFormat()
    {
        Debug.Log("Unregistering camera pixel format " + mPixelFormat.ToString());
        VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(mPixelFormat, false);
        mFormatRegistered = false;
    }
    #endregion //PRIVATE_METHODS
}

Accessing the Raw Pixels

For custom processes involving computer vision (CV), you can access the raw pixels from the camera through the Image class. The following example method copies the raw pixel buffer into a byte array:

private byte[] GetPixels(Image image)
{
    var pixelBufferPtr = image.PixelBufferPtr;
    int imageSize = image.Stride * image.Height;
    byte[] pixels = new byte[imageSize];
    System.Runtime.InteropServices.Marshal.Copy(pixelBufferPtr, pixels, 0, imageSize);
    return pixels;
}
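
As a rough illustration of how the returned byte array can feed a simple CV-style computation, the hypothetical helper below computes the average brightness of a frame, assuming the frame was registered with the RGB888 format (three bytes per pixel):

private float GetAverageBrightness(Image image)
{
    byte[] pixels = GetPixels(image);
    long sum = 0;
    for (int y = 0; y < image.Height; y++)
    {
        // Stride is the number of bytes per row and may include padding beyond Width * 3
        int rowStart = y * image.Stride;
        for (int x = 0; x < image.Width; x++)
        {
            int i = rowStart + x * 3;
            sum += (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3; // average of R, G and B
        }
    }
    return (float)sum / (image.Width * image.Height);
}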

Use a Unity texture

The Image class provides the camera pixels as a byte array. That approach is useful for some image processing tasks, but sometimes it is preferable to obtain the image as a Unity Texture2D, for example if you wish to use the texture in a Material applied to a GameObject and/or to process the texture in a shader. You can obtain the image as a Unity texture using the approach demonstrated in the Occlusion Management sample.

The sample script BoxSetUpShader.cs does the following:

  1. Sets up a variable for the occlusion shader.
  2. Registers a Texture2D object to be filled with the camera pixels each frame, instead of letting Vuforia Engine render the camera image natively, using the VideoBackground.VideoBackgroundTexture API.
  3. Applies the texture to a material based on the viewport parameters (see the sketch after this list).
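
As a very rough, hypothetical sketch of steps 2 and 3, the script below forwards the video background texture to a material each frame. The property path VuforiaBehaviour.Instance.VideoBackground.VideoBackgroundTexture is an assumption based on the API named above; refer to BoxSetUpShader.cs in the sample for the actual implementation, including the viewport handling and the occlusion shader setup.

using UnityEngine;
using Vuforia;

public class VideoBackgroundToMaterial : MonoBehaviour
{
    Material mMaterial;

    void Start()
    {
        mMaterial = GetComponent<Renderer>().material;
    }

    void Update()
    {
        // Assumption: the camera texture is exposed through the
        // VideoBackground.VideoBackgroundTexture API referenced in step 2
        var videoBackgroundTexture = VuforiaBehaviour.Instance.VideoBackground.VideoBackgroundTexture;
        if (videoBackgroundTexture != null)
        {
            // Forward the camera texture to the material; the sample additionally
            // passes viewport parameters to its occlusion shader
            mMaterial.mainTexture = videoBackgroundTexture;
        }
    }
}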