Adjust the camera to deliver well-focused camera frames to Vuforia Engine. Use the Image class returned by the CameraDevice to access camera frames from your application.
Set Modes for Camera, Focus, and Exposure
Adjust the camera-related modes to optimize for performance and changing environmental conditions.
As long as Vuforia Engine has access to the camera, other apps on the device cannot access it. Release the camera by stopping Vuforia Engine to allow other applications to access it:
VuforiaBehaviour.Instance.enabled = false;
In Unity, this is done automatically when pausing the app.
Camera mode
The CameraMode can be adjusted to default, speed, or quality from the Camera Device Mode in the Vuforia Configuration, found in the Inspector window of the ARCamera GameObject. Alternatively, you can configure the camera mode with a script that inherits from MonoBehaviour. The script registers a callback with the VuforiaApplication that sets a camera mode once Vuforia Engine has started.
void Start()
{
VuforiaApplication.Instance.OnVuforiaStarted += OnVuforiaStarted;
}
private void OnVuforiaStarted()
{
VuforiaBehaviour.Instance.CameraDevice.SetCameraMode(Vuforia.CameraMode.MODE_DEFAULT);
}
| Enumerator for CameraMode | Behavior |
| --- | --- |
| MODE_DEFAULT | Best compromise between speed and quality. |
| MODE_OPTIMIZE_SPEED | Minimize Vuforia Engine impact on the system. |
| MODE_OPTIMIZE_QUALITY | Optimize for better image and tracking quality. Applies a higher resource impact on the system. |
MODE_DEFAULT is used if nothing else is set.
Change the mode to MODE_OPTIMIZE_SPEED if Vuforia Engine's performance in this mode meets your needs. If you expect the target to be moved continuously, we recommend using the default mode.
Focus mode
The FocusMode is set through the CameraDevice class. We recommend using the continuous autofocus mode for most scenarios. Not all devices support all focus modes.
bool focusModeSet = VuforiaBehaviour.Instance.CameraDevice.SetFocusMode(FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
if (!focusModeSet)
{
Debug.Log("Failed to set focus mode.");
}
| Focus mode enum | Behavior |
| --- | --- |
| FOCUS_MODE_FIXED | Sets the camera into a fixed focus defined by the camera driver. |
| FOCUS_MODE_TRIGGERAUTO | Triggers a single autofocus operation. After the operation is completed, the focus mode automatically changes to FOCUS_MODE_FIXED. |
| FOCUS_MODE_CONTINUOUSAUTO | Turns on driver-level continuous autofocus. This mode is recommended, as it ensures that the camera is always focused on the target. |
| FOCUS_MODE_INFINITY | Sets the focus to infinity, as provided by the camera driver implementation. The camera always focuses on the background elements in the scene. (Only supported on UWP and Android without ARCore.) |
| FOCUS_MODE_MACRO | Sets the camera to macro mode, as provided by the camera driver implementation. Provides a sharp camera image for close-ups of approximately 15 cm; rarely used in AR setups. (Not supported on iOS and Magic Leap.) |
If nothing else is set, the platform’s default focus mode is used.
Focus region
Allow your users to focus on just a region of the camera frame. Set the focus region with a CameraRegionOfInterest representing a region-of-interest screen position and an extent that is a percentage of the camera frame's width and height. Usually, this is implemented so that the app user can tap the screen to focus on a particular screen area.
Setting a CameraRegionOfInterest must be followed by setting the focus mode to FOCUS_MODE_CONTINUOUSAUTO to trigger the re-focus. After the re-focus, the mode sets itself to FOCUS_MODE_FIXED.
NOTE: Focusing on only a region of the camera view can interfere with and degrade detection and tracking of Vuforia targets outside of that region.
First, make sure that the device supports setting the focus region by checking FocusRegionSupported while the Engine is running. Then, set the focus region with a CameraRegionOfInterest data structure:
var regionOfInterest = new CameraRegionOfInterest { ScreenPosition = new Vector2(0.2f, 0.2f), Extent = 0.25f };
if (VuforiaBehaviour.Instance.CameraDevice.FocusRegionSupported)
{
VuforiaBehaviour.Instance.CameraDevice.SetFocusRegion(regionOfInterest);
}
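The tap-to-focus pattern described above can be sketched in a small MonoBehaviour. The class name TapToFocus is hypothetical, and the sketch assumes ScreenPosition is expressed in normalized coordinates (the reset values of (0.5f, 0.5f) for the frame center suggest this):

```csharp
using UnityEngine;
using Vuforia;

public class TapToFocus : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            var cameraDevice = VuforiaBehaviour.Instance.CameraDevice;
            if (!cameraDevice.FocusRegionSupported)
                return;

            // Convert the tap position to normalized screen coordinates (assumption)
            var tap = new Vector2(Input.mousePosition.x / Screen.width,
                                  Input.mousePosition.y / Screen.height);
            var region = new CameraRegionOfInterest { ScreenPosition = tap, Extent = 0.25f };
            cameraDevice.SetFocusRegion(region);

            // Re-trigger focus on the new region; afterwards the mode falls back to FOCUS_MODE_FIXED
            cameraDevice.SetFocusMode(FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
        }
    }
}
```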
Reset the focus region by setting the screen position back to Vector2(0.5f, 0.5f) and the extent to 1.0f.
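As a minimal sketch, the reset described above could look like this:

```csharp
var resetRegion = new CameraRegionOfInterest
{
    ScreenPosition = new Vector2(0.5f, 0.5f), // center of the frame
    Extent = 1.0f                             // full frame
};
VuforiaBehaviour.Instance.CameraDevice.SetFocusRegion(resetRegion);
```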
If Vuforia Engine is paused, the focus region will also be reset and revert to its default values.
NOTE: On UWP, setting an extent (for focus and exposure) below 0.05 (5%) throws an error, as the platform deems the value too small.
Use the get method to retrieve the region's current settings.
GetFocusRegion();
NOTE: On iOS, only the screen position is returned. The extent (for focus and exposure) is returned as 0, as iOS changes this value internally. The extent should still be set in the CameraRegionOfInterest.
Exposure mode
Adjust the exposure of the camera frames by setting the exposure mode to correct for lighting conditions. Note that not all devices support all available exposure modes.
First, verify that the exposure mode you wish to change to is supported on the device:
var exposureMode = ExposureMode.EXPOSURE_MODE_FIXED;
bool supported = VuforiaBehaviour.Instance.CameraDevice.IsExposureModeSupported(exposureMode);
Set the exposure mode while the Vuforia Engine is running with:
VuforiaBehaviour.Instance.CameraDevice.SetExposureMode(ExposureMode.EXPOSURE_MODE_TRIGGERAUTO);
If nothing else is set, the platform’s default exposure mode is used.
| Exposure mode | Behavior |
| --- | --- |
| EXPOSURE_MODE_TRIGGERAUTO | Triggers a single auto-exposure operation. After the operation is completed, the exposure mode automatically changes to EXPOSURE_MODE_FIXED. |
| EXPOSURE_MODE_CONTINUOUSAUTO | Allows the device to control auto-exposure continuously. This mode is recommended, as it ensures that the camera continuously applies exposure corrections to the camera view. |
| EXPOSURE_MODE_FIXED | Sets the exposure at a fixed value defined by the camera driver. |
In most scenarios, EXPOSURE_MODE_CONTINUOUSAUTO is the recommended mode. Apply other modes when the environment has either strong light or too little light to show the target.
Setting or changing the exposure mode while Vuforia Engine is running might take longer on certain devices before the exposure change takes effect.
Exposure region
Allow your users to set the exposure on just a region of the camera frame with the CameraRegionOfInterest data structure. Set the exposure region with a region-of-interest screen position and an extent that is a percentage of the camera frame's width and height. Setting a CameraRegionOfInterest must be followed by setting the exposure mode to EXPOSURE_MODE_TRIGGERAUTO to trigger the re-exposure. After the re-exposure, the mode sets itself to EXPOSURE_MODE_FIXED.
NOTE: Adjusting the exposure on only a region of the camera view can interfere with and degrade detection and tracking of Vuforia targets outside of that region.
First, make sure that the device supports setting the exposure region by checking ExposureRegionSupported while Vuforia Engine is running. Then, set the exposure region with the regionOfInterest variable:
if (VuforiaBehaviour.Instance.CameraDevice.ExposureRegionSupported)
{
VuforiaBehaviour.Instance.CameraDevice.SetExposureRegion(regionOfInterest);
}
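Following the description above, setting the region can be combined with a single auto-exposure trigger. A sketch:

```csharp
var cameraDevice = VuforiaBehaviour.Instance.CameraDevice;
if (cameraDevice.ExposureRegionSupported)
{
    var region = new CameraRegionOfInterest { ScreenPosition = new Vector2(0.5f, 0.5f), Extent = 0.25f };
    cameraDevice.SetExposureRegion(region);

    // Trigger re-exposure on the region; afterwards the mode falls back to EXPOSURE_MODE_FIXED
    cameraDevice.SetExposureMode(ExposureMode.EXPOSURE_MODE_TRIGGERAUTO);
}
```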
Reset the exposure region by setting the screen position back to Vector2(0.5f, 0.5f) and the extent to 1.0f.
NOTE: On iOS, only the screen position is returned. The extent is returned as 0, as iOS changes this value internally. The extent should still be set in the CameraRegionOfInterest.
If Vuforia Engine is paused, the exposure region will also be reset and revert to its default values.
Use the get method to retrieve the region's current settings.
GetExposureRegion();
Flash Torch
Poor lighting conditions can significantly affect target detection and tracking. For best results, make sure that there is enough light in your environment so that the scene details and target features are well visible in the camera view.
- Consider that tracking works best in indoor environments, where the lighting conditions are usually more stable and easier to control.
- If your application's use cases require operating in dark environments, consider enabling the device's flash torch (if the device has one).
VuforiaBehaviour.Instance.CameraDevice.SetFlash(true);
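Since not every device has a torch, it is worth checking whether the call succeeded. This sketch assumes SetFlash reports success with a bool, like the other CameraDevice setters:

```csharp
bool flashEnabled = VuforiaBehaviour.Instance.CameraDevice.SetFlash(true);
if (!flashEnabled)
    Debug.Log("Flash torch is not available on this device.");
```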
Clipping Plane
If your augmentations disappear at a certain distance from the Image Target, the far clipping plane (in OpenGL or in the Unity camera settings) may need to be adjusted. This applies especially when you work with large Image Targets viewed from a distance.
In Unity, the near and far clipping planes can be set directly in the ARCamera GameObject's Inspector window or with Unity’s scripting API.
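For example, a runtime adjustment via Unity's scripting API might look like this (the values are placeholders; tune them to your scene scale):

```csharp
// Assumes the ARCamera is tagged as the main camera in the scene
var camera = Camera.main;
camera.nearClipPlane = 0.05f;  // placeholder value
camera.farClipPlane = 2000f;   // placeholder: large enough for large, distant Image Targets
```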
See Rendering in Native for details on adjusting the far clipping plane in native.
Apply a Shader to the Camera Images
The Image class provides the camera pixels as a byte array. This approach is useful for some image processing tasks, but sometimes it is preferable to obtain the image as a Unity Texture2D, as shown in the example below. You can also apply the image to a texture in a Material used by a shader.
Access the Camera Image
Use the Vuforia Image class to access and set the desired camera image format. Use the images as an OpenGL texture or apply the texture to a Unity material. Register for the desired image format using the Vuforia.PixelFormat declaration.
VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(PixelFormat.RGB888, true);
NOTE: The Vuforia Namespace Reference page lists the available pixel formats.
Call this method after Vuforia Engine has been initialized and started. To this end, it is recommended to register an OnVuforiaStarted callback in the Start() method of your MonoBehaviour script:
private void OnVuforiaStarted()
{
// Vuforia Engine has started, now register the camera image format
PixelFormat pixelFormat = PixelFormat.RGB888;
bool success = VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(pixelFormat, true);
if (success)
{
Debug.Log("Successfully registered pixel format " + pixelFormat.ToString());
}
else
{
Debug.LogError(
"Failed to register pixel format " + pixelFormat.ToString() +
"\n the format may be unsupported by your device;" +
"\n consider using a different pixel format.");
}
}
We can extend this script to retrieve the camera images after the format is set, from within the Vuforia state update callback. This ensures that you retrieve the latest camera image matching the current frame.
Also, make sure that the camera image is not null, since it can take a few frames for the image to become available after registering for an image format.
Unregister the camera image format whenever the Engine is stopped, and re-register it when the Engine starts again.
Apply camera images to a RawImage GameObject in your Unity scene.
- Create an Empty GameObject and attach the below script CameraImageAccess.
- Create a RawImage GameObject from UI -> RawImage.
- Drag the RawImage GameObject to the public field in the CameraImageAccess script.
using UnityEngine;
using UnityEngine.UI;
using Vuforia;
using Image = Vuforia.Image;
public class CameraImageAccess : MonoBehaviour
{
const PixelFormat PIXEL_FORMAT = PixelFormat.RGB888;
const TextureFormat TEXTURE_FORMAT = TextureFormat.RGB24;
public RawImage RawImage;
Texture2D mTexture;
bool mFormatRegistered;
void Start()
{
// Register Vuforia Engine life-cycle callbacks:
VuforiaApplication.Instance.OnVuforiaStarted += OnVuforiaStarted;
VuforiaApplication.Instance.OnVuforiaStopped += OnVuforiaStopped;
if (VuforiaBehaviour.Instance != null)
VuforiaBehaviour.Instance.World.OnStateUpdated += OnVuforiaUpdated;
}
void OnDestroy()
{
// Unregister Vuforia Engine life-cycle callbacks:
if (VuforiaBehaviour.Instance != null)
VuforiaBehaviour.Instance.World.OnStateUpdated -= OnVuforiaUpdated;
VuforiaApplication.Instance.OnVuforiaStarted -= OnVuforiaStarted;
VuforiaApplication.Instance.OnVuforiaStopped -= OnVuforiaStopped;
if (VuforiaApplication.Instance.IsRunning)
{
// If Vuforia Engine is still running, unregister the camera pixel format to avoid unnecessary overhead
// Formats can only be registered and unregistered while Vuforia Engine is running
UnregisterFormat();
}
if (mTexture != null)
Destroy(mTexture);
}
/// <summary>
/// Called each time the Vuforia Engine is started
/// </summary>
void OnVuforiaStarted()
{
mTexture = new Texture2D(0, 0, TEXTURE_FORMAT, false);
// A format cannot be registered if Vuforia Engine is not running
RegisterFormat();
}
/// <summary>
/// Called each time the Vuforia Engine is stopped
/// </summary>
void OnVuforiaStopped()
{
// A format cannot be unregistered after OnVuforiaStopped
UnregisterFormat();
if (mTexture != null)
Destroy(mTexture);
}
/// <summary>
/// Called each time the Vuforia Engine state is updated
/// </summary>
void OnVuforiaUpdated()
{
var image = VuforiaBehaviour.Instance.CameraDevice.GetCameraImage(PIXEL_FORMAT);
// There can be a delay of several frames until the camera image becomes available
if (Image.IsNullOrEmpty(image))
return;
Debug.Log("\nImage Format: " + image.PixelFormat +
"\nImage Size: " + image.Width + " x " + image.Height +
"\nBuffer Size: " + image.BufferWidth + " x " + image.BufferHeight +
"\nImage Stride: " + image.Stride + "\n");
// Override the current texture by copying into it the camera image flipped on the Y axis
// The texture is resized to match the camera image size
image.CopyToTexture(mTexture, true);
RawImage.texture = mTexture;
RawImage.material.mainTexture = mTexture;
}
/// <summary>
/// Register the camera pixel format
/// </summary>
void RegisterFormat()
{
// Vuforia Engine has started, now register camera image format
var success = VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(PIXEL_FORMAT, true);
if (success)
{
Debug.Log("Successfully registered pixel format " + PIXEL_FORMAT);
mFormatRegistered = true;
}
else
{
Debug.LogError("Failed to register pixel format " + PIXEL_FORMAT +
"\n the format may be unsupported by your device;" +
"\n consider using a different pixel format.");
mFormatRegistered = false;
}
}
/// <summary>
/// Unregister the camera pixel format
/// </summary>
void UnregisterFormat()
{
Debug.Log("Unregistering camera pixel format " + PIXEL_FORMAT);
VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(PIXEL_FORMAT, false);
mFormatRegistered = false;
}
}
Use an OpenGL texture
Other image processing tasks might require you to obtain the image as an OpenGL texture. You can obtain the image as an OpenGL texture using the approach demonstrated in the OcclusionManagement sample.
- Register a Texture2D object to be filled with the camera pixels at each frame, instead of letting Vuforia Engine render the camera image natively at each frame, using the VuforiaRenderer.VideoBackgroundTexture API.
- See the OcclusionManagement sample scripts for an example of this technique.
Accessing the Raw Pixels
For custom computer vision (CV) processes, you can access the raw pixels from the camera with the Image class. The following example method retrieves the raw pixel data:
private byte[] GetPixels(Image image)
{
var pixelBufferPtr = image.PixelBufferPtr;
int imageSize = image.Stride * image.Height;
byte[] pixels = new byte[imageSize];
System.Runtime.InteropServices.Marshal.Copy(pixelBufferPtr, pixels, 0, imageSize);
return pixels;
}
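Building on the GetPixels() method above, a hypothetical helper could, for example, compute the average image intensity. This sketch assumes the GRAYSCALE pixel format (one byte per pixel) was registered:

```csharp
private float GetAverageIntensity(Image image)
{
    // Assumes PixelFormat.GRAYSCALE was registered, i.e. one byte per pixel
    byte[] pixels = GetPixels(image);
    long sum = 0;

    // Iterate row by row; the stride may include padding bytes beyond the image width
    for (int y = 0; y < image.Height; y++)
        for (int x = 0; x < image.Width; x++)
            sum += pixels[y * image.Stride + x];

    return sum / (float)(image.Width * image.Height);
}
```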