Working with the Camera in Native

This article explains the role of the camera and its lifecycle. Change camera, focus, and exposure modes, or get the camera intrinsics to render your content in sync with the camera image.

Vuforia Engine handles the camera lifecycle automatically, so you do not need to manage it explicitly. You can still manage camera configurations via the CameraController API. Even so, it is useful to understand the lifecycle and the camera parameters exposed to you, to ensure that rendered content aligns well with the camera image.

General Setup

The camera is initialised and deinitialised by the Vuforia Engine lifecycle operations vuEngineStart() and vuEngineStop(). Camera configurations, such as the modes for camera focus, exposure region, region of interest, video, and the flash, are provided by the CameraController. Please refer to the Camera API for more information. See also Vuforia Engine Lifecycle for information on camera permission.

Camera Lifecycle

On most platforms, while Vuforia Engine has access to the camera, other apps on the device cannot access it. To release the camera and allow other applications to access it, you can use vuEngineStop().

The lifecycle of the camera is tied to the Engine lifecycle. When resuming after a stopped session, the camera uses the same video mode as in the previous session.

For example:

  • Setting the video mode to VU_CAMERA_VIDEO_MODE_PRESET_OPTIMIZE_SPEED is retained when a session is resumed.
  • The focus mode is also retained across Vuforia Engine start and stop, with one exception: if the mode has been switched to VU_CAMERA_FOCUS_MODE_FIXED as a result of VU_CAMERA_FOCUS_MODE_TRIGGERAUTO, it is set to VU_CAMERA_FOCUS_MODE_CONTINUOUSAUTO when Engine is resumed.
  • Exposure mode is not retained across sessions (stopping/pausing, then starting Engine) and will default to VU_CAMERA_EXPOSURE_MODE_CONTINUOUSAUTO.

Starting your application

  • Request camera permission
  • Create your Engine instance vuEngineCreate()
  • Start the Engine vuEngineStart()
  • Configure the camera with CameraController functions for
    • Camera video mode
    • Focus mode
    • Exposure mode
    • Focus region
    • Exposure region

For rendering

  • Check for available camera frames with vuStateHasCameraFrame()
  • Retrieve the camera intrinsics from the state with vuStateGetCameraIntrinsics().
  • Render on the video background with VuRenderVBBackendType according to the platform. See Rendering in Native for more information.
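A per-frame rendering loop following these steps might look like the sketch below. It assumes an Engine instance that has been created and started; the state-handling calls vuEngineAcquireLatestState and vuStateRelease are assumed here and should be verified against the API reference.

```c
// Sketch only: per-frame state query against a started Vuforia Engine.
void renderFrame(VuEngine* engine) {
    VuState* state = NULL;
    vuEngineAcquireLatestState(engine, &state);

    if (vuStateHasCameraFrame(state) == VU_TRUE) {
        VuCameraIntrinsics intrinsics;
        vuStateGetCameraIntrinsics(state, &intrinsics);
        // Use the intrinsics to build a projection matrix, then render
        // the video background with the platform-appropriate
        // VuRenderVBBackendType renderer.
    }

    vuStateRelease(state);
}
```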

Stopping your application

Stop Vuforia Engine with vuEngineStop(); because the camera is tied to the Engine instance, this also releases the camera.

Camera Intrinsics

When rendering virtual content to augment physical objects, it is important to render it with projection parameters that match the intrinsics of the camera on the device that is being used. Otherwise, the virtual content will not align closely with the camera image. Access the camera’s intrinsics and retrieve a matching projection matrix via VuCameraIntrinsics.


VuCameraIntrinsics intrinsics;
vuStateGetCameraIntrinsics(state, &intrinsics);

float nearPlane = 0.01f;   // example clipping planes in scene units
float farPlane = 100.0f;
VuRotation rotation = VU_ROTATION_ANGLE_0;
VuMatrix44F projection = vuCameraIntrinsicsGetProjectionMatrix(&intrinsics, nearPlane, farPlane, rotation);

See the camera intrinsics projection matrix for details on the coordinate system convention and the projection matrix for rendering.

Camera Coordinate System

When no Device Pose Observer is active, the camera coordinate system serves as a reference frame for all observations returned by Vuforia Engine.

Targets that are detected and tracked by Vuforia Engine have a pose (that represents the combination of the position and orientation of the target's local reference frame) with respect to the 3D reference frame of the camera. In Vuforia Engine, the reference frame is defined with the X-axis and the Y-axis aligned with the plane tangent to the given target or frame marker, and with the Z-axis orthogonal to that plane. The camera-reference frame is defined with the Z-axis pointing in the camera-viewing direction, and the X-axis and the Y-axis are aligned with the view plane (with X pointing to the right and Y pointing downward).

The pose is represented as a VuMatrix44F inside a VuPoseInfo structure. It can be obtained using the following call:

vuObservationGetPoseInfo(const VuObservation* observation, VuPoseInfo* poseInfo);
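Retrieving the pose for each observation in the state might look like the sketch below. The list-handling function names (vuObservationListCreate, vuStateGetObservations, vuObservationListGetSize, vuObservationListGetElement) are assumed from the Observer API and should be checked against the reference.

```c
// Sketch only: iterate the observations in a state and read their poses.
void processObservations(const VuState* state) {
    VuObservationList* list = NULL;
    vuObservationListCreate(&list);
    vuStateGetObservations(state, list);

    int32_t count = 0;
    vuObservationListGetSize(list, &count);
    for (int32_t i = 0; i < count; ++i) {
        VuObservation* observation = NULL;
        vuObservationListGetElement(list, i, &observation);

        VuPoseInfo poseInfo;
        vuObservationGetPoseInfo(observation, &poseInfo);
        // poseInfo holds the VuMatrix44F pose described above.
    }
    vuObservationListDestroy(list);
}
```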

For more information, please see Device Tracking API Overview and in particular Spatial Frame of Reference.
