Working with the Camera in Native

This article explains the role of the camera and its life cycle, and how to render your content in sync with the camera image and in the correct coordinate system.

Vuforia Engine handles the camera life cycle automatically, and you do not need to actively manage it if you use the default camera configuration. However, it is useful to understand the life cycle and the parameters that are exposed, to ensure your rendered virtual content aligns well with the camera image.

General Setup

The camera is initialized and deinitialized together with Vuforia Engine via vuEngineStart() and vuEngineStop(). Camera configuration, such as the modes for camera focus, video, and the flash, is exposed through the CameraController. Please refer to the Camera API for more information.

Camera Lifecycle

The lifecycle of the camera is tied to the Engine lifecycle. When resuming after a stopped session, the camera requires the same camera settings to be set again as in the previous session. For example, setting the video mode to _SPEED after vuEngineStart() is not retained when a session is resumed.

Starting application

  • Request camera permission
  • Start the Engine with vuEngineStart()
  • Configure the camera with CameraController functions for
    • Video mode
    • Focus mode
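
Put together, a minimal startup sequence might look as follows. This is only a sketch: engine creation and error handling follow the Engine Lifecycle API, the camera permission request is platform-specific, and the exact CameraController function and preset names are assumptions that should be verified against the Camera API header of your SDK version.

// Camera permission must already have been granted via the platform APIs.
VuEngine* engine = NULL;
vuEngineCreate(&engine, engineConfigSet, NULL); // engineConfigSet created beforehand
vuEngineStart(engine);                          // camera starts together with the Engine

// Configure the camera through the CameraController
// (verify exact names and signatures against the Camera API)
VuController* cameraController = NULL;
vuEngineGetCameraController(engine, &cameraController);
// e.g. select a video mode preset and a focus mode here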

For rendering

  • Check for available camera frames with vuStateHasCameraFrame()
  • Get the camera intrinsics from the state with vuStateGetCameraIntrinsics()
  • Render on the video background with VuRenderVBBackendType according to the platform. See Rendering in Native for more information.
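
In the render loop, the steps above might be combined like this (a sketch only; the state acquisition and release calls follow the Engine API and should be checked against your SDK version):

VuState* state = NULL;
vuEngineAcquireLatestState(engine, &state);
if (vuStateHasCameraFrame(state) == VU_TRUE)
{
    VuCameraIntrinsics intrinsics;
    vuStateGetCameraIntrinsics(state, &intrinsics);
    // ... render the video background and augmentations using the intrinsics
}
vuStateRelease(state);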

Stopping application

Stop the Engine with vuEngineStop() and release it with vuEngineDestroy(); since the camera is tied to the Engine instance, this also stops the camera.

Camera Intrinsics

When rendering virtual content to augment physical objects, it is important to use projection parameters that match the intrinsics of the camera on the device in use. Otherwise, the virtual content will not align closely with the camera image. Access the camera’s intrinsics and retrieve a matching projection matrix via VuCameraIntrinsics.

// Query the field of view and the intrinsics matrix
vuCameraIntrinsicsGetFov(intrinsics);
vuCameraIntrinsicsGetMatrix(intrinsics);

// Retrieve a projection matrix for rendering (example clip-plane values)
float nearPlane = 0.01f;
float farPlane = 100.0f;
VuRotation rotation = VU_ROTATION_ANGLE_0;
vuCameraIntrinsicsGetProjectionMatrix(intrinsics, nearPlane, farPlane, rotation);

See the camera intrinsics and projection matrix documentation for details on the coordinate system convention and the projection matrix used for rendering.

Camera Coordinate System

The camera coordinate system defines a reference frame for all observations returned by Vuforia Engine.

Targets that are detected and tracked by Vuforia Engine have a pose (that represents the combination of the position and orientation of the target's local reference frame) with respect to the 3D reference frame of the camera. In Vuforia Engine, the reference frame is defined with the X-axis and the Y-axis aligned with the plane tangent to the given target or frame marker, and with the Z-axis orthogonal to that plane. The camera-reference frame is defined with the Z-axis pointing in the camera-viewing direction, and the X-axis and the Y-axis are aligned with the view plane (with X pointing to the right and Y pointing downward).

The pose is represented as a VuMatrix44F inside the VuPoseInfo struct. It can be obtained using the following call:

vuObservationGetPoseInfo(const VuObservation* observation, VuPoseInfo* poseInfo);

For more information, please see Device Tracking API Overview and in particular Spatial Frame of Reference.
