How To Use the Vuforia Object Recognition Unity Sample

The Vuforia Core Samples project includes scenes demonstrating various Vuforia features, including a pre-configured Object Recognition scene that you can use as a reference and starting point for your own Object Recognition apps. This article shows you how to add Object Targets to the sample and how to customize event handling for Object Recognition events.

Importing Vuforia Core Samples

To import the Object Recognition sample into a new Unity project:

  1. Go to the Unity Asset Store and search for the "Vuforia Core Samples" package.
  2. Download and import the package.
    Note: Vuforia Core Samples form a complete project and will overwrite the settings of the current project. We recommend importing Vuforia Core Samples into an empty project.
  3. Enable "Vuforia Augmented Reality Supported" in "XR Settings" of Unity's Player Settings.

Loading and Enabling Object Target Databases

Vuforia Core Samples includes an Object Target database with a Mars Habitat model. To enable it, load and activate the "VuforiaMars_Object_OT" database in the "Datasets" section of the Vuforia Configuration window (menu: Window > Vuforia Configuration).

If you would like to use your own objects, you can create an Object Target database using the Vuforia Object Scanner; see the Vuforia Object Scanner documentation for details. Once you've created a Device Database containing your Object Target(s), you can import it into your project by selecting Assets > Import Package > Custom Package or simply double-clicking the *.unitypackage file on your file system.

Scene elements & their configuration

Open the "3-ObjectReco" scene from the "SamplesScenes" folder. The sample's "3-ObjectReco" scene hierarchy demonstrates how to set up a Vuforia Object Recognition scene in Unity.

  • ARCamera - Vuforia ARCamera instance
  • ObjectTarget - ObjectTarget instance
  • Habitat, Astronaut - Augmentation Content
  • CommonUI - User Interface



The ARCamera GameObject represents both the device camera and the scene camera.

VuforiaConfiguration (menu: Window > Vuforia Configuration)


Camera Device Mode enables you to prioritize render quality versus frame rate for your app. Selecting MODE_DEFAULT will typically prioritize rendering, except on devices with lower performance characteristics.
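The camera device mode can also be selected at runtime through the legacy Vuforia Unity API. The following is a minimal sketch, assuming a Vuforia version that exposes `CameraDevice.SelectVideoMode` and the `RegisterVuforiaStartedCallback` hook (exact names may differ between Vuforia releases):

```csharp
using UnityEngine;
using Vuforia;

// Sketch: switch the camera device mode at runtime (legacy Vuforia Unity API).
public class CameraModeSwitcher : MonoBehaviour
{
    void Start()
    {
        // Wait until Vuforia has started so the camera device is available.
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
    }

    void OnVuforiaStarted()
    {
        // Prioritize render quality over frame rate.
        CameraDevice.Instance.SelectVideoMode(
            CameraDevice.CameraDeviceMode.MODE_OPTIMIZE_QUALITY);
    }
}
```

Attach the script to any GameObject in the scene; MODE_OPTIMIZE_SPEED is the corresponding option when frame rate matters more than image quality.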

The Max Simultaneous Tracked Objects setting defines how many targets can be tracked within the camera view at the same time. Object Recognition supports a maximum MSTO value of 2.

Delayed Initialization enables object datasets to be partially loaded to conserve memory when several object targets are in the dataset. This option will delay detection but reduce memory requirements.




Load Data Set automatically loads the associated dataset from StreamingAssets/QCAR when the app initializes.

Activate automatically activates the dataset after it is loaded.

Note: If you don't load and activate datasets through the Editor, you'll need to do so using the Vuforia API. See: How To Load and Activate Multiple Device Databases at Runtime
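The runtime path can be sketched as follows, using the legacy Vuforia Unity API and the sample's "VuforiaMars_Object_OT" database (an illustrative sketch; API details vary between Vuforia versions):

```csharp
using UnityEngine;
using Vuforia;

// Sketch: load and activate an Object Target dataset at runtime
// instead of through the Vuforia Configuration window.
public class DataSetLoader : MonoBehaviour
{
    void Start()
    {
        // Defer loading until Vuforia has initialized.
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(LoadDataSet);
    }

    void LoadDataSet()
    {
        ObjectTracker tracker = TrackerManager.Instance.GetTracker<ObjectTracker>();
        DataSet dataSet = tracker.CreateDataSet();

        if (dataSet.Load("VuforiaMars_Object_OT"))
        {
            tracker.Stop();                    // tracker must be stopped before activation
            tracker.ActivateDataSet(dataSet);
            tracker.Start();
        }
        else
        {
            Debug.LogError("Failed to load dataset VuforiaMars_Object_OT");
        }
    }
}
```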


The ObjectTarget GameObject encapsulates the Object Target Behaviour and the Object Reco Trackable Event Handler.

Object Target Behaviour


Database defines the dataset to use for this target instance.

Object Target defines which target from the dataset to use.

Length sets the length of the target's bounding box.

Width sets the width of the target's bounding box.

Height sets the height of the target's bounding box.

Show Bound Box renders the bounding box of your target in the Unity Editor to facilitate the placement of augmenting media in relation to the physical target.


Object Reco Trackable Event Handler

The Object Reco Trackable Event Handler component is responsible for handling callbacks from the Object Target Behaviour arising from changes in the status of the trackable, such as when the target has been detected and is then being tracked. Extend this script to implement custom event handling for your app.
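One way to extend the handler is to subclass the samples' DefaultTrackableEventHandler script. The sketch below assumes a version of that script whose OnTrackingFound/OnTrackingLost methods are declared virtual (in older sample versions they are not, in which case you can copy and modify the script instead); the class name MyObjectRecoHandler is illustrative:

```csharp
using UnityEngine;
using Vuforia;

// Sketch: custom event handling by subclassing the samples'
// DefaultTrackableEventHandler (assumes virtual OnTrackingFound/OnTrackingLost).
public class MyObjectRecoHandler : DefaultTrackableEventHandler
{
    protected override void OnTrackingFound()
    {
        base.OnTrackingFound();   // keep the default renderer/collider toggling
        Debug.Log("Object target found: " + mTrackableBehaviour.TrackableName);
        // Add custom behavior here, e.g. play a sound or start an animation.
    }

    protected override void OnTrackingLost()
    {
        base.OnTrackingLost();
        Debug.Log("Object target lost");
    }
}
```

Replace the handler component on the ObjectTarget GameObject with this script to have your logic run on detection and tracking-loss events.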

Building and executing the sample

1. Set your platform build target to iOS, Android, or UWP in File > Build Settings.
2. Add your scene(s) to Scenes in Build.
3. Define a unique Bundle ID in Player Settings > Other Settings.
4. Define a unique Product Name to serve as the name of the app when installed on a device.
5. Select Build to generate an executable or Build & Run to both generate an executable and deploy it to a connected device.

There is no need to change any of the default Player Settings to support Object Recognition. You can customize the presentation of your app on the device by adding icons and splash images in Player Settings and setting the app's device orientation.