The Vuforia Object Recognition Sample project provides a pre-configured Object Recognition scene that you can use as a reference and starting point for your own Object Recognition apps. This article will show you how to add Object Targets to the sample and how to customize event handling for Object Recognition events.
Importing the sample
To import the Object Recognition sample into a new Unity project...
Either double-click the objectrecognition-x-x-x.unitypackage file to launch the Import Package dialog, or import the same Unity package from the Unity Editor menu via Assets > Import Package > Custom Package.
Click the Import button at the bottom right of the Importing Package window.
To import the Object Recognition sample into an existing Vuforia Unity project...
Follow the steps in the Vuforia Unity project migration guide.
Then follow the steps above for importing the Object Recognition sample into a new Unity project.
To import the Object Recognition sample into an existing Unity project that doesn’t already contain the Vuforia extension...
Save a separate copy of your project.
Either double-click the objectrecognition-4.x.x.unitypackage file to launch the Import Package dialog, or import the same Unity package from the Unity Editor menu via Assets > Import Package > Custom Package.
Review the folder and asset names used in the sample project to determine whether they clash with named assets in your project. If they do, abort the import, rename your assets, and return to step 2.
Once you have ensured that there are no conflicts, click the Import button at the bottom right of the Importing Package window.
Add Device Databases
The Object Recognition sample doesn’t include any predefined Object Targets. You’ll need to create your own using the Vuforia Object Scanner.
Vuforia Object Scanner
Once you’ve created a Device Database containing your Object Target(s), you can import it into your project by selecting Assets > Import Package > Custom Package or simply double-clicking the *.unitypackage file on your file system.
Scene elements & their configuration
The sample’s scene Hierarchy demonstrates how to set up a Vuforia Object Recognition scene in Unity.
The ARCamera prefab represents both the device camera and scene camera.
Camera Device Mode Setting enables you to prioritize render quality vs. frame rate for your app. Selecting MODE_DEFAULT will typically prioritize rendering, except on devices with lower performance characteristics.
Max Simultaneous Tracked Objects defines how many targets can be tracked within the camera view at the same time. Object Recognition supports a maximum MSTO value of 2.
Delayed Loading Object Data Sets enables object datasets to be partially loaded to conserve memory when a dataset contains several object targets. This option will delay detection but reduce memory requirements.
World Center Mode defines which object in the scene serves as the origin (0,0,0) of the scene’s world space. When SPECIFIC_TARGET is chosen, the World Center field is presented, enabling you to select which target is used as the scene origin.
Dataset Load and Activate
Load Data Set automatically loads the associated dataset from StreamingAssets/QCAR when the app initializes.
Activate automatically activates the dataset after it is loaded.
Note: if you don’t load and activate datasets through the Editor, you’ll need to do so using the Vuforia API. See: How To Load and Activate Multiple Device Databases at Runtime
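As a rough sketch of the runtime alternative, the snippet below loads and activates a dataset through the Vuforia Unity API. It assumes the legacy Vuforia API surface (TrackerManager, ObjectTracker, DataSet — exact namespaces and signatures vary between SDK versions), and "MyObjectTargets" is a hypothetical Device Database name standing in for your own.

```csharp
using UnityEngine;
using Vuforia; // namespace used by later Vuforia SDK versions; earlier releases expose these types globally

// Hypothetical example: load and activate a Device Database at runtime
// instead of ticking Load Data Set / Activate in the Editor.
public class DataSetLoader : MonoBehaviour
{
    // Call this after Vuforia has finished initializing.
    void LoadAndActivate()
    {
        ObjectTracker tracker = TrackerManager.Instance.GetTracker<ObjectTracker>();
        DataSet dataSet = tracker.CreateDataSet();

        // "MyObjectTargets" is a placeholder for the name of your own dataset
        if (dataSet.Load("MyObjectTargets"))
            tracker.ActivateDataSet(dataSet);
        else
            Debug.LogError("Failed to load dataset MyObjectTargets");
    }
}
```

Deferring loading to a method like this gives you control over which datasets are resident in memory, which pairs naturally with the Delayed Loading option described above.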
The ObjectTarget prefab encapsulates the Object Target Behaviour and Default Trackable Event Handler.
Object Target Behaviour
Data Set defines the dataset to use for this target instance.
Object Target defines which target from the dataset to use.
Length defines the length dimension value for the bounding box.
Width defines the width dimension value for the bounding box.
Height defines the height dimension value for the bounding box.
Show Bound Box renders the bounding box of your target in the Unity Editor to facilitate the placement of augmenting media in relation to the physical target.
Default Trackable Event Handler
The Default Trackable Event Handler component handles callbacks from the Object Target Behaviour arising from changes in the status of the trackable, such as when the target has been detected and is then being tracked. Extend this script to implement custom event handling for your app.
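A minimal sketch of such a handler is shown below, assuming the ITrackableEventHandler interface and TrackableBehaviour.Status values from the legacy Vuforia Unity API (names may differ slightly between SDK versions); the class name and the log messages are illustrative.

```csharp
using UnityEngine;
using Vuforia; // namespace used by later Vuforia SDK versions; earlier releases expose these types globally

// Hypothetical custom handler: attach to the ObjectTarget prefab
// in place of (or alongside) the Default Trackable Event Handler.
public class MyTrackableEventHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    // Called by the behaviour whenever the trackable's status changes
    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
        {
            Debug.Log("Object target found: " + mTrackableBehaviour.TrackableName);
            // e.g. enable augmentation renderers here
        }
        else
        {
            Debug.Log("Object target lost: " + mTrackableBehaviour.TrackableName);
            // e.g. disable augmentation renderers here
        }
    }
}
```

The sample's own handler follows the same pattern, toggling the renderers and colliders of the target's children; replacing the bodies of the two branches is usually all the customization an app needs.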
Building and executing the sample
1. Set your platform build target to either iOS or Android in File > Build Settings.
2. Add your scene(s) to Scenes in Build.
3. Define a unique Bundle ID in Player Settings > Other Settings.
4. Define a unique Product Name to serve as the name of the app when installed on a device.
5. Select Build to generate an executable or Build & Run to both generate an executable and deploy it to a connected device.
There is no need to change any of the default Player Settings to support Object Recognition. You can customize the presentation of your app on the device by adding icons and splash images in Player Settings and setting the app’s device orientation.