The Vuforia Core Samples project includes scenes demonstrating a range of Vuforia features, including a pre-configured Object Recognition scene that you can use as a reference and starting point for your own Object Recognition apps. This article shows you how to add Object Targets to the sample and how to customize event handling for Object Recognition events.
To import the Object Recognition sample into a new Unity project:
The Vuforia Core Samples project includes an Object Target database with a Mars Habitat model. To enable it, load and activate the "VuforiaMars_Object_OT" database in the "Datasets" section of the Vuforia Configuration window (menu: Window > Vuforia Configuration).
If you would like to use your own objects, you can create an Object Target database using the Vuforia Object Scanner. Read more about the Vuforia Object Scanner here. Once you've created a Device Database containing your Object Target(s), you can import it to your project by selecting Assets > Import Package > Custom Package or simply double clicking the *.unitypackage file on your file system.
Open the "3-ObjectReco" scene from the "SamplesScenes" folder. The scene's Hierarchy demonstrates how to set up a Vuforia Object Recognition scene in Unity.
The ARCamera GameObject represents both the device camera and scene camera.
Camera Device Mode Setting enables you to prioritize render quality vs. frame rate for your app. Selecting MODE_DEFAULT will typically prioritize rendering except on devices with lower performance characteristics.
Max Simultaneous Tracked Objects defines how many targets can be tracked within the camera view at the same time. Object Recognition supports a maximum MSTO value of 2.
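The same limit can also be set at runtime. The sketch below assumes the legacy `VuforiaUnity.SetHint` API and the `HINT_MAX_SIMULTANEOUS_OBJECT_TARGETS` hint; verify both against the API reference for your Vuforia version before relying on them.

```csharp
using Vuforia;

public static class TrackingHints
{
    // Allow up to two Object Targets to be tracked at once
    // (2 is the documented maximum for Object Recognition).
    public static void AllowTwoObjects()
    {
        VuforiaUnity.SetHint(
            VuforiaUnity.VuforiaHint.HINT_MAX_SIMULTANEOUS_OBJECT_TARGETS, 2);
    }
}
```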
Delayed Initialization enables object datasets to be partially loaded to conserve memory when several object targets are in the dataset. This option will delay detection but reduces memory requirements.
Load Data Set automatically loads the associated dataset from Streaming Assets / QCAR when the app initializes.
Activate automatically activates the dataset after it is loaded.
Note: if you don't load and activate datasets through the Editor, you'll need to do so using the Vuforia API. See: How To Load and Activate Multiple Device Databases at Runtime
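As a rough illustration of runtime loading, the sketch below assumes the classic Vuforia Unity API (`TrackerManager`, `ObjectTracker`, `DataSet`, and `VuforiaARController`'s started callback); the dataset name matches the sample database, so substitute your own.

```csharp
using UnityEngine;
using Vuforia;

public class DataSetLoader : MonoBehaviour
{
    void Start()
    {
        // Wait until Vuforia has finished initializing before touching trackers
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(LoadDataSet);
    }

    void LoadDataSet()
    {
        ObjectTracker tracker = TrackerManager.Instance.GetTracker<ObjectTracker>();
        DataSet dataSet = tracker.CreateDataSet();

        if (dataSet.Load("VuforiaMars_Object_OT"))
        {
            tracker.Stop();                   // tracker must be stopped before activation
            tracker.ActivateDataSet(dataSet);
            tracker.Start();
        }
        else
        {
            Debug.LogError("Failed to load dataset: VuforiaMars_Object_OT");
        }
    }
}
```

Attach this to any GameObject in the scene; it replaces the "Load Data Set" and "Activate" checkboxes described above.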
The ObjectTarget GameObject encapsulates the Object Target Behaviour and the Object Reco Trackable Event Handler.
Data Set defines the dataset to use for this target instance
Object Target defines which target from the dataset to use
Length dimension value for the Bounding Box
Width dimension value for the Bounding Box
Height dimension value for the Bounding Box
Show Bound Box renders the bounding box of your target in the Unity Editor to facilitate the placement of augmenting media in relation to the physical target
The Object Reco Trackable Event Handler component is responsible for handling callbacks from the Object Target Behaviour arising from changes in the status of the trackable, such as when the target has been detected and is then being tracked. Extend this script to implement custom event handling for your app.
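A minimal custom handler can follow the same pattern as the sample's event handler, implementing Vuforia's `ITrackableEventHandler` interface. The class name below is a placeholder; the interface and `TrackableBehaviour.Status` values are part of the classic Vuforia Unity API.

```csharp
using UnityEngine;
using Vuforia;

public class MyObjectRecoHandler : MonoBehaviour, ITrackableEventHandler
{
    TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    // Called by Vuforia whenever the trackable's status changes
    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
        {
            Debug.Log(mTrackableBehaviour.TrackableName + " found");
            // Enable your augmentation content here
        }
        else
        {
            Debug.Log(mTrackableBehaviour.TrackableName + " lost");
            // Disable your augmentation content here
        }
    }
}
```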
1. Set your platform build target to iOS, Android, or UWP in File > Build Settings.
2. Add your scene(s) to Scenes in Build.
3. Define a unique Bundle ID in Player Settings > Other Settings.
4. Define a unique Product Name to serve as the name of the app when installed on a device.
5. Select Build to generate an executable or Build & Run to both generate an executable and deploy it to a connected device.
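The same build can be triggered from an editor script using Unity's `BuildPipeline.BuildPlayer` API, which is useful for automation. This is a hedged sketch: the scene path and output path are assumptions to adjust for your project.

```csharp
using UnityEditor;

public static class BuildScript
{
    [MenuItem("Build/Android")]
    public static void BuildAndroid()
    {
        // Scene and output paths are examples; adjust to your project layout
        string[] scenes = { "Assets/SamplesScenes/3-ObjectReco.unity" };
        BuildPipeline.BuildPlayer(scenes, "Builds/ObjectReco.apk",
            BuildTarget.Android, BuildOptions.None);
    }
}
```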
There is no need to change any of the default Player Settings to support Object Recognition. You can customize the presentation of your app on the device by adding icons and splash images in Player Settings and setting the app's device orientation.