Model Target Guide View


To initialize detection and tracking of a Model Target, Vuforia Engine gives developers the option to display Guide Views that help the user look at the object from an expected angle and distance during the AR experience. Aligning the object with the pose indicated by the image initiates tracking.

In Vuforia Engine, the image together with the angle and distance relative to the object that it represents is called a Guide View. Guide Views are generated together with the Model Target database in the Model Target Generator desktop tool provided by Vuforia. There are different types of Model Target databases, and for each database one or more Guide Views can be created to help detection and tracking of the object. The Guide View is used differently depending on the type of Model Target database.

  • Standard Model Target databases can have one or multiple Guide Views, which can be switched between manually. This is useful if you are tracking a large object where only parts of the object are to be tracked, or if the user is meant to approach the object from a certain position. Alternatively, multiple Guide Views can be used to carry out a series of tasks chronologically.
  • Advanced Model Target databases can contain multiple Model Targets, each with one or multiple Guide Views. The Vuforia SDK automatically detects the object in view and displays the Guide View that most closely matches that object.
  • An Advanced Model Target 360 database can contain one Model Target with one Guide View. The Guide View for a Model Target 360 is not displayed on the screen as with the other databases, because the object can be detected from any angle. However, the Guide View is used to select a detection range and must be generated along with the Model Target database.

For an indication of how Guide Views work within a typical Model Target application, see Introduction to Model Targets in Unity and the Model Target Test App User Guide.

Choosing a good Guide View

For stable detection and tracking of your object, choose a Guide View position that gives a diagonal view (i.e. an angle from which two or three sides of the object can be seen at the same time, as shown in the left image below) and that includes as much of the object as possible. Try to avoid a position that presents a fronto-parallel view of the object (i.e. do not choose a view that is "square on" to one side of the object; right image below). In addition, try to avoid a Guide View position that makes the object appear to have many parallel lines or edges.

Favorable diagonal Guide View alignment

Undesired Guide View with less object visibility and many parallel lines

NOTE: Use the navigation buttons on the left side of the Model Target Generator window to move around the object when you are choosing your Guide Views.

If your object is very large, it may be difficult to find a Guide View that is near enough to the object and still shows the entirety of the object. If this is the case, try to choose a Guide View that captures the object from an angle with the most unique features. If your object has areas with large flat surfaces, try to avoid these areas, and instead find an angle of view where unique shapes in the object are more apparent.

To aid the process of selecting a good Guide View, a dashed-line frame is displayed in "model view" mode. To set up the optimal Guide View for your use case, you can switch between different frames during positioning. Landscape and portrait modes for handheld devices and a dedicated HoloLens mode take into account the correct field of view of the device. The created Model Target databases are device agnostic, but the optimal result can be tuned using this feature.

Multiple Guide Views

It is possible to define multiple Guide Views for a single Model Target and then switch between these different Guide Views at runtime. For example, you might have a service manual app with repair guides for various parts of a machine, and you want the user to be able to select which part they want to look at from an in-app menu. See the Model Targets API Overview for details on how to add such functionality to your app.
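As a minimal sketch of this pattern (the ModelTargetHandle type and activateGuideViewFor function are hypothetical stand-ins; the actual calls for listing and activating Guide Views depend on your Vuforia Engine version and are described in the Model Targets API Overview), the snippet below maps an in-app menu selection to the Guide View that should become active:

```cpp
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical stand-in for a loaded Model Target with several Guide Views.
// The real SDK calls for enumerating and activating Guide Views are
// version specific; see the Model Targets API Overview.
struct ModelTargetHandle {
    std::vector<std::string> guideViewNames; // e.g. one per repair task
    std::size_t activeGuideView = 0;
};

// Map an in-app menu selection (e.g. a repair guide) to the matching Guide
// View and make it the active one, so the user is guided to the right part
// of the machine before tracking starts.
bool activateGuideViewFor(ModelTargetHandle& target, const std::string& menuItem) {
    for (std::size_t i = 0; i < target.guideViewNames.size(); ++i) {
        if (target.guideViewNames[i] == menuItem) {
            target.activeGuideView = i; // with the real SDK, set the active Guide View here
            return true;
        }
    }
    return false; // no match: keep the currently active Guide View
}

int main() {
    ModelTargetHandle machine{{"Front panel", "Filter compartment", "Motor unit"}};
    if (activateGuideViewFor(machine, "Filter compartment")) {
        std::cout << "Active Guide View: "
                  << machine.guideViewNames[machine.activeGuideView] << "\n";
    }
}
```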

Multiple Guide Views can be added manually by positioning the camera around the object and adding individual Guide Views one by one. To set up Guide Views placed at regular positions around the object, you can use the 2-views, 4-views, and 6-views buttons. Selecting a preset creates the corresponding individual Guide Views automatically. Positions, viewing directions, and ranges set up this way are guaranteed not to overlap and are ready for training, which is particularly important.
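The sketch below illustrates the idea behind these presets under the assumption that views are spaced at equal azimuth steps around the object, with each recognition range kept slightly narrower than one step so that neighbouring ranges leave a small gap; the exact positions and ranges created by the Model Target Generator may differ:

```cpp
#include <iostream>
#include <vector>

// One Guide View placed on a circle around the object, with the azimuth
// range (relative to its own position) in which it should be recognized.
struct GuideViewSlot {
    double azimuthDeg;   // where the view sits around the object
    double halfRangeDeg; // recognition range extends +/- this many degrees
};

// Equal steps of 360/count degrees; each view owns a little less than half
// a step on either side so that adjacent ranges never overlap.
std::vector<GuideViewSlot> presetViews(int count, double gapDeg = 10.0) {
    std::vector<GuideViewSlot> slots;
    const double step = 360.0 / count;
    for (int i = 0; i < count; ++i) {
        slots.push_back({i * step, (step - gapDeg) / 2.0});
    }
    return slots;
}

int main() {
    // "4-views" style preset: views at 0, 90, 180, 270 degrees, each +/- 40.
    for (const GuideViewSlot& s : presetViews(4)) {
        std::cout << "view at " << s.azimuthDeg << " deg, range +/- "
                  << s.halfRangeDeg << " deg\n";
    }
}
```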

A Guide View can also be set to use the Advanced 360 detection feature for instantaneous detection and tracking from all sides. To do so, select the 360° button and create one 360° Guide View. Although the image is not needed for detection and tracking, the Guide View is still used to define the recognition range.

See Advanced Model Target Databases and Getting Started with Advanced Model Target 360 for more information and further guidance.

 


UX Considerations

Advanced Model Targets do not display Guide Views before the app has detected a Model Target object. Similarly, Advanced Model Target 360 does not use Guide Views in the user experience.

Therefore, in cases where users are unaware of which objects are trackable or are unsure of how Model Targets work, and no Guide View is displayed, we recommend showing a viewfinder UX on the screen to help users position themselves so that tracking can begin.
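A minimal sketch of that viewfinder pattern is shown below. The FrameState struct and its modelTargetTracked flag are placeholders for whatever tracking status your Vuforia Engine integration reports each frame; only the show/hide logic is the point here:

```cpp
#include <iostream>

// Placeholder for the per-frame tracking information your Vuforia
// integration exposes; the real status values depend on the Engine version.
struct FrameState {
    bool modelTargetTracked = false;
};

// Show the viewfinder overlay while nothing is tracked, so the user keeps
// scanning the scene, and hide it once tracking has started.
void updateViewfinder(const FrameState& frame, bool& viewfinderVisible) {
    const bool shouldShow = !frame.modelTargetTracked;
    if (shouldShow != viewfinderVisible) {
        viewfinderVisible = shouldShow;
        std::cout << (viewfinderVisible ? "show viewfinder overlay\n"
                                        : "hide viewfinder, start AR content\n");
    }
}

int main() {
    bool viewfinderVisible = false;
    updateViewfinder({false}, viewfinderVisible); // nothing tracked yet -> show overlay
    updateViewfinder({true},  viewfinderVisible); // target detected -> hide overlay
}
```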

A symbolic Guide View may also be employed for Model Targets with the 360° feature. This can be an icon or a simplified visual that informs users of the shape of the object.

Have a look at the Vuforia Model Targets sample apps, which demonstrate the best-practice UX we recommend in combination with the different types of Model Target databases. Another approach is demonstrated in the images below.

Standard Model Target Sample

Advanced Model Target Sample

Advanced Model Target 360 Sample

Advanced Model Target Databases

With the Model Target Generator (MTG), you can train Advanced Model Target databases, which allow your app either to automatically switch between different Guide Views based on the user's position and angle relative to one or more objects, or to detect and track a single object from any angle.

For example, you might have a marketing app for a new car model, and you want to highlight different features of the car when the user points their device at them. Or, you might have a very large object, and you want the AR experience to be different depending on whether the user is approaching the object from the front or from the back.

Target Recognition Range

For Guide Views to function as expected with Advanced Model Target databases, you may need to set up the Target Recognition Range for each Guide View. The Target Recognition Range represents the range of positions and relative angles for which a given Guide View is appropriate.

  • Advanced databases use the Target Recognition Range as the volume in which a specific Guide View is recognized. After recognition, the actual Guide View image appears on the screen to support manual alignment by the user.
  • 360° Guide Views use the Target Recognition Range as the set of positions from which the object can be detected, from any chosen angle, and instantly tracked. This is why an Advanced Model Target 360 database can contain only one Guide View. The actual Target Recognition Range depends on your settings; it defaults to all the way around the vertical axis of the object, with elevation settings from high to front-on.

The Model Target Generator provides defaults that work for many applications, but you may want to edit them to better fit your use case. For example, if you want to present different content depending on whether the user approaches your object from the front or from the back, you would create two Guide Views and set the Target Recognition Range of the first Guide View to cover approaches from the front of the object, and that of the second Guide View to cover approaches from the back.

Setting the Target Recognition Range

You can set the Target Recognition Range for a Guide View using the Model Target Generator app.

Open the Target Recognition Range panel by clicking the small 3D cube icon at the top right of a Guide View preview image. This opens the Target Recognition Range panel on the right side of the window.

Viewing Angles

To configure which Guide View should be activated for a particular viewing angle toward an object, the Model Target Generator lets you control the range of viewing angles that will activate a given Guide View by adjusting the azimuth (green), elevation (red), and roll (blue) angle ranges relative to the Guide View position.

For example, with the default range (-45º,+45º) for azimuth (the green section in the image below), this Guide View (represented by the small black cone) will be active only when the user is viewing the object from a position that lies inside the coloured areas, i.e. from an angle that is less than 45º away from the Guide View position:

Model Target Generator Recognition Range UI showing -45 to 45 azimuth range

In contrast, with the maximum range (-180º,+180º) for azimuth, this Guide View will always activate when the object is in view, regardless of which side you are viewing the object from:

Model Target Generator Recognition Range UI showing -180 to 180 azimuth range

NOTE: In both of these cases, the Guide View position (the small gray cone) stays in the same place. This represents the position where the user needs to hold their device in order to actually start tracking the object. The Guide View image is displayed when the system detects that the user is inside the Target Recognition Range, but the user still needs to move their device to the actual Guide View position in order to start tracking.
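The check that these ranges describe can be illustrated with a small, self-contained sketch (the Engine performs this internally; the function below is only for intuition, and elevation and roll work the same way):

```cpp
#include <cmath>
#include <iostream>

// Wrap an angle in degrees into (-180, 180].
double wrapDeg(double a) {
    a = std::fmod(a, 360.0);
    if (a > 180.0)   a -= 360.0;
    if (a <= -180.0) a += 360.0;
    return a;
}

// True if the user's azimuth lies within [minDeg, maxDeg] of the Guide View
// position. With the default (-45, +45) the Guide View is only considered
// when the user is less than 45 degrees away from it; with (-180, +180) it
// is considered from every side of the object.
bool insideAzimuthRange(double userAzimuthDeg, double guideViewAzimuthDeg,
                        double minDeg, double maxDeg) {
    const double offset = wrapDeg(userAzimuthDeg - guideViewAzimuthDeg);
    return offset >= minDeg && offset <= maxDeg;
}

int main() {
    // Guide View placed at azimuth 0 with the default +/-45 degree range:
    std::cout << insideAzimuthRange(30.0,  0.0, -45.0,  45.0) << "\n"; // 1 (inside)
    std::cout << insideAzimuthRange(120.0, 0.0, -45.0,  45.0) << "\n"; // 0 (outside)
    // Same user position with the maximum (-180, +180) range:
    std::cout << insideAzimuthRange(120.0, 0.0, -180.0, 180.0) << "\n"; // 1
}
```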

Also note that for databases with multiple Guide Views for your object, it may make sense to set up Target Recognition Ranges so that a different Guide View is active depending on which side the user approaches your object from, or on which component of your object the user is pointing their device at. You can use the 2-views preset for this.

Alternatively, it may make sense to have the same Guide View activate on all sides of the object, for example if you want the user to move to a particular side of your object so that the AR experience has a consistent entry point. If this is the case, make sure that the Guide View image overlay rendered in your app clearly indicates where the user should stand relative to the object in order to start the AR experience.

Do not overlap ranges

If you have multiple Guide Views for your object, it is important that the detection ranges do not overlap. The following screenshots from the Model Target Generator show non-overlapping Guide Views:

Model Target Generator Recognition Range UI composite showing four guide views with 90º ranges that do not overlap.

In the following example, the azimuth ranges overlap, which means that Vuforia Engine will be unable to reliably select a consistent Guide View for the overlapping range:

Model Target Generator Recognition Range UI composite showing two guide views, the first with a 90º range and the second with a 180º range that overlaps the first.

To avoid any ambiguities, there should ideally be a gap between the recognition ranges. For example, suppose you have two Guide Views for an object, one covering the front side and one covering the back. If the first Guide View covers azimuth angles from -85º to 85º, then the other Guide View should cover, at most, the corresponding -85º to 85º range measured from the other side of the object, so that there is at least a 10º gap between the edges of the two ranges.
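As a self-contained sanity check of such a configuration, the sketch below converts each Guide View's relative azimuth range into directions around the object and flags any direction that would activate both Guide Views; the values reproduce the front/back example above and are otherwise illustrative:

```cpp
#include <cmath>
#include <iostream>

// A Guide View's azimuth recognition range: placed at centerDeg around the
// object, active from centerDeg+minDeg to centerDeg+maxDeg.
struct AzimuthRange {
    double centerDeg;
    double minDeg;
    double maxDeg;
};

double wrapDeg(double a) {
    a = std::fmod(a, 360.0);
    if (a > 180.0)   a -= 360.0;
    if (a <= -180.0) a += 360.0;
    return a;
}

bool contains(const AzimuthRange& r, double azimuthDeg) {
    const double offset = wrapDeg(azimuthDeg - r.centerDeg);
    return offset >= r.minDeg && offset <= r.maxDeg;
}

// Coarse overlap test: walk the circle in 1-degree steps and flag any
// viewing direction that falls inside both recognition ranges.
bool rangesOverlap(const AzimuthRange& a, const AzimuthRange& b) {
    for (int deg = 0; deg < 360; ++deg) {
        if (contains(a, deg) && contains(b, deg)) return true;
    }
    return false;
}

int main() {
    // Front view covering -85..+85 and back view (at 180 degrees) covering
    // -85..+85 relative to its own position: no overlap, 10-degree gaps.
    AzimuthRange front{0.0, -85.0, 85.0};
    AzimuthRange back{180.0, -85.0, 85.0};
    std::cout << std::boolalpha << rangesOverlap(front, back) << "\n"; // false

    // Widening the back view to -95..+95 makes the two ranges overlap.
    AzimuthRange wideBack{180.0, -95.0, 95.0};
    std::cout << rangesOverlap(front, wideBack) << "\n"; // true
}
```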

Distance

The width (thickness) of an angle range section/ring indicates the range of distances from which the object can be recognized for this detection position. The default range is between 0.75 and 1.5 times the distance from the Guide View to the object. The range can be set either relative to the distance of the detection position or using absolute scene-unit values (which can be useful if you know the exact size of the room your AR app will be run in, and therefore the maximum distance the camera will be from the object).
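For intuition, the default relative band can be written out as a simple check (scene units are assumed to be meters in the example; the Engine applies this internally):

```cpp
#include <iostream>

// Default relative distance band described above: the object is recognized
// for this Guide View when the camera is between 0.75x and 1.5x the Guide
// View's own distance from the object.
bool insideDefaultDistanceRange(double cameraDistance, double guideViewDistance) {
    return cameraDistance >= 0.75 * guideViewDistance &&
           cameraDistance <= 1.5  * guideViewDistance;
}

int main() {
    const double guideViewDistance = 2.0; // Guide View authored 2 m from the object
    std::cout << insideDefaultDistanceRange(1.8, guideViewDistance) << "\n"; // 1: recognizable
    std::cout << insideDefaultDistanceRange(3.5, guideViewDistance) << "\n"; // 0: too far away
}
```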

NOTE: For 360° Guide Views, the most suitable distance for the Target Recognition Range is computed automatically. This setting cannot be edited manually in the recognition range menu.

Target Extent

The Target Extent bounding box defines which parts of the model will actually be used in recognizing the object and/or discerning between Guide Views. Each Guide View has its own Target Extent bounding box, which you can modify by clicking on Edit Target Extent in the Target Recognition Range UI. 

By default, the Target Extent bounding box covers the entire model. However, you may want to restrict the area of the object that the user can use to activate this particular Guide View or initiate the tracking with the 360˚ Guide View. For example, suppose your object is a car, and you are making a marketing app demonstrating individual features of the car in the trunk area. In this case it might make sense to define a Guide View looking at the trunk of the car and restrict the Target Extent bounding box to just the rear section of the car, and another Guide View looking at the engine compartment of the car and restrict the Target Extent bounding box to just the front section of the car.

NOTE: If you restrict the Target Extent bounding box, the whole object will still be used for tracking; the Target Extent controls only the region of the object that can be used to recognize and activate a particular Guide View or to initiate tracking of an Advanced Model Target 360.
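Conceptually, the Target Extent is just a box in model coordinates that gates which parts of the model count towards recognition, as in this illustrative sketch (the trunk-only box and its dimensions are made up for the car example above):

```cpp
#include <iostream>

struct Point3 { double x, y, z; };

// Axis-aligned Target Extent box in model coordinates. Only model regions
// inside the box contribute to recognizing/activating this Guide View;
// tracking still uses the whole model.
struct TargetExtent {
    Point3 min;
    Point3 max;
    bool contains(const Point3& p) const {
        return p.x >= min.x && p.x <= max.x &&
               p.y >= min.y && p.y <= max.y &&
               p.z >= min.z && p.z <= max.z;
    }
};

int main() {
    // Hypothetical box around just the rear section of a car model (meters).
    TargetExtent trunkOnly{{-1.0, 0.0, -2.3}, {1.0, 1.5, -0.8}};

    Point3 trunkLid{0.0, 1.0, -1.5};
    Point3 hood{0.0, 1.0, 1.8};
    std::cout << trunkOnly.contains(trunkLid) << "\n"; // 1: inside the Target Extent
    std::cout << trunkOnly.contains(hood)     << "\n"; // 0: ignored for this Guide View
}
```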

Performance Considerations

A wider detection range may create more ambiguity when running your app and attempting detection. With that in mind, you will probably want to keep the detection ranges as small as possible, to match your expected usage scenarios.

For example, an object that cannot be viewed from behind (like an object mounted on a wall) doesn't need a full 360º azimuth angle range; and an object that will only be observed from above, and never from below, only needs an elevation range that covers the top half of the object.

Learn More

Model Targets Overview

Model Target Generator User Guide

Advanced Model Target Databases