This article explains how to use an arbitrary scale in your virtual Unity scene despite Vuforia Engine requiring all targets to be configured accurately in meter scale.
Vuforia Engine requires the size of all targets to be defined in meters. It is important that the size is configured as accurately as possible since it is used to compute the pose of the target – its position and orientation in space.
Vuforia Fusion unites tracking information from the camera feed and the device sensors. If the size of a target is not configured correctly, poses computed from the camera image and the sensors do not match, which will cause various tracking issues.
Virtual Scene Scale Factor is a simple mechanism to transform all your Vuforia Targets into a virtual scene scale you wish to use.
Setting the Physical Scale of a Model Target
The physical size of a target is set in the dataset file, but can also be configured in the Unity inspector for a particular target:
Because targets need to be configured in meter scale, by default, all virtual content used as augmentations also needs to be scaled in meters to match the same size.
This however may cause various issues with your digital assets. For example, you might want to create an AR experience where a virtual character interacts with a toy car model using Vuforia Model Targets.
In this scenario, your virtual content will likely be at a different scale than the physical target: the toy car in this example is at 1:18 scale, but the digital character is 1.75 m tall.
Scaling down the digital assets to match the physical size of the target may be possible, but often causes issues in Unity, in particular if you are using physics or particle systems.
Instead, use the Virtual Scene Scale Factor setting to transform the poses of your Vuforia Targets into whatever virtual scene scale you are using. The setting is found in the Vuforia Configuration.
The Virtual Scene Scale Factor configures how many Unity scene units correspond to one meter in the physical world. Poses reported by Vuforia will be transformed accordingly, but scene content will not be scaled.
In our example, we want the poses calculated from our 1:18 toy model to be transformed into a virtual scene that contains assets matching the size of a real-sized car. This is done by setting the Virtual Scene Scale Factor to 18 in the Vuforia Configuration. Your Vuforia Targets, i.e., Model Targets and Image Targets, are automatically adjusted in scale as seen in the screenshot above, but no application content is changed; this includes the Occlusion Object and the Target Representation.
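To make the transformation concrete, here is a minimal sketch of the arithmetic behind the setting. This is plain C# for illustration only, not Vuforia Engine API code, and the variable names are ours:

```csharp
// Illustrative only: the arithmetic the Virtual Scene Scale Factor applies
// to reported poses, not actual Vuforia Engine code.
float virtualSceneScaleFactor = 18f;   // 1 m in the real world == 18 Unity units

// The 1:18 toy car is detected 0.1 m (10 cm) in front of the camera...
float physicalDistanceMeters = 0.1f;

// ...so its pose is reported 1.8 scene units away, where it lines up with
// full-size assets such as the 1.75 m tall character.
float sceneDistanceUnits = physicalDistanceMeters * virtualSceneScaleFactor; // 1.8
```

Only the reported poses are scaled this way; your scene content keeps the size you authored it at.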
At runtime, the Virtual Scene Scale is automatically applied, allowing you to build your scene in whatever scale you need while tracking the target with the correct physical scale.
NOTE: The virtual scene scale factor is not supported on the HoloLens and Magic Leap devices. On those platforms, head tracking poses are not controlled by Vuforia Engine and all scene content is required to be in meters. Therefore, the Virtual Scene Scale Factor is forced to 1.0.
Setting the Virtual Scene Scale Factor at Runtime
In some cases, it might be necessary to change the Virtual Scene Scale Factor at runtime.
For instance, in a similar example as above, you might be building your AR experience for a real-size car. However, for testing your application, you are using a toy model of the same car.
By setting the following property from any script, you can change the Virtual Scene Scale Factor at runtime to toggle between the real and toy-sized car models:
VuforiaConfiguration.Instance.Vuforia.VirtualSceneScaleFactor = 18f;
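For example, a small helper script along the following lines could switch between the two scales. This is a sketch: the class and method names are illustrative, but the `VuforiaConfiguration` property is the one shown above:

```csharp
using UnityEngine;
using Vuforia;

// Illustrative sketch: toggles the Virtual Scene Scale Factor between
// the full-size car (1x) and the 1:18 toy model used for testing.
public class ScaleToggle : MonoBehaviour
{
    const float RealCarScale = 1f;   // scene units per meter for the real car
    const float ToyCarScale = 18f;   // scene units per meter for the 1:18 toy

    public void UseRealCar()
    {
        VuforiaConfiguration.Instance.Vuforia.VirtualSceneScaleFactor = RealCarScale;
    }

    public void UseToyCar()
    {
        VuforiaConfiguration.Instance.Vuforia.VirtualSceneScaleFactor = ToyCarScale;
    }
}
```

You could hook these methods up to UI buttons, or call them from your own test code, to switch scales while the app is running.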
NOTE: You will also need to adjust the size of the target at runtime to match the model being tracked.
Best Practices for Managing Scaling of Model Targets