The Vuforia SDK supports the following digital eyewear devices:
| Display Type | Example Device |
|---|---|
| Optical See-Through | Epson Moverio BT-200 |
| Video See-Through | Samsung Gear VR |
Vuforia's support for digital eyewear enables the creation of mixed reality AR/VR gaming and educational experiences for video see-through devices like the Samsung Gear VR, and of AR experiences for optical see-through devices such as the ODG R-7, the Epson Moverio BT-200, and the Microsoft HoloLens.
Digital eyewear apps can be developed for Android and UWP using Unity (C#) or natively using Java and C++. Sample projects that demonstrate how to implement the digital eyewear APIs are available in both C# and Java.
Vuforia also provides an eyewear calibration application that generates personalized calibration profiles for end users of optical see-through devices. Custom calibration enables content to be accurately placed on real-world targets by accounting for the user's unique facial geometry. The Calibration Assistant is installed on the device being calibrated and supports the creation of multiple user calibration profiles.
Registering digital content accurately against the real-world environment is a key challenge when developing for optical see-through devices, because facial geometry and vision differ from user to user: the spatial and optical relationships between the user's eyes and the displays are slightly different for every user. For this reason, custom calibration is recommended for all users, and especially for those who wear glasses or contact lenses. The Vuforia SDK uses personalized calibration profiles to optimize the accuracy of static content registration.
Custom user profiles are created using the Vuforia Calibration Assistant. The Calibration Assistant guides users through a simple calibration process that determines the spatial relationships between the user's eyes, the display lens, and the device camera.
Working with Digital Eyewear
The workflow for developing digital eyewear apps with the Vuforia Unity extension, or with the Java and C++ APIs, is very similar to the workflow for mobile apps. The significant difference is that you need to configure the app for stereo rendering. If you are developing a mixed reality AR/VR app, you will also need to implement app logic that transitions between the AR and VR modes of the experience.
Stereo viewport rendering enables stereo displays to present realistic 3D experiences.
To render the video background in stereo on video see-through devices, such as the Gear VR, you will need to configure your scene to use Background Texture Access. This technique is demonstrated in the Stereo Rendering samples for Unity and Android.
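Stereo rendering draws the scene twice, once per eye, into two halves of the display. The following is a minimal illustrative sketch of the side-by-side viewport split; the `Viewport` class and split logic here are hypothetical helpers, not the Vuforia API, which derives these values from the SDK's rendering primitives.

```java
// Illustrative sketch: splitting a landscape display into side-by-side
// viewports for stereo rendering. Not the Vuforia API.
public class StereoViewports {
    // Simple value holder for a viewport rectangle in pixels.
    static final class Viewport {
        final int x, y, width, height;
        Viewport(int x, int y, int width, int height) {
            this.x = x; this.y = y; this.width = width; this.height = height;
        }
    }

    // The left eye draws into the left half of the display.
    static Viewport leftEye(int displayWidth, int displayHeight) {
        return new Viewport(0, 0, displayWidth / 2, displayHeight);
    }

    // The right eye draws into the right half.
    static Viewport rightEye(int displayWidth, int displayHeight) {
        return new Viewport(displayWidth / 2, 0, displayWidth / 2, displayHeight);
    }

    public static void main(String[] args) {
        // A Gear VR-class 2560x1440 panel becomes two 1280x1440 halves.
        Viewport left = leftEye(2560, 1440);
        Viewport right = rightEye(2560, 1440);
        System.out.println(left.width + "x" + left.height + " at x=" + left.x);
        System.out.println(right.width + "x" + right.height + " at x=" + right.x);
    }
}
```

In a real app, each eye's render pass would also use a per-eye projection matrix from the calibration profile; the split above only covers the viewport geometry.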
Stereo rendering in Unity
Stereo video background texture rendering is automatically enabled or disabled when the ARCamera is configured for its stereo camera mode. See: Configuring the ARCamera Prefab for Digital Eyewear.
Stereo rendering using the Java API
The Stereo Rendering sample for Java shows how to detect whether your app is running on a digital eyewear device and how to then configure its video background rendering for stereo. The implementation shown in the sample enables you to dynamically enable stereo rendering based on your app's device context. See: Using the Android Stereo Rendering Sample.
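The detect-then-configure flow can be sketched as below. This is a hypothetical illustration only: the model-string allow-list and method names are invented for clarity, and the actual sample queries the Vuforia SDK's device abstraction rather than hard-coded model names.

```java
// Illustrative sketch of choosing a rendering mode at runtime based on
// the device the app is running on. The allow-list and names here are
// hypothetical; the Vuforia sample uses the SDK's device detection.
public class StereoModeSelector {
    // Hypothetical check for stereo-capable eyewear devices.
    static boolean isEyewearDevice(String deviceModel) {
        return deviceModel.contains("BT-200") || deviceModel.contains("Gear VR");
    }

    // Pick stereo rendering for eyewear, mono for handheld devices.
    static String renderingMode(String deviceModel) {
        return isEyewearDevice(deviceModel) ? "STEREO" : "MONO";
    }

    public static void main(String[] args) {
        System.out.println(renderingMode("Epson BT-200"));
        System.out.println(renderingMode("Generic Phone"));
    }
}
```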
Immersive AR/VR experiences can be created using Vuforia's vision tracking in conjunction with rotational tracking provided either by the Rotational Device Tracker or by a third-party VR SDK. To develop these experiences, you will need to implement app logic that transitions the pose of the scene camera between the app's AR and VR modes. You can also use Vuforia targets and Virtual Buttons for interactions in both AR and VR.
These techniques are demonstrated in the AR/VR samples and discussed in Best practices for hybrid VR/AR experiences. See also:
- Using the Rotational Device Tracker
- Using the MixedRealityController in Unity
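The AR-to-VR camera transition described above can be sketched as a pose interpolation over a transition interval. This is a minimal illustration under assumed representations: the plain position arrays and linear blend are hypothetical simplifications; the actual samples drive the camera from the tracker and a rotational pose (and typically interpolate rotation with a quaternion slerp).

```java
// Illustrative sketch of transitioning the scene camera between AR and
// VR modes by blending its position over time. Hypothetical, simplified.
public class ArVrTransition {
    static double lerp(double a, double b, double t) {
        return a + (b - a) * t;
    }

    // Blend a 3-component camera position between the AR pose (tracked
    // from a target) and the VR pose (a fixed virtual viewpoint).
    // t is clamped to [0, 1]: 0 = fully AR, 1 = fully VR.
    static double[] blendPosition(double[] arPos, double[] vrPos, double t) {
        double c = Math.max(0.0, Math.min(1.0, t));
        return new double[] {
            lerp(arPos[0], vrPos[0], c),
            lerp(arPos[1], vrPos[1], c),
            lerp(arPos[2], vrPos[2], c),
        };
    }

    public static void main(String[] args) {
        double[] ar = {0.0, 0.0, 0.0};   // camera at the tracked target origin
        double[] vr = {0.0, 1.6, -2.0};  // virtual viewpoint for the VR scene
        double[] mid = blendPosition(ar, vr, 0.5);
        System.out.printf("midpoint: %.2f %.2f %.2f%n", mid[0], mid[1], mid[2]);
    }
}
```

Per-frame, `t` would advance with elapsed time so the camera eases from the tracked AR pose into the VR scene instead of snapping.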
Designing targets for Digital Eyewear apps
There are two important considerations when designing targets for digital eyewear apps.
1. Target size
Targets are typically farther from the device camera in digital eyewear apps than in mobile apps, where the user can extend the device toward the target. For this reason, slightly larger targets are often required to ensure reliable detection and tracking. We recommend a minimum target size of approximately 150 mm square.
2. Target scale
The unit scale for optical see-through digital eyewear apps is set to millimeters. You will need to define your Image Target and Multi-Target widths, and your Cylinder Target side lengths, in millimeters in the Target Manager so that your content is scaled as expected when it is rendered. To do this, enter the true width or side length of the target in millimeters in the corresponding field of the Add Target dialog in the Target Manager.
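The two rules above can be captured in a small helper. This is an illustrative sketch only; the class and method names are hypothetical, and the 150 mm figure is the recommendation stated above.

```java
// Illustrative helper for the target-design rules above: optical
// see-through eyewear apps use millimeter units, and a minimum target
// size of ~150 mm square is recommended. Names here are hypothetical.
public class TargetSizing {
    static final double MIN_TARGET_SIZE_MM = 150.0;

    // Convert a physical width measured in meters to the millimeter
    // value entered in the Target Manager's width field.
    static double metersToMillimeters(double meters) {
        return meters * 1000.0;
    }

    // Check a target width against the recommended minimum.
    static boolean meetsMinimumSize(double widthMm) {
        return widthMm >= MIN_TARGET_SIZE_MM;
    }

    public static void main(String[] args) {
        double widthMm = metersToMillimeters(0.2); // a 20 cm printed target
        System.out.println("Target Manager width: " + widthMm + " mm");
        System.out.println("Meets minimum: " + meetsMinimumSize(widthMm));
    }
}
```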
Integrating the Oculus SDK
To enable Vuforia to use the Oculus SDK, you'll need to follow the steps in the Developing for the Gear VR article.
Integrating the Cardboard SDK
To enable Vuforia to use the Cardboard SDK, you'll need to follow the steps in the Developing for Google Cardboard article.
The Vuforia Digital Eyewear samples are based on their corresponding mobile device samples and maintain the same design and structure.

Digital Eyewear Unity Samples
- HoloLens - The HoloLens sample shows how to attach an AR experience to an image and enable extended tracking
- Stereo Rendering - The stereo rendering sample for Unity demonstrates how to configure a Vuforia Unity scene for stereo displays
- AR/VR - The Vuforia AR/VR sample for Unity demonstrates how to develop mixed reality AR/VR experiences

See: Configuring the ARCamera Prefab for Digital Eyewear

Digital Eyewear Java Samples
- Stereo Rendering - The stereo rendering sample for Java demonstrates how to configure a Vuforia Java app for stereo displays

See: Using the Android Stereo Rendering Sample