Integrating Gear VR and the AR/VR Sample in Unity 5.3 and above

Unity 5.1 introduced built-in support for Virtual Reality (VR) applications, and further improvements in Unity 5.3 enable Vuforia to take advantage of this built-in VR capability as well. These changes make it easy to develop AR/VR apps in Unity and greatly reduce the coding required to integrate supported VR SDKs.

Supported Devices: Samsung Gear VR

Unity Support Notes: There is a known issue with Unity 5.3.3p2-p3 and Unity 5.3.4f1 that shows a black screen when Unity's VR support option is combined with Vuforia 5.0 and 5.5. A workaround is provided at the end of this article.

Vuforia Integration: Vuforia 5.0 and 5.5 can integrate with Unity 5.3's built-in VR functionality. This enables developers to build Vuforia AR/VR apps for the Gear VR without requiring the Oculus SDK. There are two approaches:

  1. Unity VR: Enable Unity's VR support, build a simple camera rig, and link it to the ARCamera.
  2. Unity VR with Oculus Utilities for Unity 5: Enable Unity's VR support, link the OVRCameraRig prefab to the ARCamera, and gain access to additional Oculus-specific APIs.

Technique #1. Using Vuforia with Unity 5.3's Built-in VR Functionality

1. Create a new Unity project and import the arvr-x-y-z.unitypackage
2. Open the Vuforia-3-AR-VR scene
3. Open the Android Player Settings (File > Build Settings... > Player Settings) and enable the following checkbox options:
  • Multithreaded Rendering
  • Virtual Reality Supported
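If you automate your builds, the same two options can be toggled from an editor script. This is a minimal sketch against the Unity 5.3 editor API; the menu path and class name below are placeholders of our own, not part of the sample:

```csharp
// Editor script: place in an Editor/ folder inside Assets.
// Enables the same two Player Settings as the manual steps above.
using UnityEditor;

public static class ArVrPlayerSettings
{
    [MenuItem("Tools/Enable AR-VR Player Settings")]
    public static void Enable()
    {
        PlayerSettings.virtualRealitySupported = true; // "Virtual Reality Supported"
        PlayerSettings.mobileMTRendering = true;       // "Multithreaded Rendering"
    }
}
```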
4. Create the following camera rig in the Vuforia-3-AR-VR scene using the hierarchical outline below:
GameObject               Type   Position
CameraRig                eGO    x=0, y=2, z=-1
  LeftCamera             Cam    x=0, y=0, z=0
    TrackableParent      eGO    x=0, y=0, z=0
      ImageTargetStones  GO     *
    VuforiaCenterAnchor  eGO    x=0, y=0, z=0
  RightCamera            Cam    x=0, y=0, z=0
eGO = Empty GameObject, Cam = Camera
* Make all targets (i.e. Trackables) in the scene children of the TrackableParent object. In the AR-VR sample, the Trackable is the ImageTargetStones GameObject.
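For reference, the rig above can also be assembled from code. This is a sketch assuming the nesting shown in the outline; the class name is our own:

```csharp
using UnityEngine;

// Sketch: builds the Technique #1 camera rig programmatically,
// mirroring the hierarchical outline above.
public static class CameraRigBuilder
{
    public static GameObject Build()
    {
        var rig = new GameObject("CameraRig");
        rig.transform.position = new Vector3(0f, 2f, -1f);

        var left = new GameObject("LeftCamera", typeof(Camera));
        left.transform.SetParent(rig.transform, false);

        // Targets (Trackables) in the scene become children of TrackableParent.
        var trackableParent = new GameObject("TrackableParent");
        trackableParent.transform.SetParent(left.transform, false);

        var centerAnchor = new GameObject("VuforiaCenterAnchor");
        centerAnchor.transform.SetParent(left.transform, false);

        var right = new GameObject("RightCamera", typeof(Camera));
        right.transform.SetParent(rig.transform, false);

        return rig;
    }
}
```

SetParent with worldPositionStays = false keeps each child at local position (0, 0, 0), matching the table.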
5. Now configure the following GameObjects:
  • LeftCamera: TargetEye = Left
  • RightCamera: TargetEye = Right
  • LeftCamera & RightCamera:
    • Clear Flags = Solid Color
    • Background = Black
    • Clipping Planes [Near] = 0.05 *
    • Clipping Planes [Far] = 300 *
* We recommend Near = 0.05 and Far = 300 for AR/VR scenes, but you can adjust to suit your 3D scene.
6. Attach the VRIntegrationHelper.cs script to both the LeftCamera and RightCamera
7. On the LeftCamera, enable the checkbox Is Left and drag the TrackableParent object onto the Trackable Parent property in the Inspector.
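Steps 5-7 can likewise be applied from a script. This sketch assumes VRIntegrationHelper exposes its Is Left checkbox and Trackable Parent property as public IsLeft and TrackableParent fields; verify the actual field names and types against the script in your project:

```csharp
using UnityEngine;

// Sketch: per-eye camera configuration matching steps 5-7 above.
public static class EyeCameraSetup
{
    public static void Configure(Camera cam, bool isLeft, Transform trackableParent)
    {
        // Step 5: clear flags, background, and clipping planes.
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;
        cam.nearClipPlane = 0.05f; // recommended for AR/VR; adjust to your scene
        cam.farClipPlane = 300f;
        // Target Eye (per-camera VR eye assignment, Unity 5.3+)
        cam.stereoTargetEye = isLeft ? StereoTargetEyeMask.Left
                                     : StereoTargetEyeMask.Right;

        // Step 6: attach the sample's helper script to each eye camera.
        var helper = cam.gameObject.AddComponent<VRIntegrationHelper>();

        // Step 7: only the left camera gets Is Left and the Trackable Parent.
        if (isLeft)
        {
            helper.IsLeft = true;                     // assumed field name
            helper.TrackableParent = trackableParent; // assumed field name and type
        }
    }
}
```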

Technique #2. Using Vuforia with Unity 5.3's Built-in VR Functionality and Oculus Utilities

1. Download the Oculus Utilities for Unity 5
2. Create a new Unity project and import the following:
  • arvr-x-y-z.unitypackage
  • OculusUtilities.unitypackage
3. Open the Vuforia-3-AR-VR.unity scene
4. Open the Android Player Settings (File > Build Settings... > Player Settings) and enable the following checkbox options:
  • Multithreaded Rendering
  • Virtual Reality Supported
5. Drag an instance of the OVRCameraRig prefab into the Hierarchy and position it at [0, 2, -1].
6. Now configure the CenterEyeAnchor fields:
  • Clear Flags = Solid Color
  • Background = Black
  • Clipping Planes [Near] = 0.05 *
  • Clipping Planes [Far] = 300 *
  • TargetEye = Left
* We recommend Near = 0.05 and Far = 300 for AR/VR scenes, but you can adjust these to suit your 3D scene.
7. Duplicate the CenterEyeAnchor, rename the copy CenterRightEyeAnchor, and set the following:
  • TargetEye = Right
8. Add two new Empty GameObjects (VuforiaCenterAnchor & TrackableParent) to the existing OVRCameraRig in the Vuforia-3-AR-VR scene, as shown in the partial hierarchical outline below:
GameObject               Type   Position
TrackingSpace
  CenterEyeAnchor
    TrackableParent      eGO    x=0, y=0, z=0
      ImageTargetStones  GO     *
    VuforiaCenterAnchor  eGO    x=0, y=0, z=0
eGO = Empty GameObject
* Make all targets (i.e. Trackables) in the scene children of the TrackableParent object. In the AR-VR sample, the Trackable is the ImageTargetStones GameObject.
9. Attach the VRIntegrationHelper.cs script to both the CenterEyeAnchor and CenterRightEyeAnchor
10. On the CenterEyeAnchor, enable the checkbox Is Left and drag the TrackableParent object onto the Trackable Parent property in the Inspector.

Camera Rig Binding Steps for the Two Integration Techniques

Vuforia can support third-party camera rigs by binding them to the ARCamera. Choose the binding steps appropriate to your Unity VR project:

Vuforia 5.0, 5.5, 6.2 Camera Binding Steps
[Screenshots: Oculus Binding with Vuforia 5.0; Native Binding with Vuforia 5.0; Native Binding with Vuforia 5.5]
1. Select the ARCamera in the Hierarchy
2. Configure the stereo camera settings for your Vuforia version:
  • Vuforia 5.0 (Inspector options for VuforiaBehaviour):
    • Bind Alternate Camera = checked
    • Synchronize Pose Updates = checked (synchronizes head pose updates between Vuforia and the Gear VR trackers)
    • Skew Frustum = checked
    • Camera Offset = 0
    • Viewer = Gear VR
  • Vuforia 5.5 (Inspector options for DigitalEyewearBehaviour):
    • Eyewear Type = Video See-Through
    • Stereo Camera Config = Gear VR (Oculus)
  • Vuforia 6.2 (VuforiaConfiguration asset):
    • Eyewear Type = Video See-Through
    • Stereo Camera Config = Gear VR (Oculus)
3. Drag the following GameObjects in the Hierarchy to the respective fields:
  • VuforiaCenterAnchor to Central Anchor Point
  • CenterEyeAnchor (Oculus) to Left Camera
  • CenterRightEyeAnchor (Oculus) to Right Camera
  • LeftCamera (Native) to Left Camera
  • RightCamera (Native) to Right Camera
4. Click the Add Vuforia Components button that appears under both the Left Camera and Right Camera fields in the Inspector.

Additional Build Details

  • Place Oculus Signature Files (OSIGs) in the following Unity project location: Assets/Plugins/Android/assets/
  • Include the Vuforia-3-AR-VR scene in Build Settings
  • Omit the Vuforia-1-About scene from Build Settings
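The scene list can also be set from an editor script. A sketch; the scene path and menu path below are placeholders, so substitute the scene's actual location in your project:

```csharp
using UnityEditor;

// Editor sketch: puts only the AR/VR scene into Build Settings.
public static class ArVrSceneList
{
    [MenuItem("Tools/Set AR-VR Build Scenes")]
    public static void Apply()
    {
        EditorBuildSettings.scenes = new[]
        {
            // Placeholder path: point this at Vuforia-3-AR-VR.unity in your project.
            new EditorBuildSettingsScene("Assets/Scenes/Vuforia-3-AR-VR.unity", true)
        };
    }
}
```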

Workaround for Unity 5.3.3p2-p3 and Unity 5.3.4f1

Open the VRIntegrationHelper.cs script located in Assets/Vuforia/Scripts/Utilities and replace OnPreRender() with the following, adding the IsProjectionMatrixValid() helper alongside it. The helper waits until Unity has populated every element of the camera's projection matrix before caching the matrix and pixel rect:

void OnPreRender()
{
    // OnPreRender is where the projection matrix and pixel rect
    // are set up correctly (for each camera individually),
    // so we use this callback to acquire that data.

    if (IsLeft && !mLeftCameraDataAcquired)
    {
        if (IsProjectionMatrixValid(mLeftCamera))
        {
            mLeftCameraMatrixOriginal = mLeftCamera.projectionMatrix;
            mLeftCameraPixelRect = mLeftCamera.pixelRect;
            mLeftCameraDataAcquired = true;
        }
    }
    // Sample the right camera only from its own OnPreRender call,
    // since the data is set up per camera.
    else if (!IsLeft && !mRightCameraDataAcquired)
    {
        if (IsProjectionMatrixValid(mRightCamera))
        {
            mRightCameraMatrixOriginal = mRightCamera.projectionMatrix;
            mRightCameraPixelRect = mRightCamera.pixelRect;
            mRightCameraDataAcquired = true;
        }
    }
}

// Returns true only when no element of the camera's projection matrix
// is NaN; Unity reports NaN values until the VR projection has been
// initialized for that camera.
private static bool IsProjectionMatrixValid(Camera cam)
{
    Matrix4x4 m = cam.projectionMatrix;
    for (int row = 0; row < 4; row++)
        for (int col = 0; col < 4; col++)
            if (float.IsNaN(m[row, col]))
                return false;
    return true;
}