Getting Started with VR Development

Checked with version: 5.3


Difficulty: Beginner

This content has been deprecated. The steps below may be out of date, but some principles are still valid. Please see our documentation and our getting started resources.

The Basics

To get started with the basics of VR development in Unity, you’ll need to ensure your hardware and software are set up as described in the previous article.

Once that’s done and you’ve installed Unity, ensure that the DK2 is connected and powered on before opening Unity. Check that the Demo Scene in the Oculus Configuration Utility is functioning correctly before continuing. You may need to set up a new user in the Oculus Configuration Utility before the Demo Scene will run.

Creating your first VR project

To get started, we will create a test VR project in Unity with a cube you can observe in VR. If you prefer to dive into more detailed samples of ready-made VR content, download the VR Samples project mentioned in the overview.

Step 1: Create a new empty project from the Unity Home Screen which loads when you first launch Unity.

Step 2: Make sure that PC, Mac & Linux Standalone is selected as the platform to use by visiting File > Build Settings from the top menu.

Build Settings

Step 3: Create a new cube (GameObject > 3D Object > Cube) and position it in front of the default Main Camera in your new empty scene using the Translate tool.

Basic Cube

Step 4: Save your scene (File > Save Scene).

Step 5: Go to Edit > Project Settings > Player and check the box to enable "Virtual Reality Supported".

Enable VR

Step 6: Enter Play mode by pressing Play at the top of the interface.

You should now be able to view your scene through the DK2, with one camera mirrored onto the Game view. Look around your scene; the camera in Unity will mirror the changes in position and rotation of the DK2.
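As a side note, the "Virtual Reality Supported" option from Step 5 can also be toggled from script via VRSettings.enabled, which can be convenient while testing. The snippet below is a minimal sketch; the V key binding is just an illustrative choice:

Code snippet

using UnityEngine;
using UnityEngine.VR;

public class ToggleVRMode : MonoBehaviour
{
    void Update()
    {
        // Switch VR rendering on or off with the V key (an arbitrary binding for this example)
        if (Input.GetKeyDown(KeyCode.V))
            VRSettings.enabled = !VRSettings.enabled;
    }
}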

Troubleshooting: If you cannot view the scene through your DK2, the following tips may be useful:

  • Ensure that the DK2 is plugged in and turned on before opening the Unity project

  • Check that the Demo Scene can be viewed in the Oculus Configuration Utility

  • Update your Graphics Card drivers

  • Make sure that you have Oculus Runtime 0.8 or later installed

For further assistance with VR, please see the Unity VR forum.

Useful Information for VR development

While VR development is very similar to standard Unity development, there are some differences to be aware of.

Frame rate in the Editor

If you are viewing your project in the editor, be aware that you may experience some lag and judder, as your computer is rendering the same content twice. Rendering the Unity editor itself also adds some overhead, so it’s best to occasionally check performance on the target device as well, by creating a build of your project and running it directly.

Camera movement

You cannot move the VR camera directly in Unity. If you wish to change its position and rotation, you’ll need to ensure that it is parented to another GameObject, and apply the changes to the parent’s Transform.

Camera Container

For examples of this, look in the Flyer and Maze scenes of the VR Samples project, where we move the camera within the scene.
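As a minimal sketch of this pattern, the script below could be attached to the camera’s parent GameObject: it moves the parent forward, and the HMD’s tracked position and rotation are applied to the child camera on top of that motion. The speed and direction are arbitrary choices for this example:

Code snippet

using UnityEngine;

public class MoveCameraRig : MonoBehaviour
{
    [SerializeField] private float m_Speed = 2f;    // Movement speed in units per second

    void Update()
    {
        // Move the parent rig; the child VR camera inherits this motion,
        // while head tracking is still applied on top by Unity
        transform.Translate(Vector3.forward * m_Speed * Time.deltaTime);
    }
}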

Camera Nodes

Unity does not create separate GameObjects for the left and right eye cameras. If you wish to get the positions of those nodes, you must use the InputTracking class.

Should you wish to get the individual positions of the eyes within the scene - perhaps for testing purposes - use the sample script below, and attach it to your camera.

Code snippet

using UnityEngine;
using UnityEngine.VR;

public class UpdateEyeAnchors : MonoBehaviour
{
    GameObject[] eyes = new GameObject[2];
    string[] eyeAnchorNames = { "LeftEyeAnchor", "RightEyeAnchor" };  
  
    void Update()
    {
        for (int i = 0; i < 2; ++i)
        {
            // If the eye anchor is no longer a child of us, don't use it
            if (eyes[i] != null && eyes[i].transform.parent != transform)
            {
                eyes[i] = null;
            }

            // If we don't have an eye anchor, try to find one or create one
            if (eyes[i] == null)
            {
                Transform t = transform.Find(eyeAnchorNames[i]);
                if (t)
                    eyes[i] = t.gameObject;

                if (eyes[i] == null)
                {
                    eyes[i] = new GameObject(eyeAnchorNames[i]);
                    eyes[i].transform.parent = gameObject.transform;
                }
            }

            // Update the eye transform
            eyes[i].transform.localPosition = InputTracking.GetLocalPosition((VRNode)i);
            eyes[i].transform.localRotation = InputTracking.GetLocalRotation((VRNode)i);
        }
    }
}

Image Effects in VR

Many image effects will be too expensive to use in your VR projects. As you are rendering the scene twice - once for each eye - effects that already carry a performance cost in non-VR projects may have an unacceptable impact on frame rate.

As VR represents a user’s eyes in a virtual space, some image effects won’t make sense in VR. For instance, Depth of Field, blurs, and lens flares generally don’t make sense, as they simulate artifacts of a physical camera lens rather than what we see with our own eyes. If VR HMDs eventually support eye-tracking, Depth of Field could make sense.

Some effects can still be used: Anti-Aliasing can be useful (especially given the low resolution of some HMDs), Color Grading can definitely be used (see this useful Unity blog post for additional information), and Bloom could work in some games, so it’s a good idea to experiment and see which effects are usable in your game.
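As an example of the first of these, MSAA can be set from script through the quality settings. The snippet below is a minimal sketch, and assumes your project uses the forward rendering path (MSAA has no effect with deferred rendering):

Code snippet

using UnityEngine;

public class EnableMSAA : MonoBehaviour
{
    void Start()
    {
        // Request 4x MSAA; valid values are 0 (off), 2, 4 and 8
        // This only applies when using the forward rendering path
        QualitySettings.antiAliasing = 4;
    }
}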

While Unity supplies some suitable image effects to get you started (Assets > Import Package > Effects), the Asset Store also has many great image effects, such as Colorful, Chromatica, Amplify Color, and many others.

Render Scale

Depending on the complexity of your VR scene and the hardware you’re running, you may want to alter the render scale. This controls the texel-to-pixel ratio before lens correction, meaning that we trade performance for sharpness.

This must be set from code, via VRSettings.renderScale (see the scripting reference at http://docs.unity3d.com/ScriptReference/VR.VRSettings-renderScale.html). It can easily be altered using the following script:

Code snippet

using UnityEngine;
using System.Collections;
using UnityEngine.VR;

namespace VRStandardAssets.Examples
{ 
    public class ExampleRenderScale : MonoBehaviour
    {
        [SerializeField] private float m_RenderScale = 1f;              //The render scale. Higher numbers = better quality, but trades performance

        void Start ()
        {
            VRSettings.renderScale = m_RenderScale;
        }
    }
}

You can see a basic example of this in our VR Samples project, in the Scenes/Examples/RenderScale scene; it’s also used in the MainMenu scene.

Example of changing the render scale: The default render scale in Unity is 1.0:

Render Scale at 1x

If we increase this to 1.5, you can see the asset looks much crisper:

Render Scale at 1.5x

Finally, if we reduce the render scale to 0.5, you can see it’s now much more pixelated:

Render Scale at 0.5x

Depending on how well your scene is running, you might wish to lower the render scale to improve performance, or raise it to increase the visual sharpness.
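If you’d like to experiment with this at runtime, the sketch below nudges VRSettings.renderScale up and down so you can compare sharpness and performance directly in the headset. The key bindings, step size, and clamping range are arbitrary choices for this example:

Code snippet

using UnityEngine;
using UnityEngine.VR;

public class AdjustRenderScale : MonoBehaviour
{
    void Update()
    {
        // Raise or lower the render scale in 0.1 steps with the + and - keys
        if (Input.GetKeyDown(KeyCode.Equals))
            VRSettings.renderScale = Mathf.Min(VRSettings.renderScale + 0.1f, 2f);
        if (Input.GetKeyDown(KeyCode.Minus))
            VRSettings.renderScale = Mathf.Max(VRSettings.renderScale - 0.1f, 0.1f);
    }
}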

You should now be familiar with the basics of VR integration in Unity, how camera movement must be applied, and how the use of Image Effects differs compared to non-VR development. In our next article, Interaction in VR, we’ll learn how to interact with objects in VR.