User Interfaces for VR
Checked with version: 5.3
When designing user interfaces for VR there are a number of things to consider that may not have come to light in traditional screen design scenarios. Here we will look at some of the challenges and opportunities afforded to you as a VR developer, and discuss some of the hardware practicalities of designing usable interfaces for this new medium.
UI Resolution and Appearance
The resolution of the DK2 is 1920 x 1080 (960 x 1080 per eye), and the Gear VR is 2560 x 1440 (1280 x 1440 per eye). This can lead to noticeable pixelation on anything that occupies only a few pixels in width or height.
Of particular note are UI elements; bear in mind how large these will appear on-screen. One approach is to use larger or bold fonts, and to design UI without thin lines that can become pixelated when viewed in VR.
Types of UI
In non-VR projects, UI is frequently overlaid on top of the screen to show things like health, score, and so on as what we often refer to as a HUD (Heads Up Display). This is known as non-diegetic UI - it doesn’t exist within the world, but makes sense for the player in the context of viewing the game.
This term is also used in film: non-diegetic sound might be the musical score of a film or TV show, whereas diegetic sound is what makes sense to hear based on what you are seeing - character dialogue or sound effects, for example.
In Unity, adding HUD style non-diegetic UI is usually accomplished by using Screen Space - Overlay or Screen Space - Camera render modes on a UI Canvas.
This approach usually doesn’t work in VR: our eyes are unable to focus on something so close, and Screen Space - Overlay is not supported in Unity VR.
Instead, we generally need to position our UI within the environment itself using World Space Canvas render mode. This will allow our eyes to focus on the UI. This is known as Spatial UI.
Placement of the UI within the world also needs some consideration. Too close to the user can cause eye strain; too far away can feel like focusing on the horizon - this might work in an outdoor environment, but not in a small room. You’ll also need to scale the size of the UI accordingly, perhaps dynamically, depending on your needs.
If possible, it’s best to position your UI at a comfortable reading distance, and scale it accordingly. See the UI in Main Menu for an example of this: It’s positioned a few meters away, and the text and images are large and easy to view.
If you’re positioning the UI at a certain distance from the user, you may find the UI clipping into other objects. Take a look at the Reticle in the Interaction in VR article for how to modify a shader to draw on top of other objects, or simply use the shader included in the VR Samples. This shader can be used with text if you also need to stop it from clipping.
Many developers will initially attach the UI to the camera, so that the UI stays in a fixed position as the player moves around. While this can be useful for a reticle or something similarly small, for larger UI elements it has much the same effect as holding a newspaper in front of your face while looking around, and can lead to user discomfort or nausea. Take a look at the UI in Shooter 360 (Target Arena), where the UI moves into view after a short delay (see gif below), allowing the user to look around and become familiar with the environment without a UI element fixed to their field of view, obscuring their vision.
While VR provides us with the opportunity to explore immersive 360-degree environments, sometimes you might need to indicate that the user should look in a specific direction. In some of our scenes we use arrows within the world to help direct the user’s attention. These fade in and out depending on the direction that the user is facing.
These can be found in the GUIArrows prefab, and are easy to reuse. They work by comparing the direction the head is facing with the desired direction. If the head is outside of a predefined angle (see Show Angle in the GUIArrows component below), then the arrows will fade in. When the user looks back towards the desired direction, they will begin to fade out.
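The angle comparison described above can be sketched as follows. This is an illustrative reconstruction, not the shipped GUIArrows source - the field names and the use of a CanvasGroup for fading are assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch of the angle check the GUIArrows prefab performs.
public class DirectionArrows : MonoBehaviour
{
    [SerializeField] private Transform m_DesiredDirection;   // where the user should look
    [SerializeField] private float m_ShowAngle = 60f;        // fade in beyond this angle
    [SerializeField] private CanvasGroup m_ArrowsCanvasGroup;
    [SerializeField] private float m_FadeSpeed = 2f;

    private void Update()
    {
        // Angle between the head's facing and the desired direction.
        float angle = Vector3.Angle(Camera.main.transform.forward,
                                    m_DesiredDirection.forward);

        // Fade the arrows in when looking away, and out when looking back.
        float target = angle > m_ShowAngle ? 1f : 0f;
        m_ArrowsCanvasGroup.alpha = Mathf.MoveTowards(
            m_ArrowsCanvasGroup.alpha, target, m_FadeSpeed * Time.deltaTime);
    }
}
```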
An alternative to Spatial UI is to have elements in the environment itself display information to the user. This could be a working clock on the wall, a TV, computer display, mobile phone, or a holographic display on a futuristic gun. This is known as Diegetic UI.
Take a look at the UI on the ship in the Flyer scene, or on the gun in the Shooter (Target) scenes:
While this might not be strictly considered diegetic UI, having the interface attached to the object gives a reasonable approximation of how diegetic user interfaces might work in Unity.
Further reading on UI
A thorough analysis of these types of UI - without reference to VR - can be found in this article on Gamasutra.
Using VREyeRaycaster, VRInput, and VRInteractiveItem as described in the “Interaction in VR” article, we can add basic interaction to UI elements with a class that subscribes to the VRInteractiveItem events.
See the “Interaction in VR” article for more information on this - specifically the switch in the Maze scene for an example. We also use UI interaction at the start of each game to confirm that the user has read the instructions.
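As a minimal sketch, a UI element can react to gaze by subscribing to the OnOver and OnOut events that VRInteractiveItem exposes in the VR Samples. The class and handler names here are illustrative:

```csharp
using UnityEngine;
using VRStandardAssets.Utils; // namespace used by the VR Samples scripts

// Minimal sketch: react when the user's gaze enters or leaves a UI element.
public class UIButtonGaze : MonoBehaviour
{
    [SerializeField] private VRInteractiveItem m_InteractiveItem;

    private void OnEnable()
    {
        m_InteractiveItem.OnOver += HandleOver;
        m_InteractiveItem.OnOut += HandleOut;
    }

    private void OnDisable()
    {
        m_InteractiveItem.OnOver -= HandleOver;
        m_InteractiveItem.OnOut -= HandleOut;
    }

    private void HandleOver() { /* e.g. highlight the button */ }
    private void HandleOut()  { /* e.g. remove the highlight */ }
}
```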
For more information on using Unity UI in VR, take a look at the “Unity’s UI System in VR” post on the Oculus blog, where they also provide code samples.
UI in the VR Sample Scenes
Let’s take a look at how some of the techniques described above are demonstrated in the VR Sample scenes.
Menu
The UI in the Menu scene uses custom meshes to achieve a curved, enclosed environment. Interaction with these meshes uses the same methods as detailed in the “Interaction in VR” article.
Flyer
The introduction and end-game UI is positioned statically in world-space:
However, to represent information pertinent to the game we chose world-space UI attached to the position of the ship, to act as diegetic UI.
As the user will always have the ship in view, it makes sense to have relevant information around the focal area.
The UI also rotates to face the camera; this avoids oblique angles, and ensures the UI is always legible to the user.
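The “rotate to face the camera” behaviour is a simple billboard. A minimal sketch, assuming the script sits on the world-space canvas and the hypothetical name FaceCamera:

```csharp
using UnityEngine;

// Billboard sketch: rotate the world-space UI each frame so it always
// faces the camera, keeping the text legible from any position.
public class FaceCamera : MonoBehaviour
{
    private void Update()
    {
        // Point the canvas' forward vector away from the camera so the
        // front of the UI is what the user sees.
        transform.rotation = Quaternion.LookRotation(
            transform.position - Camera.main.transform.position);
    }
}
```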
Maze
In the Maze scene, we also use Spatial UI for the intro and outro sections:
Spatial UI is also used to prompt the player to interact when the switch is activated:
Shooter 180 (Target Gallery)
Static Spatial UI is again used for the intro and outro sections:
As mentioned above, we use diegetic UI on the gun to represent the time left and score:
Shooter 360 (Target Arena)
Finally, spatial UI is used in this scene, but with a slight twist: as the action takes place in a 360-degree arena, we chose to move the UI into view after a short delay as the player looks around - rotating it horizontally to meet the player. This is intended to help the user realise they’re in an environment which requires them to look around freely.
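One way to sketch this “catch up after a delay” behaviour is to put the canvas on a pivot at the player’s position and rotate the pivot’s yaw towards the head direction once the user has looked away for long enough. This is an illustrative reconstruction, not the shipped Shooter 360 script; the field names and timings are assumptions:

```csharp
using UnityEngine;

// Sketch: the UI (a child of this pivot, offset forwards) holds still while
// the user looks around, then rotates horizontally back into view.
public class DelayedFollowUI : MonoBehaviour
{
    [SerializeField] private Transform m_Head;          // the VR camera
    [SerializeField] private float m_Delay = 1.5f;      // seconds before following
    [SerializeField] private float m_RotateSpeed = 90f; // degrees per second

    private float m_Timer;

    private void Update()
    {
        // Yaw-only rotation that would place the UI in front of the player.
        Vector3 flatForward = Vector3.ProjectOnPlane(m_Head.forward, Vector3.up);
        Quaternion target = Quaternion.LookRotation(flatForward);

        // Count how long the UI has been out of view; reset when aligned.
        m_Timer = Quaternion.Angle(transform.rotation, target) > 1f
            ? m_Timer + Time.deltaTime : 0f;

        // After the delay, rotate the pivot horizontally to meet the player.
        if (m_Timer > m_Delay)
            transform.rotation = Quaternion.RotateTowards(
                transform.rotation, target, m_RotateSpeed * Time.deltaTime);
    }
}
```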
Once again, the diegetic UI is reused on the gun:
Free antialiasing on text for VR
A quick tip for achieving free (from a rendering cost standpoint) anti-aliasing on text in Unity: use a Canvas Scaler on a World Space Canvas. Set “Reference Pixels Per Unit” to 1, then alter “Dynamic Pixels Per Unit” until the edges of the text are slightly softened. Below you can see the difference between a “Dynamic Pixels Per Unit” setting of 3 - where the edges look sharp - and a setting of 1.75, which gives a much softer edge.
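The same values can simply be set in the Inspector, but for completeness, here is a small sketch that applies them from code (the component name is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: configure a world-space canvas for soft-edged text at runtime.
[RequireComponent(typeof(CanvasScaler))]
public class TextSoftening : MonoBehaviour
{
    private void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.referencePixelsPerUnit = 1f;
        scaler.dynamicPixelsPerUnit = 1.75f; // tweak until the edges soften
    }
}
```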
You should now have an understanding of the different types of user interface, and which ones work well in VR, along with how to overcome certain challenges you might face. Using VREyeRaycaster, VRInput, and VRInteractiveItem, you can also create basic UI interaction.
Further information can be found in the “Unity’s UI System in VR” post on the Oculus blog.
The next article will help you to understand Movement in VR.