
Rendering and Shading

Tutorial
Beginner
1 Hour 15 Mins
Summary
Get an overview of the lighting features new to Unity 5 with this lesson covering realtime global illumination, light types, the lighting panel, as well as emissives and probes.
Unity version: 5.x
Last updated: November 04, 2022
Language: English

1. Lighting Overview


Get an overview of the new lighting features in Unity 5 with this lesson covering realtime global illumination, light types, the lighting panel, as well as emissives and probes.

2. Lights


How to use the Light component in Unity to light your scenes.

3. Materials

How to control the visual appearance of GameObjects by assigning materials that set the shader, colours and textures used by a renderer.


4. The Standard Shader


Discover physically-based shading with Unity 5's Standard Shader in this overview lesson.

5. Textures


What are Textures? How does Unity use and import them?

6. Using Skyboxes

A skybox is a panoramic texture drawn behind all objects in the scene to represent the sky or any other vista at a great distance. This lesson explains how to use skyboxes in Unity.

Understanding skyboxes

A skybox is a panoramic view split into six textures representing six directions visible along the main axes (up, down, left, right, forward and backward). If the skybox is correctly generated, the texture images will fit together seamlessly at the edges to give a continuous surrounding image that can be viewed from "inside" in any direction. The panorama is rendered behind all other objects in the scene and rotates to match the current orientation of the camera (it doesn't vary with the position of the camera, which is always taken to be at the centre of the panorama). A skybox is thus an easy way to add realism to a scene with minimal load on the graphics hardware.

Using a skybox in Unity

Unity comes with a number of high-quality skyboxes in the Standard Assets package (menu: Assets > Import Package > Skyboxes), but you can also obtain suitable sets of panoramic images from internet sources or generate your own using 3D modelling software.
Assuming you already have the six skybox texture images, you should import them into Unity with the Wrap Mode set to Clamp rather than Repeat (if you don't do this, the edges of the images will not meet up seamlessly).

The skybox itself is actually a type of material using one of the shaders from the RenderFX submenu. If you choose the Skybox shader, you will see an inspector like the following, with six samplers for the textures:
[Image: the Skybox material inspector, showing the six texture samplers]

The Skybox Cubed shader works in much the same way but requires the textures to be added to a cubemap asset (menu: Assets > Create > Cubemap). The cubemap has six texture slots with the same meanings as those of the Skybox material inspector.
Once the skybox material is created, you can set it as the project default using the Render Settings inspector (menu: Edit > Render Settings). You can override the default skybox for each camera by assigning a new one in the camera's Skybox component (visible in the camera's inspector).
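For the curious, here is a minimal sketch of what a cubemap skybox shader can look like under the hood. It uses the vertex and fragment techniques introduced in the next chapter, and the shader name and _Tex property are illustrative; in practice you would simply use the built-in Skybox shaders described above.

Shader "Custom/SketchSkyboxCubed"
{
    Properties
    {
        _Tex ("Cubemap", Cube) = "grey" {}
    }
    SubShader
    {
        // Rendered first, behind everything, without writing depth
        Tags { "Queue" = "Background" "RenderType" = "Background" }
        Cull Off
        ZWrite Off

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            samplerCUBE _Tex;

            struct vertInput
            {
                float4 pos : POSITION;
            };

            struct vertOutput
            {
                float4 pos : SV_POSITION;
                float3 texcoord : TEXCOORD0;
            };

            vertOutput vert (vertInput input)
            {
                vertOutput o;
                o.pos = mul(UNITY_MATRIX_MVP, input.pos);
                // The object-space vertex position doubles as the direction
                // in which to sample the surrounding cubemap
                o.texcoord = input.pos.xyz;
                return o;
            }

            half4 frag (vertOutput output) : COLOR
            {
                return texCUBE(_Tex, output.texcoord);
            }
            ENDCG
        }
    }
}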

7. A Gentle Introduction to Shaders

We can safely say that Unity has made game development easier for a lot of people. One area where it still has a long way to go, without a doubt, is shader coding. Often surrounded by mystery, a shader is a program specifically made to run on a GPU. It is, ultimately, what draws the triangles of your 3D models. Learning how to code shaders is essential if you want to give a special look to your game. Unity also uses them for postprocessing, making them essential for 2D games as well. This tutorial gently introduces shader coding and is oriented to developers with little to no knowledge about shaders. Tutorials that extend the knowledge gained from this one can be found at Alan Zucconi's site.

Introduction

The diagram below loosely represents the three different entities which play a role in the rendering workflow of Unity:
[Diagram: the rendering workflow, linking 3D models, materials and shaders]

3D models are, essentially, a collection of 3D coordinates called vertices. These are connected together to make triangles. Each vertex can carry extra information, such as a colour, the direction it points towards (called the normal) and some coordinates for mapping textures onto it (called UV data).
Models cannot be rendered without a material. Materials are wrappers which contain a shader and the values for its properties. Hence, different materials can share the same shader, feeding it with different data.

Anatomy of a shader

Unity supports two different types of shaders: surface shaders and vertex and fragment shaders. There is a third type, the fixed function shaders, but they're now obsolete and will not be covered here. Regardless of which type fits your needs, the anatomy of a shader is the same for all of them:
Shader "MyShader" { Properties { // The properties of your shaders // - textures // - colours // - parameters // ... } SubShader { // The code of your shaders // - surface shader // OR // - vertex and fragment shader // OR // - fixed function shader } }
You can have multiple SubShader sections, one after the other. They contain the actual instructions for the GPU. Unity will try to execute them in order, until it finds one that is compatible with your graphics card. This is useful when coding for different platforms, since you can fit different versions of the same shader in a single file.
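As a sketch of that layout, the skeleton below declares two SubShader sections. The Fallback directive at the end (a standard ShaderLab feature not mentioned above) names a built-in shader for Unity to use if no SubShader is compatible with the hardware:

Shader "MyShader"
{
    SubShader
    {
        // Full-featured version, tried first
        // ...
    }
    SubShader
    {
        // Simpler version for older graphics cards
        // ...
    }
    // Used if neither SubShader is supported
    Fallback "Diffuse"
}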

The properties

The properties of your shader are somewhat equivalent to the public fields in a C# script; they'll appear in the inspector of your material, giving you the chance to tweak them. Unlike what happens with a script, materials are assets: changes made to the properties of a material while the game is running in the editor are permanent. Even after stopping the game, you'll find the changes you made persisting in your material.
The following snippet covers the definition of all the basic types of properties you can have in a shader:
Properties
{
    _MyTexture ("My texture", 2D) = "white" {}
    _MyNormalMap ("My normal map", 2D) = "bump" {}   // Grey
    _MyInt ("My integer", Int) = 2
    _MyFloat ("My float", Float) = 1.5
    _MyRange ("My range", Range(0.0, 1.0)) = 0.5
    _MyColor ("My colour", Color) = (1, 0, 0, 1)     // (R, G, B, A)
    _MyVector ("My Vector4", Vector) = (0, 0, 0, 0)  // (x, y, z, w)
}
The type 2D, used for _MyTexture and _MyNormalMap, indicates that the parameters are textures. They can be initialised to white, black or gray. You can also use bump to indicate that the texture will be used as a normal map; in this case it is automatically initialised to the colour #808080, which represents no bump at all. Vectors and Colors always have four elements (XYZW and RGBA, respectively).
The image below shows how these properties appear in the inspector, once the shader is attached to a material.
[Image: the shader properties as they appear in the material inspector]

Unfortunately, this is not enough to use our properties. The Properties section, in fact, is only used by Unity to expose these values in the inspector. The corresponding variables still need to be declared in the actual body of the shader, which is contained in the SubShader section.
SubShader
{
    // Code of the shader
    // ...
    sampler2D _MyTexture;
    sampler2D _MyNormalMap;
    int _MyInt;
    float _MyFloat;
    float _MyRange;
    half4 _MyColor;
    float4 _MyVector;
    // Code of the shader
    // ...
}
The type used for textures is sampler2D. Vectors are float4 and colours are generally half4, which use 32 and 16 bits per component, respectively. The language used to write shaders, Cg / HLSL, is very pedantic: the names of the variables must match exactly the ones previously defined. The types, however, don't need to: you won't get any error for declaring _MyRange as half instead of float. Somewhat confusingly, you can define a property of type Vector but link it to a float2 variable; the extra two values will simply be ignored by Unity.

The rendering order

As already mentioned, the SubShader section contains the actual code of the shader, written in Cg / HLSL, which closely resembles C. Loosely speaking, the body of a shader is executed for every pixel of your image, so performance here is critical. Due to the architecture of GPUs, there is a limit on the number of instructions a shader can perform. It is possible to get around this by dividing the computation into several passes, but that won't be covered in this tutorial.
The body of a shader typically looks like this:
SubShader
{
    Tags
    {
        "Queue" = "Geometry"
        "RenderType" = "Opaque"
    }

    CGPROGRAM
    // Cg / HLSL code of the shader
    // ...
    ENDCG
}
The actual Cg code is contained in the section signalled by the CGPROGRAM and ENDCG directives.
Before the actual body, the concept of tags is introduced. Tags are a way of telling Unity certain properties of the shader we are writing, such as the order in which it should be rendered (Queue) and how it should be rendered (RenderType).
When rendering triangles, the GPU usually sorts them according to their distance from the camera, so that the further ones are drawn first. This is typically enough to render solid geometries, but it often fails with transparent objects. This is why Unity allows you to specify the Queue tag, which gives control over the rendering order of each material. Queue accepts positive integers (the smaller the value, the sooner the object is drawn); the following mnemonic labels can also be used:
  • Background (1000): used for backgrounds and skyboxes;
  • Geometry (2000): the default label, used for most solid objects;
  • Transparent (3000): used for materials with transparent properties, such as glass, fire, particles and water;
  • Overlay (4000): used for effects such as lens flares, GUI elements and text.
Unity also allows you to specify relative orders, such as Background+2, which indicates a queue value of 1002. Messing with Queue can generate nasty situations in which an object is always drawn, even when it should be covered by other models.
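As an illustration, a glass-like material might declare its SubShader like this (a sketch; the Blend and ZWrite lines are the usual companion settings for alpha blending and relate to the ZTest discussed in the next section):

SubShader
{
    // Drawn after all opaque geometry, in the transparent queue
    Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }

    // Standard alpha blending; don't write to the depth buffer
    Blend SrcAlpha OneMinusSrcAlpha
    ZWrite Off

    CGPROGRAM
    // Cg / HLSL code of the shader
    // ...
    ENDCG
}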

ZTest

It is important to remember, however, that an object from the Transparent queue doesn't necessarily always appear above an object from Geometry. By default, the GPU performs a test called ZTest which stops hidden pixels from being drawn. To work, it uses an extra buffer with the same size as the screen it is rendering to. Each pixel in this buffer contains the depth (distance from the camera) of the object drawn at that position. If we are about to write a pixel which is further away than the current depth, the pixel is discarded. The ZTest culls the pixels which are hidden by other objects, regardless of the order in which they are drawn onto the screen.
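ShaderLab exposes this test directly as a state command. Here is a sketch of the relevant directives inside a Pass; LEqual is the default, so you rarely need to write it explicitly:

Pass
{
    // Draw the pixel only if it is at least as close to the camera
    // as the depth already stored in the buffer (the default)
    ZTest LEqual

    // Alternatively, "ZTest Always" would ignore the depth buffer,
    // so this pass would draw on top of everything rendered before it
    // ZTest Always

    CGPROGRAM
    // ...
    ENDCG
}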

Surface versus vertex and fragment

The last part which needs to be covered is the actual code of the shader. Before writing it, we'll have to decide which type of shader to use. This section will give a glimpse of what shaders look like, but it won't really explain them. Both surface shaders and vertex and fragment shaders will be covered extensively in the next parts of this tutorial.

The surface shader

Whenever the material you want to simulate needs to be affected by lights in a realistic way, chances are you'll need a surface shader. Surface shaders hide the calculations of how light is reflected and allow you to specify "intuitive" properties, such as the albedo, the normals and the reflectivity, in a function called surf. These values are then plugged into a lighting model which outputs the final RGB values for each pixel. Alternatively, you can write your own lighting model, but this is only needed for very advanced effects.
The Cg code of a typical surface shader looks like this:
CGPROGRAM
// Uses the Lambertian lighting model
#pragma surface surf Lambert

sampler2D _MainTex; // The input texture

struct Input
{
    float2 uv_MainTex;
};

void surf (Input IN, inout SurfaceOutput o)
{
    o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
}
ENDCG

In this example, a texture is taken as input by the line sampler2D _MainTex; and is then set as the Albedo property of the material in the surf function. The shader uses a Lambertian lighting model, which is a very common way of modelling how light reflects off an object. Shaders which use only the albedo property are typically called diffuse.
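To tie this together with the earlier section on properties, the sketch below extends the example with a hypothetical _Tint colour property, showing the full chain from the Properties block to the surf function:

Shader "Custom/TintedDiffuse"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Tint ("Tint", Color) = (1, 1, 1, 1)
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }

        CGPROGRAM
        #pragma surface surf Lambert

        // These declarations must match the property names above
        sampler2D _MainTex;
        half4 _Tint;

        struct Input
        {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            // Sample the texture and multiply it by the tint colour
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * _Tint.rgb;
        }
        ENDCG
    }
}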

The vertex and fragment shader

Vertex and fragment shaders work closer to the way the GPU renders triangles and have no built-in concept of how light should behave. The geometry of your model is first passed through a function called vert, which can alter its vertices. The resulting triangles are then rasterised, and each pixel they cover is passed through another function called frag, which decides its final RGB colour. Vertex and fragment shaders are useful for 2D effects, postprocessing and special 3D effects which are too complex to be expressed as surface shaders.
The following vertex and fragment shader simply makes an object uniformly red, with no lighting:
Pass
{
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag

    struct vertInput
    {
        float4 pos : POSITION;
    };

    struct vertOutput
    {
        float4 pos : SV_POSITION;
    };

    vertOutput vert(vertInput input)
    {
        vertOutput o;
        o.pos = mul(UNITY_MATRIX_MVP, input.pos);
        return o;
    }

    half4 frag(vertOutput output) : COLOR
    {
        return half4(1.0, 0.0, 0.0, 1.0);
    }
    ENDCG
}

The vert function converts the vertices from their native 3D space to their final 2D position on the screen. Unity provides the UNITY_MATRIX_MVP matrix to hide the maths behind this. Afterwards, the return value of the frag function gives a red colour to every pixel. Just remember that the Cg section of a vertex and fragment shader needs to be enclosed in a Pass section. This is not the case for simple surface shaders, which work with or without it.
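As a small variation, the hard-coded red could come from a property instead. The following sketch assumes a _Color property of type Color has been added to the Properties section:

// Declared in the CGPROGRAM section, matching the property name
half4 _Color;

half4 frag(vertOutput output) : COLOR
{
    // Every pixel now takes its colour from the material inspector
    return _Color;
}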

Conclusion

This tutorial gently introduces the two types of shaders available in Unity and explains when to use one over the other. Further tutorials following from this one can be found at Alan Zucconi's site.

8. Using detail textures for extra realism close-up

A detail texture is a pattern that is faded in gradually on a mesh as the camera gets close. This can be used to simulate dirt, weathering or other similar detail on a surface without adding to rendering overhead when the camera is too far away to see the difference. This lesson explains how to use detail textures with Unity.

Obtaining the texture

A detail texture is a greyscale image that is used to lighten or darken another texture selectively. Where a pixel has a brightness value between 0 and 127, the image will be darkened (zero denotes maximum darkening), and where the value is between 129 and 255, the image will be lightened (255 denotes maximum lightening). A value of exactly 128 leaves the underlying image unchanged.
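The arithmetic behind this is a simple multiply-by-two combine. The sketch below shows the kind of fragment operation involved; the variable names, the tiling factor and the _Detail property are illustrative, since Unity's built-in shader performs this step for you:

// Sample the base texture and the (more finely tiled) detail texture
fixed4 base   = tex2D(_MainTex, uv);
fixed4 detail = tex2D(_Detail, uv * 10.0);

// A detail brightness of 128 (0.5) leaves the base unchanged,
// since 0.5 * 2 = 1; lower values darken it, higher values lighten it
fixed3 combined = base.rgb * detail.rgb * 2.0;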

If the detail image has an average brightness greater or lower than 128, the whole image will appear to be lightened or darkened as the camera gets close, which gives the wrong effect. It is therefore important to make sure that the brightness levels in the image are roughly symmetrical around 128. You can check that this is the case in most image editing software by looking at the image histogram or the Levels adjustment (which typically also shows the histogram). If the histogram shows a symmetrical "bulge" slightly to the left or right of the centre, you can bracket the bulge with the min/max input level arrows to get the brightness centred on 128.

To avoid visible boundaries where the detail texture wraps, you should ideally use an image that tiles perfectly. If the image is created using the image editor's noise function, the results will typically tile without artifacts. Also, filters such as Difference Clouds often have a setting to make the resulting image wrap. Simple effects like this can make quite effective detail maps when simulating dirt, grainy surfaces or weathering.

Unity import settings for the detail texture

Once you have saved your image to the Unity project, you can select it to see its import settings in the Inspector.

Set the Texture Type to Advanced and then enable the Fadeout Mip Maps setting under Generate Mip Maps. You should see a range control for the Fade Range; the numeric values for the range aren't specified, but the defaults are suitable for most purposes. For distances below the start of the range, the detail texture is fully visible; across the range, it gradually fades until it eventually becomes invisible.

The detail material

To use the detail texture, you should set the material to use the Diffuse Detail shader using the menu on the Material Inspector.

In addition to the base texture, you will see a second sampler box to receive the detail texture you have just imported. Typically, you will want to set its Tiling values quite high (maybe about 10).
The detail material can now be applied to any suitable object to show the detailing effect.

9. Frame Debugger


In this video we'll look at how to use Unity's Frame Debugger to analyze and troubleshoot graphical performance.
