The framebuffer contains the color, depth, and stencil buffers. The color buffer is always present, while the depth and stencil buffers are optional and only exist if the graphics features you use require them.
If a device supports double or triple buffering, the graphics driver requires two or three framebuffers respectively.
With double buffering (and when VSync is enabled), your application must wait until the next vertical retrace before it starts rendering the next frame. Vertical retraces occur at the vertical refresh rate, typically in the 60–100 Hz range. If supported by the graphics driver, turning off VSync eliminates this delay and provides the highest frame rate. However, it can cause visual artifacts called tearing.
With triple buffering, your application renders a frame into one back buffer (a regular framebuffer). While it waits to flip, it starts rendering into the other back buffer. As a result, the frame rate is typically higher than with double buffering (with VSync enabled), without any tearing.
Using more than one framebuffer increases graphics memory usage, especially on high-resolution displays when your application runs at native resolution.
The number of framebuffers in use depends mostly on the graphics driver, and there is one color buffer per framebuffer. For example, when you use OpenGL ES on Android, Unity uses one EGLWindowSurface with a color buffer, but Unity has no control over how many color buffers and framebuffers the driver allocates. Typically, the driver uses three framebuffers for triple buffering; if a device does not support triple buffering, it falls back to double buffering and uses two framebuffers, each with its own color buffer.
The stencil buffer and depth buffer are only bound to the framebuffer if graphics features use them. You should disable them if you know that your application does not require them, because a framebuffer occupies a great deal of graphics memory depending on resolution, and is resource-intensive to create.
To disable the depth buffer and stencil buffer, go to the Player Settings (menu: Edit > Project Settings > Player) window, scroll down to the Resolution and Presentation section, and enable the Disable Depth and Stencil checkbox.
On mobile GPUs, the depth buffer and the stencil buffer are two separate buffers: 24 bits for the depth buffer and 8 bits for the stencil buffer. This differs from desktop platforms, where they are combined into a single 32-bit buffer that uses 24 bits for depth and 8 bits for stencil.
Modern mobile phone displays have very high resolutions, with the native resolution often well above 1080p. Even modern consoles struggle to render at 1080p without a drop in performance.
Tip: Control the resolution of your application, and consider exposing it as a setting so users can lower the resolution to save battery life.
Use Screen.SetResolution to lower the default resolution and regain performance with minimal loss of visual quality.
Note: Setting the resolution to half of the native resolution does not always preserve visual fidelity, so verify the result on target devices.
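As a rough illustration of the memory impact of lowering the resolution, the following sketch (plain Python for the arithmetic, not a Unity API; the helper name is hypothetical) compares one color buffer at native and at half resolution, assuming a 32-bit (4-byte) RGBA color buffer:

```python
def color_buffer_mb(width, height, bytes_per_pixel=4):
    """Memory for one color buffer in megabytes (MiB)."""
    return width * height * bytes_per_pixel / 1024 / 1024

native = color_buffer_mb(1440, 2960)          # about 16.26 MB at native resolution
half = color_buffer_mb(1440 // 2, 2960 // 2)  # about 4.06 MB at half resolution

# Halving both dimensions cuts each buffer to a quarter of its size.
print(round(native, 2), round(half, 2))
```

Because halving the resolution reduces both width and height, every buffer (color, depth, and stencil) shrinks to a quarter of its original size.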
Calculate the framebuffer size and compare it with the results from a native profiler. For example, a full HD screen has a resolution of 1920 x 1080, which is 2,073,600 pixels.
Multiplying this by the number of bits per pixel (32 bits for an RGBA color buffer) gives 66,355,200 bits of memory.
Divide by 8 to get bytes, then divide by 1024 twice to convert to kilobytes and megabytes.
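The steps above can be sketched as a quick calculation (plain Python, for illustration only):

```python
pixels = 1920 * 1080          # 2,073,600 pixels on a full HD screen
bits = pixels * 32            # 32 bits per pixel -> 66,355,200 bits
bytes_total = bits / 8        # 8,294,400 bytes
kilobytes = bytes_total / 1024  # 8,100 KB
megabytes = kilobytes / 1024    # ~7.91 MB
print(megabytes)              # -> 7.91015625
```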
The following table shows the memory required by resolution and bits per pixel.

| Resolution | Pixels | Bits/Pixel | Memory [Bits] | Memory [MB] |
| --- | --- | --- | --- | --- |
| 1920 x 1080 | 2,073,600 | 32 | 66,355,200 | 7.91 |
An application running on a Samsung Galaxy S8 with a resolution of 1440 x 2960 uses approximately 97.6 MB of graphics memory for its framebuffers when it uses triple buffering with a 32-bit color buffer, a 24-bit depth buffer, and an 8-bit stencil buffer. These numbers help you compare memory statistics while profiling with a native profiler: IOKit allocations in Instruments on iOS, and EGL mtrack allocations in dumpsys meminfo on Android.
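Under the assumptions in the paragraph above (three framebuffers, each with a 32-bit color buffer, a 24-bit depth buffer, and an 8-bit stencil buffer), the total can be estimated as follows. This is a sketch for comparison only; the allocations a native profiler reports may differ slightly because of driver padding and alignment:

```python
width, height = 1440, 2960                  # Samsung Galaxy S8 native resolution
pixels = width * height                     # 4,262,400 pixels

color_bits = 32                             # RGBA8 color buffer
depth_bits = 24                             # depth buffer
stencil_bits = 8                            # stencil buffer
framebuffers = 3                            # triple buffering

total_bits = pixels * framebuffers * (color_bits + depth_bits + stencil_bits)
total_mb = total_bits / 8 / 1024 / 1024
print(round(total_mb, 2))                   # roughly 97.56 MB
```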
On Android and OpenGL ES, Unity creates a framebuffer object with color and depth buffer attachments, which Unity uses for all rendering. At the end of the frame, Unity blits this framebuffer into the EGLSurface. From Unity 2017.2 you can change the Blit Type: go to the Player Settings (menu: Edit > Project Settings > Player) window, scroll down to the Resolution and Presentation section, and select the Blit Type in the drop-down menu.
When using Vulkan on Android, Unity does not perform this final blit: Vulkan interacts with the existing BufferQueue component through the ANativeWindow interface and uses the Gralloc HAL for the buffer data. For more details, see the official Android documentation.