AR in Unity: the SDKs, key toolsets + design tips
Unity supports Google’s ARCore and Apple’s ARKit, both of which are poised to bring AR apps to the masses. As a Unity user, you have a unique advantage in shaping and finding success in AR, starting right now! You have all the tools you need for AR content in the editor, and an expanding collection of in-depth learning resources to help you develop as efficiently as possible.
ARCore and ARKit overview
Google’s ARCore can be used on many more Android devices than its predecessor, Tango. ARCore is currently in preview and supports development for the Google Pixel, Pixel XL, and Samsung Galaxy S8 running Android 7.0 Nougat and above. It requires Unity 2017.2 or later and Android SDK API level 24 or later.
ARCore Wizard of Oz app, by Google
ARKit for iOS 11: The Unity ARKit plugin provides developers with friendly access to ARKit’s features: motion tracking, live video rendering, plane finding and hit-testing, ambient light estimation, raw point cloud data, and more. There are also convenient Unity components to simplify the creation of new AR apps, or easy integration of AR features in existing Unity projects.
GNOG, by KO_OP
Common functions and limitations between ARKit and ARCore
Common functions:
- Motion tracking: gets the pose information from the device and applies it to the Unity camera
- Feature point mapping: the feature the device uses to perceive and “map” the real-world environment
- Horizontal surface detection: picks up tabletops, floors and other flat surfaces
- Video overlay: renders the live camera feed behind your virtual content
- Light estimation: senses how much ambient light is in the scene

Common limitations:
- Low light: if there’s not enough light in the real-world environment, it’s difficult for the device to detect feature points, which in turn leads to a breakdown of tracking and plane estimation
- Excessive motion: the user can’t move the device around too fast or the images will blur
- Untextured flat surfaces: feature points will not be picked up clearly on them
- Slow plane detection: there’s a lot of math involved in plane detection, so it can slow down performance
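To make the motion tracking and light estimation functions above concrete, here is a minimal MonoBehaviour sketch. The IArFrameSource interface is hypothetical, standing in for whichever SDK wrapper you use; the actual ARKit and ARCore plugin APIs differ, so treat this as an illustration of the data flow, not a real integration.

```csharp
using UnityEngine;

// Hypothetical abstraction over per-frame ARKit/ARCore data, for
// illustration only -- the real plugin APIs look different.
public interface IArFrameSource
{
    Vector3 DevicePosition { get; }
    Quaternion DeviceRotation { get; }
    float AmbientIntensity { get; }   // light estimation value
}

// Mirrors the tracked device pose on the Unity camera each frame,
// and feeds the ambient light estimate into a scene light.
public class ArCameraDriver : MonoBehaviour
{
    public Camera arCamera;
    public Light sceneLight;
    public IArFrameSource frameSource; // supplied by your SDK wrapper

    void Update()
    {
        if (frameSource == null) return;

        // Motion tracking: apply the device pose to the camera.
        arCamera.transform.SetPositionAndRotation(
            frameSource.DevicePosition, frameSource.DeviceRotation);

        // Light estimation: match virtual lighting to the real room.
        if (sceneLight != null)
            sceneLight.intensity = frameSource.AmbientIntensity;
    }
}
```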
To learn more about the game objects and components specific to each SDK, tune into this great session (between 21:25 and 30:10) by Unity XR engineers Jimmy Alamparambil and Tim Mowrer. Jimmy also wrote up a helpful summary of their Unite session in this blog post.
Coming soon: a cross-platform AR framework!
Our XR engineering team is working hard on a cross-platform framework for the different SDKs, giving you the ability to author your AR content once and deploy smoothly to different devices. Currently called ARInterface, it includes common functions such as StartService/StopService, Pose, Planes callbacks, Light estimation, Point cloud and Camera handling. It’s still experimental; you can download it on GitHub here.
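A sketch of what that cross-platform flow looks like: start the service, subscribe to plane callbacks, and stop the service on teardown. The class, method and event names below follow the experimental GitHub repo at the time of writing and may well change, so treat this as illustrative rather than final API.

```csharp
using UnityEngine;
using UnityARInterface; // from the experimental ARInterface repo on GitHub

// Bootstraps an AR session through the cross-platform ARInterface,
// which picks the ARKit or ARCore implementation per platform.
public class ArSessionBootstrap : MonoBehaviour
{
    private ARInterface ar;

    void OnEnable()
    {
        ar = ARInterface.GetInterface();
        ar.StartService(new ARInterface.Settings
        {
            enablePlaneDetection = true,
            enableLightEstimation = true
        });

        // Planes callback: fires as horizontal surfaces are detected.
        ARInterface.planeAdded += OnPlaneAdded;
    }

    void OnDisable()
    {
        ARInterface.planeAdded -= OnPlaneAdded;
        ar.StopService();
    }

    void OnPlaneAdded(BoundedPlane plane)
    {
        Debug.Log("Detected surface: " + plane.id);
    }
}
```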
The problem of scale in AR, and how to solve it
Scaling down the actual assets in your AR content is risky. For example, if you have physics on a Game Object, the physics behavior will change and possibly become unstable if you scale things down. It won’t work well on complex levels. And, some systems in Unity can’t really be changed once they’ve been baked down, such as a nav mesh, terrain, lightmaps, or particle systems.
Instead, you can employ some handy camera tricks to give your assets the appearance that they are scaled. You can scale the position of one camera, or use multiple cameras to create the effect of scaling. To get a thorough understanding of this method, read this blog post by Unity engineer Rick Johnson. You can also see a demo of these scaling tips from the 42:05 mark in Jimmy’s and Tim’s talk.
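A minimal sketch of the single-camera version of this trick, with illustrative component and field names: rather than shrinking the content, the camera’s tracked motion is amplified relative to an anchor, so a scale of 10 makes the content appear ten times smaller while physics, nav meshes and baked data run at their true size.

```csharp
using UnityEngine;

// Gives full-size content the *appearance* of being scaled down by
// scaling the camera's position relative to an anchor point instead
// of scaling the content itself.
public class ScaledArCamera : MonoBehaviour
{
    public Transform trackedDevice;  // pose driven by ARKit/ARCore tracking
    public Transform contentAnchor;  // where the content should appear, e.g. a detected plane
    public float scale = 10f;        // 10 = content appears 10x smaller

    void LateUpdate()
    {
        // Amplify the camera's offset from the anchor: moving the camera
        // 10x farther/faster makes the content look 1/10th the size.
        Vector3 offset = trackedDevice.position - contentAnchor.position;
        transform.position = contentAnchor.position + offset * scale;
        transform.rotation = trackedDevice.rotation;
    }
}
```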
A couple of additional tips: feature points will disappear if the user holds the device still for too long. And if the device only sees one camera image from one orientation, it’s very difficult for it to perceive the world, so the app requires some motion and looking around before it really starts to pick up features.
You need to provide visual and/or audio cues so that the user moves the device around enough that it picks up feature points. Choose carefully where to display visual cues, for example, a focal point where instructions or guidelines appear.
Provide cues to inform users that the action and/or gameplay needs to occur within a few feet of where they are, and that the lighting in their surroundings should not be too dark or too bright, as both conditions will conceal information from the cameras. There needs to be a balance of light and shadow so that the cameras can pick up the edges of surfaces.
Keep the UI in your AR app simple, tidy and flexible. Don’t overwhelm the user with too much text, too many UI elements, characters, explosions, and so on. Try to keep the action and elements of your UI contained in one main location.
Finally, give users some options to accommodate uncontrollable factors, such as people moving unexpectedly into the space where the app is playing. For example, provide the player with some gizmos that allow them to move the content around, so they can choose which area and/or surface to play on.
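One way to implement that repositioning option is to raycast the player’s touch against colliders on the detected planes and move the content root to the hit point. In this sketch, the plane colliders are assumed to live on a dedicated layer, and all names are illustrative.

```csharp
using UnityEngine;

// Lets the player drag AR content to a new spot when real-world
// conditions change (e.g. someone walks into the play space).
public class ContentMover : MonoBehaviour
{
    public Camera arCamera;
    public Transform contentRoot;  // parent of all placed AR content
    public LayerMask planeLayer;   // assumed layer holding detected-plane colliders

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Moved) return;

        // Cast from the touch point into the scene, hitting only planes.
        Ray ray = arCamera.ScreenPointToRay(touch.position);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, 20f, planeLayer))
        {
            // Snap the content to the surface the player is pointing at.
            contentRoot.position = hit.point;
        }
    }
}
```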
Seven of the best toolsets you’ll want to use when developing AR content
Vuforia AR Toolkit: Vuforia’s AR toolkit, which is available in versions 2017.2 and up, can be used on top of ARCore, ARKit and Microsoft Hololens, providing additional features for AR content.
Image targets feature from Vuforia
Unity ARKit Remote: This is a feature that can be used with Unity’s ARKit Plugin. Unity ARKit Remote allows you to run a special app on the iOS device which feeds ARKit data back to the Unity Editor, so you can respond to that data in realtime.
Unity UI: You will need to get creative with the UI of your AR game or app, making Unity UI one of the fundamental toolsets you’ll be using. There’s a wealth of tips available for using Unity UI efficiently.
TextMesh Pro: TextMesh Pro is a great tool to use together with Unity UI, giving you more flexibility to make rich and dynamic UIs. It’s available for free from the Asset Store.
TextMesh Pro uses advanced text rendering techniques, along with a set of custom shaders; it delivers substantial visual quality improvements, while giving users incredible flexibility for creating text styling and texturing.
Timeline: Timeline is a full, multi-track sequencer natively integrated in Unity. It gives you control over every element for creating cutscenes and sequences, without the need for complex coding. You can sequence animation clips, key frames, audio and more. Here’s a handy overview of the latest learning resources for Timeline.
Analytics: Unity Analytics is not just for when your game goes live: use it when you are conducting testing, for example, to see how players experience and understand the UI of your app. Read more about using Analytics before, during and after you launch.
Don’t overheat it: optimization tips for AR
AR apps demand a lot of their hardware: multiple cameras working simultaneously, depth-sensing capabilities, and computationally intensive algorithms. At the same time, AR is largely a mobile, handheld medium, so you’ll need to do considerable performance optimization, including managing memory and processing, and controlling the physical area of the experience.
Here is an excellent resource for optimizing mobile applications; a number of its tips are also useful for mobile AR content.