Creating AR apps in Unity: the latest tips and resources

Last updated: November 2018
With AR Foundation and the software architecture it leverages, Unity developers now have a common API that supports core functionality for ARCore, ARKit, and future platforms. ARCore and ARKit each offer a mix of platform-unique features as well as common ones. With AR Foundation and the platform-specific SDKs, you have everything you need in Unity to author AR content, plus an expanding collection of in-depth learning resources.
ARCore is the plugin developed and maintained by Google. You can download the Unity SDK for it here. ARCore is directly integrated into the engine (there's a checkbox for it in the XR settings) and it's compatible with Unity 2017.1 or later.
ARCore Wizard of Oz app, by Google
The ARCore SDK provides a sample scene, HelloAR, that showcases features such as Augmented Images, Cloud Anchors, and Computer Vision. Other unique features included in ARCore are Feature Points and Instant Preview.
HelloAR is pre-configured to find both horizontal and vertical planes. It includes a prefab of Andy the Android that is spawned every time the user taps the scene, but you can easily replace this with your own 3D models. One point to note when building on top of the HelloAR scene is that ARCore handles its sessions as scriptable assets, so if you want to use certain ARCore features, make sure you're using the right session data and have configured your session accordingly.
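Replacing Andy comes down to swapping the prefab reference that the tap handler instantiates. A minimal sketch of that pattern, based on the ARCore SDK's `Frame.Raycast` API (the `MyModelPrefab` field is a placeholder for your own model):

```csharp
using GoogleARCore;
using UnityEngine;

public class TapToPlace : MonoBehaviour
{
    // Assign your own model here in place of the Andy prefab.
    public GameObject MyModelPrefab;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against the planes ARCore has detected so far.
        TrackableHit hit;
        if (Frame.Raycast(touch.position.x, touch.position.y,
                          TrackableHitFlags.PlaneWithinPolygon, out hit))
        {
            // Anchor the model so it stays put as tracking improves.
            Anchor anchor = hit.Trackable.CreateAnchor(hit.Pose);
            Instantiate(MyModelPrefab, hit.Pose.position, hit.Pose.rotation,
                        anchor.transform);
        }
    }
}
```

Parenting the spawned object to an anchor, as HelloAR does, keeps it locked to the real-world surface as ARCore refines its understanding of the scene.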
The Augmented Image database is an example of a scriptable asset that you assign specifically in your session. Augmented Images lets you multi-select any images within your project and, via a right-click sub-menu of the Create menu, package all of them into a single scriptable asset. It then scores each image on how easily the computer-vision system can recognize and track it (a score of 85 or higher is recommended).
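Once the database is assigned in your session config, you can poll the session at runtime for the images it has recognized. A hedged sketch using the ARCore SDK's `AugmentedImage` trackable:

```csharp
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

public class ImageTrackingExample : MonoBehaviour
{
    // Reused each frame to avoid per-frame allocations.
    private readonly List<AugmentedImage> _images = new List<AugmentedImage>();

    void Update()
    {
        // Ask only for images whose state changed this frame.
        Session.GetTrackables<AugmentedImage>(_images, TrackableQueryFilter.Updated);
        foreach (AugmentedImage image in _images)
        {
            if (image.TrackingState == TrackingState.Tracking)
            {
                Debug.Log("Tracking image: " + image.Name);
            }
        }
    }
}
```

From here you would typically spawn or reposition content at `image.CenterPose`, the same way the SDK's Augmented Images sample does.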
Unlike ARCore, the ARKit plugin is developed and maintained by Unity. It works with versions 2017.1 and later, and on Apple devices running iOS 11.3 and later. You can get it here. The sample scenes that come with the SDK cover all the key ARKit features, including Focus Square, Shadows and Occlusion, Image Anchor, Plane Mesh, Relocalize, and Face Tracking. Other features unique to ARKit are Environment Probes, World Maps, and Trackable Images. You can learn more about these in the ARKit docs.
AR Foundation is our solution that brings the core functionality of ARKit and ARCore into a common, abstracted API inside Unity. Note that it currently covers only some of the core functionality: if you want to use platform-specific features such as Environment Probes in ARKit or Instant Preview in ARCore, those are available only through the specific SDKs. New features, however, will be continuously added to AR Foundation. For now, it supports features such as vertical and horizontal plane detection, light estimation, feature points, AR scaling, and AR anchors.
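As a taste of the common API, here is a minimal sketch that listens for newly detected planes through AR Foundation's `ARPlaneManager`. Event names have shifted between preview versions, so treat this as illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager _planeManager;

    void OnEnable()
    {
        _planeManager = GetComponent<ARPlaneManager>();
        _planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        _planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // The same callback fires on both ARKit and ARCore devices.
        foreach (ARPlane plane in args.added)
            Debug.Log("New plane detected, alignment: " + plane.alignment);
    }
}
```

The same script runs unchanged on iOS and Android; AR Foundation routes the callback to whichever underlying platform SDK is active.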
AR Foundation is available in versions 2018.1 and later, and can be installed via the Package Manager.
Dan Miller, an XR evangelist at Unity, shows how to install AR Foundation step by step at the 17:30 mark in his Unite session.
GNOG, by KO_OP, created with ARKit
The problem of scale in AR, and how to solve it
Scaling down the actual assets in your AR content is risky. For example, if you have physics on a GameObject, the physics behavior will change, and possibly become unstable, when you scale things down. It won't work well on complex levels, either. And some systems in Unity can't really be changed once they've been baked, such as a NavMesh, terrain, lightmaps, or particle systems.
Instead, you can employ some handy camera tricks to give your assets the appearance that they are scaled. You can scale the position of one camera, or use multiple cameras to create the effect of scaling. To get a thorough understanding of this method, read this blog post by Unity engineer Rick Johnson. You can also see a demo of these scaling tips from the 42:05 mark in Jimmy's and Tim’s talk.
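In its simplest form, the trick is: to make full-scale content appear 1/N of its real size, move a second content camera N times further from the content's pivot, rather than scaling the content itself. A simplified sketch of that idea (all field names here are placeholders; the blog post above covers the full technique, including rendering content at a separate scene location):

```csharp
using UnityEngine;

// Makes full-scale content *appear* scaled down by moving a second
// camera instead of scaling the content. Physics, NavMeshes, lightmaps
// and particles all keep running at their authored, full scale.
public class ScaledContentCamera : MonoBehaviour
{
    public Camera arCamera;        // The AR-tracked device camera.
    public Camera contentCamera;   // Renders the full-scale content.
    public Transform contentPivot; // Point where the content should appear.
    public float scale = 10f;      // 10 = content looks 1/10th real size.

    void LateUpdate()
    {
        // Push the content camera 'scale' times further from the pivot
        // along the same offset as the real device camera.
        Vector3 offset = arCamera.transform.position - contentPivot.position;
        contentCamera.transform.position = contentPivot.position + offset * scale;
        contentCamera.transform.rotation = arCamera.transform.rotation;
    }
}
```

Because only the camera transform changes, every baked system keeps behaving exactly as it did at authoring time.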
A couple of additional tips: feature points will disappear if the user holds the device still for too long. If the device only sees one camera image from one orientation, it's very difficult for it to perceive the world, so the app requires some motion; the user needs to look around before the device really starts to pick up features.
You need to provide visual and/or audio cues so that the user moves the device around enough that it picks up feature points. Choose carefully where to display visual cues, for example, a focal point where instructions or guidelines appear.
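A common way to implement such a cue is to show a "move your device around" prompt until tracking produces its first plane. A hedged sketch using AR Foundation's plane events (`movePhonePrompt` is a placeholder for your own UI object):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ScanPrompt : MonoBehaviour
{
    public ARPlaneManager planeManager; // assign in the Inspector
    public GameObject movePhonePrompt;  // e.g. an animated "scan your space" graphic

    void OnEnable()  { planeManager.planesChanged += OnPlanesChanged; }
    void OnDisable() { planeManager.planesChanged -= OnPlanesChanged; }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // A surface has been found; stop nagging the user.
        if (args.added.Count > 0)
            movePhonePrompt.SetActive(false);
    }
}
```

An audio sting or haptic tick at the same moment reinforces that the scanning phase is over and play can begin.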
Provide cues to inform users that the action and/or gameplay needs to occur within a few feet of where they are, and that the lighting in their surroundings should be neither too dark nor too bright, as both conditions conceal information from the cameras. There needs to be a balance of light and shadow so that the cameras can pick up the edges of surfaces.
Keep the UI in your AR app simple, tidy, and flexible. Don't overwhelm the user with too much text or too many UI elements, characters, explosions, and so on. Try to keep the action and the elements of your UI contained in one main location.
Finally, give users some options to accommodate uncontrollable factors, such as people moving unexpectedly into the space where the app is playing. For example, provide the player with gizmos that let them move the content around, so they can choose which area and/or surface to play it on.
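One simple version of such a control lets the player drag placed content along any detected plane under their finger. A sketch of the idea against AR Foundation's raycast manager (the exact raycast entry point varies between AR Foundation versions, so treat the API details as assumptions):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class DragToReposition : MonoBehaviour
{
    public ARRaycastManager raycastManager; // assign in the Inspector
    public Transform content;               // root of the placed content

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Moved) return;

        // Slide the content along whatever plane is under the finger.
        if (raycastManager.Raycast(touch.position, s_Hits,
                                   TrackableType.PlaneWithinPolygon))
        {
            content.position = s_Hits[0].pose.position;
        }
    }
}
```

Even this small affordance gives players an escape hatch when a pet, a person, or bad lighting invades the original play space.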
Seven of the best toolsets you’ll want to use when developing AR content
Vuforia AR Toolkit: Vuforia's AR toolkit, available in Unity 2017.2 and later, can be used on top of ARCore, ARKit, and Microsoft HoloLens, providing additional features for AR content.
Image targets feature from Vuforia
Facial AR Remote: Facial AR Remote is a low-overhead way to capture a facial performance from a connected device directly into the Unity Editor. It's useful not just for animation authoring, but also for character and blend-shape modeling and rigging, giving you a streamlined way to build your own Animoji- or Memoji-style interactions in Unity. Developers can iterate on the model in the Editor without building to the device, removing time-consuming steps from the process. For more details, see the docs.
Unity ARKit Remote: This is a feature that can be used with Unity's ARKit Plugin. Unity ARKit Remote runs a special app on the iOS device that feeds ARKit data back to the Unity Editor, so you can respond to that data in real time.
Unity UI: You will need to get creative with the UI of your AR game or app, making Unity UI one of the fundamental toolsets you'll be using. There's a wealth of tips available for using Unity UI efficiently.
Text Mesh Pro: Text Mesh Pro is a great tool to use together with Unity UI to give yourself more flexibility to make rich and dynamic UIs. It’s available for free from the Asset Store.
TextMesh Pro uses advanced text rendering techniques, along with a set of custom shaders; it delivers substantial visual quality improvements, while giving users incredible flexibility for creating text styling and texturing.
Timeline: Timeline is a full, multi-track sequencer natively integrated in Unity. It gives you control over every element for creating cutscenes and sequences, without the need for complex coding. You can sequence animation clips, key frames, audio and more. Here’s a handy overview of the latest learning resources for Timeline.
Particle Systems: Unity provides an extensive list of modules for creating myriad particle systems. Start with this overview of the latest features for Particle Systems. Particle Systems docs are here.
Analytics: Unity Analytics is not just for when your game goes live: use it when you are conducting testing, for example, to see how players experience and understand the UI of your app. Read more about using Analytics before, during and after you launch.
Don’t overheat it: optimization tips for AR
AR apps demand a lot of their hardware: multiple cameras working simultaneously, depth sensing, and computationally intensive algorithms. At the same time, AR is largely a mobile/handheld medium, so you'll need to do considerable performance optimization, including managing memory, processing, and the physical area of the experience.
Here are three handy optimization tips to help you keep things cool in your AR apps.