An Introduction to Editor Scripting

Checked with version: 2018.1

Difficulty: Beginner

You can use editor scripting inside Unity to make life easier for your game designers, or even yourself. With a small amount of code, you can automate some of the more tedious aspects of using the inspector to configure your behaviours, and provide visual feedback on configuration changes. We will create a simple projectile and launcher system to demonstrate some of the basics that can be achieved by scripting the editor.

In this tutorial:

  • How to expose methods in the Inspector
  • How to use Handles to create custom Gizmos
  • How to use field attributes to customise the Inspector

Get Started with Simple Techniques

We start with a basic Projectile class, which lets the user assign a Rigidbody to the rigidbody field to provide the physics behaviour. We will then extend this class to make it easier to use.

Code snippet

public class Projectile : MonoBehaviour
{
    public Rigidbody rigidbody;
}

When you add the above component to a GameObject, you also need to add a Rigidbody component. We can make this happen automatically with a RequireComponent attribute, which adds the Rigidbody component (if it doesn’t already exist) when the Projectile component is first added to a GameObject.

Code snippet

[RequireComponent(typeof(Rigidbody))]
public class Projectile : MonoBehaviour
{
    public Rigidbody rigidbody;
}

Let’s make it even better by auto-assigning the Rigidbody component to the rigidbody field at the same time. We do this using the Reset method, which is called when you first add the component to a GameObject. You can also call the Reset method manually by right-clicking the component header in the inspector and choosing the ‘Reset’ menu item.

Code snippet

[RequireComponent(typeof(Rigidbody))]
public class Projectile : MonoBehaviour
{
    public Rigidbody rigidbody;
    void Reset()
    {
        rigidbody = GetComponent<Rigidbody>();
    }
}

Finally, we can minimise the valuable screen space taken up by the inspector GUI by hiding the rigidbody field with a HideInInspector attribute. We can also remove editor warnings by using the ‘new’ keyword on the field declaration.

Code snippet

[RequireComponent(typeof(Rigidbody))]
public class Projectile : MonoBehaviour
{
    [HideInInspector] new public Rigidbody rigidbody;
    void Reset()
    {
        rigidbody = GetComponent<Rigidbody>();
    }
}

These are very simple techniques you can use on all your components to keep things clean and tidy, and minimise configuration mistakes.
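
To illustrate how the same pattern carries over to other components, here is a short, hypothetical FootstepPlayer that auto-wires an AudioSource in exactly the same way; the class name, field name, and choice of component are made up for this example.

Code snippet

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class FootstepPlayer : MonoBehaviour
{
    // Auto-assigned in Reset and hidden from the Inspector,
    // so designers never have to wire it up by hand.
    [HideInInspector] public AudioSource source;

    void Reset()
    {
        source = GetComponent<AudioSource>();
    }
}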

Simple Inspector Customisation

The next class we look at is the Launcher class. It instantiates a new Projectile and sets its velocity so that it shoots forward at a specified speed. (It will actually launch any prefab with a Rigidbody component.)

Code snippet

public class Launcher : MonoBehaviour
{
    public Rigidbody projectile;
    public Vector3 offset = Vector3.forward;
    public float velocity = 10;

    public void Fire()
    {
        var body = Instantiate(
            projectile, 
            transform.TransformPoint(offset), 
            transform.rotation);
        body.velocity = transform.forward * velocity;
    }
}

The first thing we can add is a Range attribute to the ‘velocity’ field, which creates a slider in the inspector GUI. The designer can then quickly slide this value around to experiment with different velocities, or type in an exact figure. We also add a ContextMenu attribute to the ‘Fire’ method, which allows us to run the method by right-clicking the component header in the inspector. You can do this with any method (as long as it takes no arguments) to add editor functionality to your component.

Code snippet

public class Launcher : MonoBehaviour
{
    public Rigidbody projectile;
    public Vector3 offset = Vector3.forward;
    [Range(0, 100)] public float velocity = 10;
        
    [ContextMenu("Fire")]
    public void Fire()
    {
        var body = Instantiate(
            projectile, 
            transform.TransformPoint(offset), 
            transform.rotation);
        body.velocity = transform.forward * velocity;
    }
}

To take this example further, we need to write an Editor class to extend the editor functionality of the Launcher component. The class has a CustomEditor attribute which tells Unity which component this custom editor is used for. The OnSceneGUI method is called when the scene view is rendered, allowing us to draw widgets inside the scene view. As this is an Editor class, it must be inside a folder named ‘Editor’ somewhere in your project.

Code snippet

using UnityEditor;

[CustomEditor(typeof(Launcher))]
public class LauncherEditor : Editor
{
    void OnSceneGUI()
    {
        var launcher = target as Launcher;
    }
}

Let's add to the OnSceneGUI method so that we have a widget which displays and adjusts the offset position inside the scene view. Because the offset is stored in the local space of the Launcher's transform, we use Transform.TransformPoint to convert the offset into world space for Handles.PositionHandle, and Transform.InverseTransformPoint to convert the result back to local space before storing it in the offset field.

Code snippet

using UnityEditor;

[CustomEditor(typeof(Launcher))]
public class LauncherEditor : Editor
{
    void OnSceneGUI()
    {
        var launcher = target as Launcher;
        var transform = launcher.transform;
        launcher.offset = transform.InverseTransformPoint(
            Handles.PositionHandle(
                transform.TransformPoint(launcher.offset), 
                transform.rotation));
    }
}

We can also create a custom Projectile editor class. Let's add a damageRadius field to the Projectile class, which could be used in the game code to calculate which other GameObjects might be affected by the projectile.

Code snippet

[RequireComponent(typeof(Rigidbody))]
public class Projectile : MonoBehaviour
{
    [HideInInspector] new public Rigidbody rigidbody;
    public float damageRadius = 1;
    
    void Reset()
    {
        rigidbody = GetComponent<Rigidbody>();
    }
}

We might be tempted to add a simple Range attribute to the damageRadius field; however, we can do better by visualising this field in the scene view. We create another Editor class for the Projectile component and use Handles.RadiusHandle to visualise the field and allow it to be adjusted in the scene view.

Code snippet

using UnityEditor;

[CustomEditor(typeof(Projectile))]
public class ProjectileEditor : Editor
{
    void OnSceneGUI()
    {
        var projectile = target as Projectile;
        var transform = projectile.transform;
        projectile.damageRadius = Handles.RadiusHandle(
            transform.rotation, 
            transform.position, 
            projectile.damageRadius);
    }
}

We should also add a Gizmo so we can see the Projectile in the scene view when it has no renderable geometry. Here we use a DrawGizmo attribute to specify a method which draws the gizmo for the Projectile class. This could also be done by implementing OnDrawGizmos and OnDrawGizmosSelected in the Projectile class itself; however, it is better practice to keep editor functionality separate from game functionality where possible, so we use the DrawGizmo attribute instead (a sketch of the OnDrawGizmosSelected alternative follows the snippet below).

Code snippet

using UnityEngine;
using UnityEditor;

[CustomEditor(typeof(Projectile))]
public class ProjectileEditor : Editor
{
    [DrawGizmo(GizmoType.Selected | GizmoType.NonSelected)]
    static void DrawGizmosSelected(Projectile projectile, GizmoType gizmoType)
    {
        Gizmos.DrawSphere(projectile.transform.position, 0.125f);
    }
    
    void OnSceneGUI()
    {
        var projectile = target as Projectile;
        var transform = projectile.transform;
        projectile.damageRadius = Handles.RadiusHandle(
            transform.rotation, 
            transform.position, 
            projectile.damageRadius);
    }
}
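
For comparison, here is a minimal sketch of the alternative mentioned above: drawing a similar gizmo from OnDrawGizmosSelected inside the Projectile class itself. It works, but it keeps editor drawing code inside the runtime component, which is why this tutorial prefers the DrawGizmo attribute.

Code snippet

using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class Projectile : MonoBehaviour
{
    [HideInInspector] new public Rigidbody rigidbody;
    public float damageRadius = 1;

    void Reset()
    {
        rigidbody = GetComponent<Rigidbody>();
    }

    // Called by the editor while this object is selected;
    // draws the same sphere without a separate Editor class.
    void OnDrawGizmosSelected()
    {
        Gizmos.DrawSphere(transform.position, 0.125f);
    }
}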

Widgets in the Scene View

We can also use Editor IMGUI methods inside OnSceneGUI to create any kind of scene view editor control. We are going to expose the Fire method of the Launcher component using a button inside the scene view. We calculate a screen-space rect next to the offset's world position, where we want to draw the GUI. Also, we don’t want to call Fire during edit mode, only when the game is playing, so we wrap the Fire method call in an EditorGUI.DisabledGroupScope, which only enables the button when we are in Play mode.

Code snippet

using UnityEngine;
using UnityEditor;

[CustomEditor(typeof(Launcher))]
public class LauncherEditor : Editor
{
    void OnSceneGUI()
    {
        var launcher = target as Launcher;
        var transform = launcher.transform;
        launcher.offset = transform.InverseTransformPoint(
            Handles.PositionHandle(
                transform.TransformPoint(launcher.offset), 
                transform.rotation));
        Handles.BeginGUI();
        var rectMin = Camera.current.WorldToScreenPoint(
            launcher.transform.position + 
            launcher.offset);
        var rect = new Rect();
        rect.xMin = rectMin.x;
        rect.yMin = SceneView.currentDrawingSceneView.position.height - 
            rectMin.y;
        rect.width = 64;
        rect.height = 18;
        GUILayout.BeginArea(rect);
        using (new EditorGUI.DisabledGroupScope(!Application.isPlaying))
        {
            if (GUILayout.Button("Fire"))
                launcher.Fire();
        }
        GUILayout.EndArea();
        Handles.EndGUI();
    }
}

Physics in game design can be hard to debug, so let’s add a helper for the designer which displays an estimate of where the projectile will be after 1 second of flight time. We need the mass of the projectile to calculate this position, so we check that the projectile field is not null before attempting the calculation. We also draw a dotted line from the launcher object to the offset position for clarity (using Handles.DrawDottedLine), letting the designer know that this position handle modifies the offset field, not the transform position. Let’s also add a label to the offset handle using Handles.Label.

This is done using a method with a DrawGizmo attribute, in the same way as in the ProjectileEditor. We also add an Undo.RecordObject call which, with the help of an EditorGUI.ChangeCheckScope, lets us record an undo operation when the offset is changed. (If you haven’t seen the using statement before, you can read up on it at MSDN; a quick sketch of the pattern follows below.)
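
If the using statement is new to you, the sketch below (independent of the Launcher example) shows the idea these scope types rely on: the scope object does its setup in the constructor, and the using statement guarantees Dispose runs when the block ends, restoring whatever state was changed.

Code snippet

using System;

// A minimal, hypothetical scope type illustrating the pattern behind
// EditorGUI.ChangeCheckScope and Handles.DrawingScope.
public class ExampleScope : IDisposable
{
    public ExampleScope()
    {
        // Begin: e.g. remember the current colour or state.
    }

    public void Dispose()
    {
        // End: restore the remembered colour or state.
    }
}

public static class ExampleUsage
{
    public static void Demo()
    {
        using (new ExampleScope())
        {
            // Work done while the scope is active.
        } // Dispose is called automatically here, even if an exception is thrown.
    }
}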

Code snippet

using UnityEngine;
using UnityEditor;

[CustomEditor(typeof(Launcher))]
public class LauncherEditor : Editor
{
    [DrawGizmo(GizmoType.Pickable | GizmoType.Selected)]
    static void DrawGizmosSelected(Launcher launcher, GizmoType gizmoType)
    {
        var offsetPosition = launcher.transform.position + launcher.offset;
        Handles.DrawDottedLine(launcher.transform.position, offsetPosition, 3);
        Handles.Label(offsetPosition, "Offset");
        if (launcher.projectile != null)
        {
            var endPosition = offsetPosition + 
                (launcher.transform.forward * 
                launcher.velocity / 
                launcher.projectile.mass);
            using (new Handles.DrawingScope(Color.yellow))
            {
                Handles.DrawDottedLine(offsetPosition, endPosition, 3);
                Gizmos.DrawWireSphere(endPosition, 0.125f);
                Handles.Label(endPosition, "Estimated Position");
            }
        }
    }

    void OnSceneGUI()
    {
        var launcher = target as Launcher;
        var transform = launcher.transform;

        using (var cc = new EditorGUI.ChangeCheckScope())
        {
            var newOffset = transform.InverseTransformPoint(
                Handles.PositionHandle(
                    transform.TransformPoint(launcher.offset),
                    transform.rotation));

            if (cc.changed)
            {
                Undo.RecordObject(launcher, "Offset Change");
                launcher.offset = newOffset;
            }
        }

        Handles.BeginGUI();
        var rectMin = Camera.current.WorldToScreenPoint(
            launcher.transform.position + 
            launcher.offset);
        var rect = new Rect();
        rect.xMin = rectMin.x;
        rect.yMin = SceneView.currentDrawingSceneView.position.height - 
            rectMin.y;
        rect.width = 64;
        rect.height = 18;
        GUILayout.BeginArea(rect);
        using (new EditorGUI.DisabledGroupScope(!Application.isPlaying))
        {
            if (GUILayout.Button("Fire"))
                launcher.Fire();
        }
        GUILayout.EndArea();
        Handles.EndGUI();
    }
}

If you try this out in your editor, you will notice that the position estimate is not very accurate! Let's change the calculation to take gravity into account, and draw a curved path through the one-second flight trajectory with Handles.DrawAAPolyLine, marking the end point with Gizmos.DrawWireSphere. If we use Handles.DrawingScope to change the colour of the widgets, we don’t need to worry about setting it back to the previous handle colour when the method finishes. (The updated method below replaces DrawGizmosSelected in LauncherEditor, and needs using System.Collections.Generic; at the top of the file for the List<Vector3>.)

Code snippet

[DrawGizmo(GizmoType.Pickable | GizmoType.Selected)]
static void DrawGizmosSelected(Launcher launcher, GizmoType gizmoType)
{
    var offsetPosition = launcher.transform.TransformPoint(launcher.offset);
    Handles.DrawDottedLine(launcher.transform.position, offsetPosition, 3);
    Handles.Label(offsetPosition, "Offset");
    if (launcher.projectile != null)
    {
        var positions = new List<Vector3>();
        var velocity = launcher.transform.forward * 
            launcher.velocity / 
            launcher.projectile.mass;
        var position = offsetPosition;
        var physicsStep = 0.1f;
        for (var i = 0f; i <= 1f; i += physicsStep)
        {
            positions.Add(position);
            position += velocity * physicsStep;
            velocity += Physics.gravity * physicsStep;
        }
        using (new Handles.DrawingScope(Color.yellow))
        {
            Handles.DrawAAPolyLine(positions.ToArray());
            Gizmos.DrawWireSphere(positions[positions.Count - 1], 0.125f);
            Handles.Label(positions[positions.Count - 1], "Estimated Position (1 sec)");
        }
    }
}

In Conclusion

These are some very simple ways you can improve the editor experience for other game designers and yourself. Using Editor.OnSceneGUI, you have the ability to create any kind of editor tool, right inside the scene view. It is definitely worthwhile becoming familiar with the Handles class and all the functionality it can provide you, helping you smooth out the game design and development process for yourself and your team.
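
As a starting point for further experimentation, here is a small, hypothetical example (the Sensor component and its range field are made up) showing another Handles widget: Handles.Slider drags a float value along the object's forward axis in the scene view, using the same ChangeCheckScope and Undo pattern as the Launcher. In a real project the two classes would live in separate files, with SensorEditor inside an Editor folder.

Code snippet

using UnityEngine;
using UnityEditor;

// Runtime component: lives outside the Editor folder.
public class Sensor : MonoBehaviour
{
    public float range = 5f;
}

// Editor class: lives inside an Editor folder.
[CustomEditor(typeof(Sensor))]
public class SensorEditor : Editor
{
    void OnSceneGUI()
    {
        var sensor = (Sensor)target;
        var transform = sensor.transform;
        var end = transform.position + transform.forward * sensor.range;

        using (var cc = new EditorGUI.ChangeCheckScope())
        {
            // Handles.Slider returns the dragged position constrained to the given direction.
            var newEnd = Handles.Slider(end, transform.forward);
            if (cc.changed)
            {
                Undo.RecordObject(sensor, "Range Change");
                sensor.range = Vector3.Dot(newEnd - transform.position, transform.forward);
            }
        }
    }
}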
