unity-development
Use this skill when working with Unity game engine - C# scripting, Entity Component System (ECS/DOTS), physics simulation, shader programming (ShaderLab, HLSL, Shader Graph), and UI Toolkit. Triggers on gameplay programming, MonoBehaviour lifecycle, component architecture, rigidbody physics, raycasting, collision handling, custom shader authoring, material configuration, USS styling, UXML layout, and performance optimization for real-time applications. Acts as a senior Unity engineer advisor for game developers building production-quality games and interactive apps.
unity-development is a production-ready AI agent skill for claude-code, gemini-cli, openai-codex, and mcp. It helps you work with the Unity game engine - C# scripting, Entity Component System (ECS/DOTS), physics simulation, shader programming (ShaderLab, HLSL, Shader Graph), and UI Toolkit.
Quick Facts
| Field | Value |
|---|---|
| Category | engineering |
| Version | 0.1.0 |
| Platforms | claude-code, gemini-cli, openai-codex, mcp |
| License | MIT |
How to Install
- Make sure you have Node.js installed on your machine.
- Run the following command in your terminal:
npx skills add AbsolutelySkilled/AbsolutelySkilled --skill unity-development
- The unity-development skill is now available in your AI coding agent (Claude Code, Gemini CLI, OpenAI Codex, etc.).
Overview
A senior Unity engineer's decision-making framework for building production-quality games and interactive applications. This skill covers five pillars - C# scripting, ECS/DOTS, physics, shaders, and UI Toolkit - with emphasis on when to use each pattern and the trade-offs involved. Designed for developers who know basic Unity concepts and need opinionated guidance on architecture, performance, and best practices for shipping real projects.
Tags
unity gamedev csharp ecs shaders physics
Platforms
- claude-code
- gemini-cli
- openai-codex
- mcp
Frequently Asked Questions
What is unity-development?
Use this skill when working with Unity game engine - C# scripting, Entity Component System (ECS/DOTS), physics simulation, shader programming (ShaderLab, HLSL, Shader Graph), and UI Toolkit. Triggers on gameplay programming, MonoBehaviour lifecycle, component architecture, rigidbody physics, raycasting, collision handling, custom shader authoring, material configuration, USS styling, UXML layout, and performance optimization for real-time applications. Acts as a senior Unity engineer advisor for game developers building production-quality games and interactive apps.
How do I install unity-development?
Run npx skills add AbsolutelySkilled/AbsolutelySkilled --skill unity-development in your terminal. The skill will be immediately available in your AI coding agent.
What AI agents support unity-development?
This skill works with claude-code, gemini-cli, openai-codex, mcp. Install it once and use it across any supported AI coding agent.
Maintainers
Generated from AbsolutelySkilled
SKILL.md
Unity Development
A senior Unity engineer's decision-making framework for building production-quality games and interactive applications. This skill covers five pillars - C# scripting, ECS/DOTS, physics, shaders, and UI Toolkit - with emphasis on when to use each pattern and the trade-offs involved. Designed for developers who know basic Unity concepts and need opinionated guidance on architecture, performance, and best practices for shipping real projects.
When to use this skill
Trigger this skill when the user:
- Writes or refactors C# scripts for Unity (MonoBehaviour, ScriptableObject, coroutines)
- Architects gameplay systems using component patterns or ECS/DOTS
- Configures rigidbody physics, collision detection, raycasting, or joints
- Authors custom shaders in ShaderLab/HLSL or builds Shader Graph nodes
- Builds UI with UI Toolkit (UXML, USS, C# bindings)
- Optimizes frame rate, memory, draw calls, or GC allocations
- Needs Unity-specific patterns for input handling, scene management, or asset pipelines
- Debugs Unity Editor errors, serialization issues, or build problems
Do NOT trigger this skill for:
- Unreal Engine, Godot, or other non-Unity game engines
- General C# questions unrelated to Unity (use a C#/.NET skill instead)
Key principles
Composition over inheritance - Unity's component model rewards small, focused components attached to GameObjects. Deep MonoBehaviour inheritance hierarchies become brittle. Prefer ScriptableObjects for shared data and interfaces for polymorphic behavior.
Data-oriented thinking - Even before adopting ECS, think about data layout. Avoid scattered heap allocations in hot paths. Cache component references in Awake(). Use struct-based data where possible. The garbage collector is your enemy in a 60fps loop.
Physics and rendering are separate worlds - Physics runs on FixedUpdate at a fixed timestep. Rendering runs on Update at variable framerate. Never mix them. Movement that involves Rigidbody goes in FixedUpdate. Camera follow and input polling go in Update or LateUpdate.
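A minimal sketch of that split (the class and field names are illustrative; the MonoBehaviour callbacks are Unity's real ones):

```csharp
using UnityEngine;

// Sketch: physics work at the fixed timestep, presentation at frame rate.
public class SplitResponsibilities : MonoBehaviour
{
    [SerializeField] private Rigidbody body;      // physics-driven object
    [SerializeField] private Transform cameraRig; // presentation-only transform

    private Vector3 _input;

    private void Update()
    {
        // Poll input every rendered frame
        _input = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
    }

    private void FixedUpdate()
    {
        // Apply movement only in the physics step
        body.MovePosition(body.position + _input * Time.fixedDeltaTime);
    }

    private void LateUpdate()
    {
        // Camera follow after all movement for this frame is done
        cameraRig.position = Vector3.Lerp(cameraRig.position, body.position, 0.1f);
    }
}
```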
Shaders express intent, not code - A shader describes what a surface looks like under light, not step-by-step instructions. Think in terms of properties (albedo, normal, metallic, emission) and how they respond to lighting. Start with Shader Graph for prototyping, drop to HLSL only when you need fine control.
UI Toolkit is the future, UGUI is the present - UI Toolkit (USS/UXML) follows web-like patterns and is Unity's strategic direction. Use it for editor tools and runtime UI in new projects. Fall back to UGUI only for legacy codebases or when UI Toolkit lacks a specific feature.
Core concepts
Unity's runtime is built on the GameObject-Component architecture. A GameObject is an empty container. Components (MonoBehaviour scripts, Colliders, Renderers) give it behavior and appearance. The Scene is the hierarchy of GameObjects. The Asset Pipeline manages how resources (textures, models, audio) are imported, processed, and bundled.
The MonoBehaviour lifecycle drives script execution: Awake -> OnEnable -> Start -> FixedUpdate (physics) -> Update (frame logic) -> LateUpdate (post-frame cleanup) -> OnDisable -> OnDestroy. Understanding this order prevents most timing bugs.
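To see the order concretely, attach a throwaway logger component; the callbacks are Unity's real message names, the class itself is illustrative:

```csharp
using UnityEngine;

// Logs each lifecycle callback in the order Unity invokes it.
public class LifecycleLogger : MonoBehaviour
{
    private void Awake()       => Debug.Log("1. Awake - cache references here");
    private void OnEnable()    => Debug.Log("2. OnEnable - subscribe to events");
    private void Start()       => Debug.Log("3. Start - once, after all Awakes");
    private void FixedUpdate() => Debug.Log("4. FixedUpdate - physics timestep");
    private void Update()      => Debug.Log("5. Update - once per rendered frame");
    private void LateUpdate()  => Debug.Log("6. LateUpdate - after all Updates");
    private void OnDisable()   => Debug.Log("7. OnDisable - unsubscribe");
    private void OnDestroy()   => Debug.Log("8. OnDestroy - final cleanup");
}
```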
ECS/DOTS is Unity's data-oriented alternative. Entities replace GameObjects, Components are pure data structs, and Systems contain logic that operates on component queries. ECS delivers massive performance gains for large entity counts (10k+) but requires a fundamentally different coding style.
The Render Pipeline determines how shaders execute. Unity offers URP (Universal Render Pipeline) for cross-platform and HDRP (High Definition) for high-end visuals. Shader code must target the active pipeline - a URP shader won't work in HDRP.
Common tasks
Write a MonoBehaviour with proper lifecycle
Cache references in Awake, subscribe to events in OnEnable, unsubscribe in OnDisable. Never use GetComponent in Update.
public class PlayerController : MonoBehaviour
{
[SerializeField] private float moveSpeed = 5f;
private Rigidbody _rb;
private PlayerInput _input;
private void Awake()
{
_rb = GetComponent<Rigidbody>();
_input = GetComponent<PlayerInput>();
}
private void OnEnable() => _input.onActionTriggered += HandleInput;
private void OnDisable() => _input.onActionTriggered -= HandleInput;
private void FixedUpdate()
{
Vector3 move = new Vector3(_moveDir.x, 0f, _moveDir.y) * moveSpeed;
_rb.MovePosition(_rb.position + move * Time.fixedDeltaTime);
}
private Vector2 _moveDir;
private void HandleInput(InputAction.CallbackContext ctx)
{
if (ctx.action.name == "Move")
_moveDir = ctx.ReadValue<Vector2>();
}
}
Use `[SerializeField] private` instead of `public` fields. It exposes the field in the Inspector without breaking encapsulation.
Create a ScriptableObject data container
ScriptableObjects live as assets - perfect for shared config, item databases, or event channels that decouple systems.
[CreateAssetMenu(fileName = "WeaponData", menuName = "Game/Weapon Data")]
public class WeaponData : ScriptableObject
{
public string weaponName;
public int damage;
public float fireRate;
public GameObject projectilePrefab;
}
Never store runtime-mutable state in ScriptableObjects. Changes made during Play mode persist in the Editor but not in built players, causing subtle bugs.
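A sketch of the safe pattern - clone the WeaponData asset above at runtime and mutate only the clone (the component name is illustrative):

```csharp
public class WeaponInstance : MonoBehaviour
{
    [SerializeField] private WeaponData template; // shared asset - treat as read-only
    private WeaponData _runtime;                  // per-session copy, safe to mutate

    private void Awake() => _runtime = Instantiate(template);

    // Buffs touch only the clone; the asset on disk is never modified.
    public void ApplyDamageBuff(int bonus) => _runtime.damage += bonus;
}
```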
Set up an ECS system with DOTS
Define a component as a struct, then write a system that queries and processes it.
// Component - pure data, no logic
public struct MoveSpeed : IComponentData
{
public float Value;
}
// System - processes all entities with MoveSpeed + LocalTransform
[BurstCompile]
public partial struct MoveForwardSystem : ISystem
{
[BurstCompile]
public void OnUpdate(ref SystemState state)
{
float dt = SystemAPI.Time.DeltaTime;
foreach (var (transform, speed) in
SystemAPI.Query<RefRW<LocalTransform>, RefRO<MoveSpeed>>())
{
transform.ValueRW.Position +=
transform.ValueRO.Forward() * speed.ValueRO.Value * dt;
}
}
}
ECS requires the Entities package. Use Burst + Jobs for maximum throughput. Avoid managed types (classes, strings) in components - they break Burst compilation.
Configure physics and collision detection
Choose between discrete (fast, can tunnel through thin objects) and continuous (safe, more expensive) collision detection based on object speed.
// Raycast from camera to detect clickable objects
if (Physics.Raycast(Camera.main.ScreenPointToRay(Input.mousePosition),
out RaycastHit hit, 100f, interactableLayer))
{
hit.collider.GetComponent<IInteractable>()?.Interact();
}
Collision matrix rule: Use layers + the Physics Layer Collision Matrix to disable unnecessary collision checks. A "Bullet" layer that only collides with "Enemy" and "Environment" saves significant CPU.
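The matrix is normally configured in Project Settings, but the same rule can be applied from code with `Physics.IgnoreLayerCollision` (the layer name here is illustrative and must exist in Tags and Layers):

```csharp
// Runtime mirror of unticking Bullet-vs-Bullet in the Layer Collision Matrix.
int bulletLayer = LayerMask.NameToLayer("Bullet");
Physics.IgnoreLayerCollision(bulletLayer, bulletLayer, true);
```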
Use `Physics.OverlapSphereNonAlloc` instead of `Physics.OverlapSphere` to avoid GC allocations in hot paths. Pre-allocate the results array.
Write a custom URP shader in ShaderLab/HLSL
Minimal unlit shader for URP that supports a base color and texture.
Shader "Custom/SimpleUnlit"
{
Properties
{
_BaseColor ("Color", Color) = (1,1,1,1)
_MainTex ("Texture", 2D) = "white" {}
}
SubShader
{
Tags { "RenderType"="Opaque" "RenderPipeline"="UniversalPipeline" }
Pass
{
HLSLPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
struct Attributes { float4 posOS : POSITION; float2 uv : TEXCOORD0; };
struct Varyings { float4 posCS : SV_POSITION; float2 uv : TEXCOORD0; };
TEXTURE2D(_MainTex); SAMPLER(sampler_MainTex);
CBUFFER_START(UnityPerMaterial)
float4 _BaseColor;
float4 _MainTex_ST;
CBUFFER_END
Varyings vert(Attributes IN)
{
Varyings OUT;
OUT.posCS = TransformObjectToHClip(IN.posOS.xyz);
OUT.uv = TRANSFORM_TEX(IN.uv, _MainTex);
return OUT;
}
half4 frag(Varyings IN) : SV_Target
{
half4 tex = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, IN.uv);
return tex * _BaseColor;
}
ENDHLSL
}
}
}
Always wrap per-material properties in CBUFFER_START(UnityPerMaterial) for SRP Batcher compatibility. Without this, you lose batching and pay per-draw-call cost.
Build runtime UI with UI Toolkit
Define layout in UXML, style with USS, bind data in C#.
<!-- HealthBar.uxml -->
<ui:UXML xmlns:ui="UnityEngine.UIElements">
<ui:VisualElement name="health-bar-container" class="bar-container">
<ui:VisualElement name="health-bar-fill" class="bar-fill" />
<ui:Label name="health-label" class="bar-label" text="100/100" />
</ui:VisualElement>
</ui:UXML>
/* HealthBar.uss */
.bar-container {
width: 200px;
height: 24px;
background-color: rgb(40, 40, 40);
border-radius: 4px;
overflow: hidden;
}
.bar-fill {
height: 100%;
width: 100%;
background-color: rgb(0, 200, 50);
transition: width 0.3s ease;
}
.bar-label {
position: absolute;
width: 100%;
-unity-text-align: middle-center;
color: white;
font-size: 12px;
}
public class HealthBarUI : MonoBehaviour
{
[SerializeField] private UIDocument uiDocument;
private VisualElement _fill;
private Label _label;
private void OnEnable()
{
var root = uiDocument.rootVisualElement;
_fill = root.Q<VisualElement>("health-bar-fill");
_label = root.Q<Label>("health-label");
}
public void SetHealth(int current, int max)
{
float pct = (float)current / max * 100f;
_fill.style.width = new Length(pct, LengthUnit.Percent);
_label.text = $"{current}/{max}";
}
}
UI Toolkit queries (`Q`, `Q<T>`) are string-based name lookups. Cache the results in `OnEnable` - never call `Q()` every frame.
Anti-patterns / common mistakes
| Mistake | Why it's wrong | What to do instead |
|---|---|---|
| GetComponent() in Update | Allocates and searches every frame, kills performance | Cache in Awake() or use [RequireComponent] |
| Moving Rigidbody with Transform.position | Bypasses physics engine, breaks collision detection | Use Rigidbody.MovePosition or AddForce in FixedUpdate |
| Using public fields for Inspector exposure | Breaks encapsulation, pollutes the API surface | Use [SerializeField] private fields |
| String-based Find/SendMessage | Fragile, zero compile-time safety, slow | Use direct references, events, or ScriptableObject channels |
| Allocating in hot loops (new List, LINQ) | GC spikes cause frame hitches | Pre-allocate collections, use NonAlloc physics APIs |
| One giant "GameManager" MonoBehaviour | God object that couples everything | Split into focused systems with clear responsibilities |
| Writing shaders without SRP Batcher support | Every material becomes a separate draw call | Use CBUFFER_START(UnityPerMaterial) for all per-material props |
| Mixing UI Toolkit and UGUI in the same screen | Two separate event systems fighting each other | Pick one per UI surface, don't mix |
Gotchas
- Modifying a ScriptableObject's values in Play mode persists in the Editor but not in builds - ScriptableObject assets are shared references. Changes made to their fields during Play mode in the Editor are saved to the asset file and persist after stopping. In a build, there is no asset file to save to, so changes are lost on scene reload. Use runtime clones (`Instantiate()`) for mutable per-game-session data.
- `OnEnable` runs before `Start` but after `Awake` on scene load - and again on every re-enable - Code in `OnEnable` that subscribes to events will subscribe again every time the GameObject is disabled and re-enabled. Always unsubscribe in `OnDisable`. Missing this causes duplicate event handlers that accumulate across scene loads.
- Rigidbody interpolation causes visual lag without it, jitter when misapplied - If you move a Rigidbody in `FixedUpdate` without interpolation, visual movement is choppy on high-framerate screens. Setting `Rigidbody.interpolation = RigidbodyInterpolation.Interpolate` smooths rendering but adds one physics frame of lag. Camera follow scripts must run in `LateUpdate` after physics resolves to avoid camera jitter.
- ECS Burst compilation fails silently on managed type references - If a DOTS component or system references a managed type (class, string, array), the Burst compiler silently falls back to non-Burst execution without error. Performance-sensitive systems will run at MonoBehaviour speeds. Use `[BurstDiscard]` intentionally and check the Burst Inspector for compilation errors.
- URP and HDRP shaders are not interchangeable - A shader written for URP (using the `UniversalPipeline` RenderPipeline tag and the `UniversalForward` pass) will appear as an unlit pink fallback in HDRP, and vice versa. Always specify the target render pipeline in the SubShader `Tags` block and confirm the project's Graphics settings.
References
For detailed patterns and implementation guidance on specific domains, read the
relevant file from the references/ folder:
- `references/csharp-patterns.md` - advanced C# patterns for Unity (object pooling, state machines, dependency injection, async/await)
- `references/ecs-dots.md` - deep dive on Entity Component System, Jobs, Burst compiler, and hybrid workflows
- `references/physics-advanced.md` - joints, raycasting strategies, trigger volumes, physics layers, continuous collision detection
- `references/shader-programming.md` - URP/HDRP shader authoring, Shader Graph custom nodes, lighting models, GPU instancing
- `references/ui-toolkit.md` - runtime UI patterns, data binding, custom controls, USS advanced selectors, ListView virtualization
Only load a references file if the current task requires it - they are long and will consume context.
References
csharp-patterns.md
C# Patterns for Unity
1. Object Pooling
Instantiate/Destroy cycles cause GC pressure. Pool frequently spawned objects (bullets, particles, enemies) and recycle them.
public class ObjectPool<T> where T : MonoBehaviour
{
private readonly Queue<T> _pool = new();
private readonly T _prefab;
private readonly Transform _parent;
public ObjectPool(T prefab, int initialSize, Transform parent = null)
{
_prefab = prefab;
_parent = parent;
for (int i = 0; i < initialSize; i++)
_pool.Enqueue(CreateInstance());
}
private T CreateInstance()
{
T obj = Object.Instantiate(_prefab, _parent);
obj.gameObject.SetActive(false);
return obj;
}
public T Get()
{
T obj = _pool.Count > 0 ? _pool.Dequeue() : CreateInstance();
obj.gameObject.SetActive(true);
return obj;
}
public void Return(T obj)
{
obj.gameObject.SetActive(false);
_pool.Enqueue(obj);
}
}
Unity 2021+ provides `UnityEngine.Pool.ObjectPool<T>` as a built-in alternative.
Prefer the built-in version for new projects.
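A sketch of the built-in version, assuming a hypothetical `Bullet` MonoBehaviour:

```csharp
using UnityEngine;
using UnityEngine.Pool;

public class BulletSpawner : MonoBehaviour
{
    [SerializeField] private Bullet prefab; // Bullet is an assumed MonoBehaviour

    private ObjectPool<Bullet> _pool;

    private void Awake()
    {
        _pool = new ObjectPool<Bullet>(
            createFunc: () => Instantiate(prefab),
            actionOnGet: b => b.gameObject.SetActive(true),
            actionOnRelease: b => b.gameObject.SetActive(false),
            actionOnDestroy: b => Destroy(b.gameObject),
            defaultCapacity: 32);
    }

    public Bullet Fire() => _pool.Get();
    public void Despawn(Bullet bullet) => _pool.Release(bullet);
}
```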
2. State Machine Pattern
Use for player controllers, AI behavior, and UI flow. Avoid deeply nested if/else chains in Update.
public interface IState
{
void Enter();
void Execute(); // called each frame
void Exit();
}
public class StateMachine
{
private IState _current;
public void ChangeState(IState newState)
{
_current?.Exit();
_current = newState;
_current.Enter();
}
public void Update() => _current?.Execute();
}
// Usage
public class IdleState : IState
{
private readonly PlayerController _player;
public IdleState(PlayerController player) => _player = player;
public void Enter() => _player.Animator.Play("Idle");
public void Execute()
{
if (_player.MoveInput.sqrMagnitude > 0.01f)
_player.StateMachine.ChangeState(_player.RunState);
}
public void Exit() { }
}
For complex AI with many transitions, consider Unity's built-in Animator as a state machine or a dedicated library like NodeCanvas or Behavior Designer.
3. Event-Driven Communication
Decouple systems using ScriptableObject-based event channels instead of direct references or singletons.
[CreateAssetMenu(menuName = "Events/Void Event Channel")]
public class VoidEventChannel : ScriptableObject
{
private readonly HashSet<Action> _listeners = new();
public void Register(Action listener) => _listeners.Add(listener);
public void Unregister(Action listener) => _listeners.Remove(listener);
public void Raise()
{
foreach (var listener in _listeners)
listener?.Invoke();
}
}
// Generic version for typed events
[CreateAssetMenu(menuName = "Events/Int Event Channel")]
public class IntEventChannel : ScriptableObject
{
private readonly HashSet<Action<int>> _listeners = new();
public void Register(Action<int> listener) => _listeners.Add(listener);
public void Unregister(Action<int> listener) => _listeners.Remove(listener);
public void Raise(int value)
{
foreach (var listener in _listeners)
listener?.Invoke(value);
}
}
Wire these in the Inspector - drag the same ScriptableObject asset into both the publisher and subscriber. No compile-time coupling between systems.
4. Async/Await in Unity
UniTask is the recommended library for async/await in Unity. It avoids `Task` allocations and integrates with Unity's player loop.
using Cysharp.Threading.Tasks;
public class AsyncExample : MonoBehaviour
{
private async UniTaskVoid Start()
{
// Wait for 2 seconds without coroutine allocation
await UniTask.Delay(TimeSpan.FromSeconds(2));
// Await a web request
var request = UnityWebRequest.Get("https://api.example.com/data");
await request.SendWebRequest();
if (request.result == UnityWebRequest.Result.Success)
Debug.Log(request.downloadHandler.text);
}
// Cancel on destroy to prevent accessing destroyed objects
private CancellationTokenSource _cts;
private void OnEnable() => _cts = new CancellationTokenSource();
private void OnDisable() => _cts?.Cancel();
private async UniTask FadeOut(CanvasGroup group)
{
while (group.alpha > 0)
{
group.alpha -= Time.deltaTime;
await UniTask.Yield(_cts.Token);
}
}
}
Always pass a CancellationToken tied to the MonoBehaviour's lifetime. Without it, async operations continue after the object is destroyed, causing NullReferenceExceptions.
5. Dependency Injection (Lightweight)
For small-to-mid projects, constructor injection via a simple service locator avoids the weight of full DI frameworks.
public static class ServiceLocator
{
private static readonly Dictionary<Type, object> _services = new();
public static void Register<T>(T service) => _services[typeof(T)] = service;
public static T Get<T>() => (T)_services[typeof(T)];
public static void Clear() => _services.Clear();
}
For large projects, use VContainer (lightweight, Unity-native) or Zenject (feature-rich). Both support scene-scoped lifetimes and constructor injection for MonoBehaviours.
6. Coroutine vs Update vs InvokeRepeating
| Pattern | Use when | Avoid when |
|---|---|---|
| Coroutine (IEnumerator) | One-shot sequences, timed delays, animations | Tight loops needing cancellation control |
| Update + timer float | Continuous per-frame logic, countdown timers | Simple delays (use coroutine instead) |
| InvokeRepeating | Fixed-interval polling, heartbeats | Need to pass parameters or cancel precisely |
| UniTask async | Web requests, file I/O, complex async flows | Very simple delays in non-critical code |
// Coroutine approach
private IEnumerator SpawnWave(int count, float interval)
{
for (int i = 0; i < count; i++)
{
SpawnEnemy();
yield return new WaitForSeconds(interval);
}
}
Cache WaitForSeconds objects if reusing the same delay value. Each new WaitForSeconds allocates on the heap.
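For example, a cached variant of the spawn loop above (it reuses the `SpawnEnemy` helper that snippet assumes; the delay value is illustrative):

```csharp
// Allocate once; reuse across every wave and iteration.
private static readonly WaitForSeconds SpawnDelay = new WaitForSeconds(0.5f);

private IEnumerator SpawnWaveCached(int count)
{
    for (int i = 0; i < count; i++)
    {
        SpawnEnemy();            // same helper assumed by the snippet above
        yield return SpawnDelay; // no per-iteration heap allocation
    }
}
```

The trade-off: a cached WaitForSeconds fixes the delay value, so this pattern fits constant intervals.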
7. Serialization Gotchas
Unity's serializer has specific rules that trip up experienced C# developers:
- `private` fields are NOT serialized unless marked `[SerializeField]`
- `public` fields ARE serialized (even if you don't want them in the Inspector)
- `static`, `const`, `readonly` fields are never serialized
- Properties are never serialized
- Dictionaries are not serialized - use two parallel lists or a custom serializable wrapper
- Interfaces and abstract types need the `[SerializeReference]` attribute (Unity 2019.3+)
- Polymorphic serialization requires `[SerializeReference]` on the field
[Serializable]
public class DialogueLine
{
[SerializeField] private string speaker;
[SerializeField] private string text;
[SerializeField, TextArea] private string longText;
}
public class DialogueSystem : MonoBehaviour
{
// This works - serialized as a list of the concrete type
[SerializeField] private List<DialogueLine> lines;
// This requires [SerializeReference] for polymorphism
[SerializeReference] private List<IDialogueNode> nodes;
}
ecs-dots.md
ECS / DOTS Reference
1. Architecture Overview
ECS (Entity Component System) is Unity's data-oriented tech stack (DOTS). It replaces the traditional GameObject/MonoBehaviour model with a cache-friendly, parallelizable architecture.
| Concept | Traditional | ECS |
|---|---|---|
| Identity | GameObject | Entity (lightweight int ID) |
| Data | MonoBehaviour fields | IComponentData struct |
| Logic | MonoBehaviour.Update() | ISystem.OnUpdate() |
| Grouping | Transform hierarchy | Archetypes (component combos) |
When to use ECS: 10,000+ entities with similar behavior (bullets, particles, NPCs, terrain chunks). Below that threshold, the traditional model is simpler and usually fast enough.
2. Components
Components are plain structs. No methods, no inheritance, no managed types.
// Simple data component
public struct Health : IComponentData
{
public float Current;
public float Max;
}
// Tag component (zero-size, used for filtering)
public struct EnemyTag : IComponentData { }
// Buffer element (variable-length per-entity data)
[InternalBufferCapacity(8)]
public struct DamageBufferElement : IBufferElementData
{
public float Value;
public Entity Source;
}
// Shared component (same value shared across many entities - use sparingly)
public struct TeamId : ISharedComponentData
{
public int Value;
}
// Enableable component (toggled on/off without structural changes)
public struct Stunned : IComponentData, IEnableableComponent { }
Rules:
- No `class` types, `string`, or arrays inside components (breaks Burst)
- Use `FixedString64Bytes` instead of `string`
- Use `DynamicBuffer<T>` instead of arrays
- Shared components cause archetype fragmentation - use only for truly shared data
3. Systems
Systems contain all logic. They query for entities with specific component combos.
[BurstCompile]
public partial struct DamageSystem : ISystem
{
[BurstCompile]
public void OnUpdate(ref SystemState state)
{
var ecb = new EntityCommandBuffer(Allocator.Temp);
foreach (var (health, buffer, entity) in
SystemAPI.Query<RefRW<Health>, DynamicBuffer<DamageBufferElement>>()
.WithEntityAccess())
{
foreach (var dmg in buffer)
health.ValueRW.Current -= dmg.Value;
buffer.Clear();
if (health.ValueRO.Current <= 0f)
ecb.DestroyEntity(entity);
}
ecb.Playback(state.EntityManager);
ecb.Dispose();
}
}
System ordering: Use `[UpdateBefore(typeof(OtherSystem))]` and `[UpdateAfter(typeof(OtherSystem))]` attributes. Group related systems with `[UpdateInGroup(typeof(SimulationSystemGroup))]`.
Built-in system groups (execution order):
- `InitializationSystemGroup`
- `SimulationSystemGroup` (default - most gameplay systems go here)
- `PresentationSystemGroup` (rendering-related)
4. Entity Command Buffers (ECB)
Structural changes (create/destroy entity, add/remove component) cannot happen during iteration. Use ECBs to defer them.
// From a system using the built-in ECB system
[BurstCompile]
public partial struct SpawnSystem : ISystem
{
[BurstCompile]
public void OnUpdate(ref SystemState state)
{
var ecbSingleton = SystemAPI.GetSingleton<BeginSimulationEntityCommandBufferSystem.Singleton>();
var ecb = ecbSingleton.CreateCommandBuffer(state.WorldUnmanaged);
foreach (var (spawner, transform) in
SystemAPI.Query<RefRW<Spawner>, RefRO<LocalTransform>>())
{
spawner.ValueRW.Timer -= SystemAPI.Time.DeltaTime;
if (spawner.ValueRO.Timer <= 0f)
{
Entity e = ecb.Instantiate(spawner.ValueRO.Prefab);
ecb.SetComponent(e, LocalTransform.FromPosition(transform.ValueRO.Position));
spawner.ValueRW.Timer = spawner.ValueRO.Interval;
}
}
}
}
ECB timing: Use `BeginSimulationEntityCommandBufferSystem` for changes that should apply at the start of the next frame. Use `EndSimulationEntityCommandBufferSystem` for end-of-frame cleanup.
5. Jobs and Burst
For CPU-intensive work, schedule jobs that run on worker threads.
[BurstCompile]
public partial struct MoveJob : IJobEntity
{
public float DeltaTime;
public void Execute(ref LocalTransform transform, in MoveSpeed speed)
{
transform.Position += transform.Forward() * speed.Value * DeltaTime;
}
}
// In the system:
[BurstCompile]
public partial struct MoveSystem : ISystem
{
[BurstCompile]
public void OnUpdate(ref SystemState state)
{
new MoveJob { DeltaTime = SystemAPI.Time.DeltaTime }.ScheduleParallel();
}
}
Burst constraints:
- No managed types (classes, strings, delegates)
- No try/catch blocks
- No virtual method calls
- Use `NativeArray`, `NativeList`, `NativeHashMap` for collections
- Use the `[ReadOnly]` attribute on job fields that are read-only (enables parallelism)
6. Baking (Authoring to Runtime)
Baking converts GameObjects in subscenes into ECS entities at build time.
// Authoring component (MonoBehaviour in the Editor)
public class SpeedAuthoring : MonoBehaviour
{
public float speed = 10f;
}
// Baker converts it to ECS component
public class SpeedBaker : Baker<SpeedAuthoring>
{
public override void Bake(SpeedAuthoring authoring)
{
Entity entity = GetEntity(TransformUsageFlags.Dynamic);
AddComponent(entity, new MoveSpeed { Value = authoring.speed });
}
}
Subscenes are the entry point. Place authored GameObjects in a subscene, and Unity bakes them to entities. At runtime, the subscene loads as serialized entity data - much faster than instantiating GameObjects.
7. Hybrid Approach
You don't have to go all-in on ECS. Common hybrid patterns:
- ECS for simulation, GameObjects for presentation - entities hold data, companion GameObjects hold meshes and particle systems
- Managed components for bridging - a managed (class) `IComponentData` can hold managed references but loses Burst/Jobs compatibility
- SystemBase (managed system) - use when you need access to managed APIs (UnityEngine.Object, MonoBehaviour references)
// Managed system - no Burst, but can access managed types
public partial class AudioSystem : SystemBase
{
protected override void OnUpdate()
{
Entities.ForEach((ref PlaySoundRequest request) =>
{
AudioSource.PlayClipAtPoint(request.Clip, request.Position);
}).WithoutBurst().Run(); // Run() = main thread, no jobs
}
}
Use the hybrid approach to incrementally adopt ECS. Don't rewrite your entire game - identify the hot systems (movement, AI, spawning) and migrate those first.
8. Performance Checklist
- Components are blittable structs (no managed types)
- Systems use the `[BurstCompile]` attribute
- Jobs use `ScheduleParallel()` when no write conflicts exist
- Read-only job fields are marked `[ReadOnly]`
- ECBs are used for structural changes (not direct EntityManager calls in loops)
- Shared components are used sparingly (each unique value creates an archetype)
- Queries use `WithAll`, `WithNone`, `WithAny` to narrow scope
- NativeContainers are disposed after use (or use `Allocator.Temp`)
physics-advanced.md
Physics Advanced Reference
1. Collision Detection Modes
| Mode | Use for | Cost |
|---|---|---|
| Discrete | Slow objects (characters, crates) | Cheapest |
| Continuous | Fast objects that must not tunnel (bullets) | Medium |
| Continuous Dynamic | Fast objects hitting other fast objects | Expensive |
| Continuous Speculative | Fast kinematic objects | Medium |
Set on the Rigidbody component. Default is Discrete - only change when you observe tunneling artifacts.
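The mode can also be set from code when spawning fast objects (the `projectile` variable is illustrative):

```csharp
// Prevent a fast projectile from tunneling through thin colliders.
Rigidbody rb = projectile.GetComponent<Rigidbody>();
rb.collisionDetectionMode = CollisionDetectionMode.ContinuousDynamic;
```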
2. Raycasting Strategies
// Basic raycast
if (Physics.Raycast(origin, direction, out RaycastHit hit, maxDistance, layerMask))
{
Debug.Log($"Hit {hit.collider.name} at {hit.point}");
}
// Non-allocating multi-hit (pre-allocate buffer)
private readonly RaycastHit[] _hitBuffer = new RaycastHit[16];
public int RaycastNonAlloc(Vector3 origin, Vector3 dir, float dist, int layer)
{
return Physics.RaycastNonAlloc(origin, dir, _hitBuffer, dist, layer);
}
// SphereCast for "fat" raycasts (useful for aim assist)
Physics.SphereCast(origin, radius: 0.5f, direction, out RaycastHit hit, maxDistance);
// OverlapSphere for area detection (non-alloc)
private readonly Collider[] _overlapBuffer = new Collider[32];
public int DetectNearby(Vector3 center, float radius, int layer)
{
return Physics.OverlapSphereNonAlloc(center, radius, _overlapBuffer, layer);
}
Performance rules:
- Always pass a `layerMask` to avoid testing every collider in the scene
- Use `NonAlloc` variants to avoid GC allocations
- Pre-allocate buffers as class fields, not local variables
- Use `QueryTriggerInteraction.Ignore` when you don't need trigger hits
3. Trigger Volumes
Triggers detect overlap without physical collision response. Use for pickup zones, damage areas, quest triggers, and proximity detection.
// Requires a Collider with "Is Trigger" checked on the GameObject
public class DamageZone : MonoBehaviour
{
[SerializeField] private float damagePerSecond = 10f;
private void OnTriggerStay(Collider other)
{
if (other.TryGetComponent<Health>(out var health))
health.TakeDamage(damagePerSecond * Time.fixedDeltaTime);
}
}

Trigger callbacks require:
- At least one of the two objects has a Rigidbody
- At least one collider has "Is Trigger" enabled
- Both layers must be enabled in the collision matrix
| Callback | When fired |
|---|---|
| OnTriggerEnter | First frame of overlap |
| OnTriggerStay | Every FixedUpdate while overlapping |
| OnTriggerExit | First frame after overlap ends |
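A minimal sketch of the Enter/Exit callbacks from the table, here for a pickup zone — the "Player" tag is an assumption for illustration:

```csharp
// Sketch: pickup zone using the Enter/Exit callbacks above.
// Assumes the player GameObject carries the "Player" tag.
public class PickupZone : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
            Debug.Log("Player entered pickup range"); // fires once, first overlap frame
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
            Debug.Log("Player left pickup range"); // fires once, after overlap ends
    }
}
```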
4. Physics Layers and Collision Matrix
Use layers to control which objects can collide. This is the single most impactful physics optimization.
Setup:
- Define layers in Project Settings > Tags and Layers (up to 32 layers)
- Configure collisions in Project Settings > Physics > Layer Collision Matrix
- Uncheck every pair that should never interact
Common layer setup:
| Layer | Collides with |
|---|---|
| Player | Environment, Enemy, Pickup, Trigger |
| Enemy | Environment, Player, EnemyProjectile |
| PlayerBullet | Enemy, Environment |
| EnemyProjectile | Player, Environment |
| Trigger | Player only |
| UI | Nothing (raycasts only) |
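The matrix is normally configured in the editor, but as a sketch it can also be adjusted at runtime — the layer names here assume the table above:

```csharp
// Sketch: disable collisions between two layers at runtime.
// Equivalent to unchecking the pair in the Layer Collision Matrix.
int enemyLayer = LayerMask.NameToLayer("Enemy");
int projectileLayer = LayerMask.NameToLayer("EnemyProjectile");
Physics.IgnoreLayerCollision(enemyLayer, projectileLayer, true);
```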
// Set layer in code
gameObject.layer = LayerMask.NameToLayer("PlayerBullet");
// Create layermask for raycasts
int mask = LayerMask.GetMask("Enemy", "Environment");
Physics.Raycast(ray, out hit, 100f, mask);

5. Joints
Joints constrain Rigidbody movement relative to another body or a point in space.
| Joint | Use for |
|---|---|
| Fixed Joint | Gluing objects together (breakable walls, attached items) |
| Hinge Joint | Doors, levers, rotating platforms |
| Spring Joint | Bouncy connections, suspension, grappling hooks |
| Configurable Joint | Custom constraints on any axis (ragdolls, vehicles) |
| Character Joint | Ragdoll limbs (limits on each rotation axis) |
// Create a spring joint at runtime
var spring = gameObject.AddComponent<SpringJoint>();
spring.connectedBody = targetRigidbody;
spring.spring = 500f; // stiffness
spring.damper = 50f; // damping force
spring.maxDistance = 2f; // rest length
spring.breakForce = 1000f; // force to break the joint

Ragdoll tip: Use Unity's ragdoll wizard (GameObject > 3D Object > Ragdoll) for initial setup, then tune joint limits and mass distribution manually. Set all limb Rigidbodies to kinematic during animation, then enable physics on death.
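The animation-to-ragdoll handover in that tip can be sketched like this, assuming the limb Rigidbodies were made kinematic up front:

```csharp
// Sketch: hand the character over to physics on death.
public void EnableRagdoll(Animator animator)
{
    animator.enabled = false; // stop the animation from driving the bones
    foreach (var rb in animator.GetComponentsInChildren<Rigidbody>())
        rb.isKinematic = false; // let physics take over each limb
}
```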
6. Physics Materials
PhysicMaterial controls friction and bounciness on colliders.
| Property | Range | Effect |
|---|---|---|
| Dynamic Friction | 0-1 | Friction while moving |
| Static Friction | 0-1 | Friction to start moving |
| Bounciness | 0-1 | 0 = no bounce, 1 = full bounce |
| Friction Combine | Average/Min/Max/Multiply | How two materials combine |
| Bounce Combine | Average/Min/Max/Multiply | How two materials combine |
// Create physics material in code
var material = new PhysicMaterial("Ice")
{
dynamicFriction = 0.05f,
staticFriction = 0.05f,
bounciness = 0f,
frictionCombine = PhysicMaterialCombine.Minimum
};
collider.material = material;

7. FixedUpdate vs Update for Physics
| Action | Where | Why |
|---|---|---|
| Rigidbody.MovePosition | FixedUpdate | Syncs with physics timestep |
| Rigidbody.AddForce | FixedUpdate | Force accumulates per physics step |
| Input polling | Update | Input is sampled per frame, not per physics tick |
| Camera follow | LateUpdate | After all movement is resolved |
| Raycast for aim | Update | Matches visual frame, not physics frame |
// Common pattern: read input in Update, apply in FixedUpdate
private Vector2 _inputDir;
private void Update()
{
_inputDir = new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));
}
private void FixedUpdate()
{
_rb.AddForce(new Vector3(_inputDir.x, 0, _inputDir.y) * moveForce);
}

Time.fixedDeltaTime defaults to 0.02s (50Hz). Increase it for performance-sensitive games (0.03-0.04), decrease it for physics-heavy simulations (0.01). Never set it below 0.005 - physics CPU cost scales linearly with the update rate.
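As a small sketch, the timestep can also be set from code at startup, equivalent to changing Fixed Timestep in Project Settings > Time:

```csharp
// Sketch: set the physics timestep in code at startup.
private void Awake()
{
    Time.fixedDeltaTime = 0.02f; // 50 physics steps per second (the default)
}
```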
8. 2D vs 3D Physics
Unity has two completely separate physics engines. They do not interact.
| Feature | 3D (PhysX) | 2D (Box2D) |
|---|---|---|
| Rigidbody | Rigidbody | Rigidbody2D |
| Collider | BoxCollider, SphereCollider | BoxCollider2D, CircleCollider2D |
| Raycast | Physics.Raycast | Physics2D.Raycast |
| Callbacks | OnCollisionEnter(Collision) | OnCollisionEnter2D(Collision2D) |
| Gravity | 3-axis | 2-axis (default Y only) |
Do not mix 2D and 3D physics components on the same GameObject. A Rigidbody2D ignores a BoxCollider (3D) and vice versa.
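The API difference shows up in raycasting too — a sketch of the 2D equivalent of the filtered raycast from earlier:

```csharp
// Sketch: 2D raycast. Note the different return convention -
// Physics2D.Raycast returns a RaycastHit2D struct (checked via its
// collider) rather than a bool with an out parameter.
RaycastHit2D hit = Physics2D.Raycast(origin, Vector2.right, 10f, mask);
if (hit.collider != null)
    Debug.Log($"Hit {hit.collider.name} at {hit.point}");
```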
shader-programming.md
Shader Programming Reference
1. Render Pipeline Compatibility
Shaders are pipeline-specific. A shader written for Built-in won't work in URP/HDRP.
| Pipeline | Shader Language | Include Path | Lit Base Shader |
|---|---|---|---|
| Built-in | CG/HLSL | UnityCG.cginc | Standard |
| URP | HLSL | Packages/com.unity.render-pipelines.universal/ShaderLibrary/ | Universal Render Pipeline/Lit |
| HDRP | HLSL | Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ | HDRP/Lit |
Rule: Check which pipeline the project uses before writing any shader code. URP is the most common choice for cross-platform games.
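That check can be sketched in code — `GraphicsSettings.currentRenderPipeline` is null when the Built-in pipeline is active:

```csharp
// Sketch: detect the active render pipeline at runtime before
// assuming shader compatibility. Requires UnityEngine.Rendering.
var rp = UnityEngine.Rendering.GraphicsSettings.currentRenderPipeline;
string pipeline = rp == null ? "Built-in" : rp.GetType().Name;
Debug.Log($"Active render pipeline: {pipeline}");
```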
2. ShaderLab Structure
Every Unity shader follows the ShaderLab wrapper format:
Shader "Category/ShaderName"
{
Properties
{
// Exposed to Inspector and material API
_PropertyName ("Display Name", Type) = DefaultValue
}
SubShader
{
Tags { "RenderType"="Opaque" "Queue"="Geometry" }
Pass
{
// Shader program goes here (HLSL or CG)
}
}
FallBack "Diffuse" // fallback if hardware can't run this shader
}

Property types:
| ShaderLab Type | C# Type | Example |
|---|---|---|
| Color | Color | _Color ("Tint", Color) = (1,1,1,1) |
| Float | float | _Glossiness ("Smooth", Range(0,1)) = 0.5 |
| 2D | Texture2D | _MainTex ("Albedo", 2D) = "white" {} |
| Vector | Vector4 | _Wind ("Wind Dir", Vector) = (1,0,0,0) |
| Int | int | _Stencil ("Stencil", Int) = 0 |
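These properties map to the material API in C# — a sketch using the property names from the table, with IDs cached to avoid per-call string hashing:

```csharp
// Sketch: drive shader properties from C#. Cache property IDs once -
// the string-based overloads hash the name on every call.
private static readonly int ColorId = Shader.PropertyToID("_Color");
private static readonly int GlossId = Shader.PropertyToID("_Glossiness");

private void ApplyTint(Material mat)
{
    mat.SetColor(ColorId, Color.red);
    mat.SetFloat(GlossId, 0.8f);
}
```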
3. URP Lit Shader from Scratch
A complete URP-compatible lit shader with diffuse + normal mapping:
Shader "Custom/URPLit"
{
Properties
{
_BaseMap ("Albedo", 2D) = "white" {}
_BaseColor ("Color", Color) = (1,1,1,1)
_BumpMap ("Normal Map", 2D) = "bump" {}
_BumpScale ("Normal Scale", Float) = 1.0
_Metallic ("Metallic", Range(0,1)) = 0.0
_Smoothness ("Smoothness", Range(0,1)) = 0.5
}
SubShader
{
Tags { "RenderType"="Opaque" "RenderPipeline"="UniversalPipeline" "Queue"="Geometry" }
Pass
{
Name "ForwardLit"
Tags { "LightMode"="UniversalForward" }
HLSLPROGRAM
#pragma vertex vert
#pragma fragment frag
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS
#pragma multi_compile _ _ADDITIONAL_LIGHTS
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"
struct Attributes
{
float4 positionOS : POSITION;
float3 normalOS : NORMAL;
float4 tangentOS : TANGENT;
float2 uv : TEXCOORD0;
};
struct Varyings
{
float4 positionCS : SV_POSITION;
float2 uv : TEXCOORD0;
float3 positionWS : TEXCOORD1;
float3 normalWS : TEXCOORD2;
float3 tangentWS : TEXCOORD3;
float3 bitangentWS: TEXCOORD4;
};
TEXTURE2D(_BaseMap); SAMPLER(sampler_BaseMap);
TEXTURE2D(_BumpMap); SAMPLER(sampler_BumpMap);
CBUFFER_START(UnityPerMaterial)
float4 _BaseMap_ST;
float4 _BaseColor;
float _BumpScale;
float _Metallic;
float _Smoothness;
CBUFFER_END
Varyings vert(Attributes IN)
{
Varyings OUT;
VertexPositionInputs posInputs = GetVertexPositionInputs(IN.positionOS.xyz);
VertexNormalInputs normInputs = GetVertexNormalInputs(IN.normalOS, IN.tangentOS);
OUT.positionCS = posInputs.positionCS;
OUT.positionWS = posInputs.positionWS;
OUT.uv = TRANSFORM_TEX(IN.uv, _BaseMap);
OUT.normalWS = normInputs.normalWS;
OUT.tangentWS = normInputs.tangentWS;
OUT.bitangentWS = normInputs.bitangentWS;
return OUT;
}
half4 frag(Varyings IN) : SV_Target
{
half4 albedo = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, IN.uv) * _BaseColor;
half3 normalTS = UnpackNormalScale(
SAMPLE_TEXTURE2D(_BumpMap, sampler_BumpMap, IN.uv), _BumpScale);
half3 normalWS = TransformTangentToWorld(normalTS,
half3x3(IN.tangentWS, IN.bitangentWS, IN.normalWS));
InputData inputData = (InputData)0;
inputData.positionWS = IN.positionWS;
inputData.normalizedScreenSpaceUV = GetNormalizedScreenSpaceUV(IN.positionCS);
inputData.normalWS = normalize(normalWS);
inputData.viewDirectionWS = GetWorldSpaceNormalizeViewDir(IN.positionWS);
SurfaceData surfaceData = (SurfaceData)0;
surfaceData.albedo = albedo.rgb;
surfaceData.metallic = _Metallic;
surfaceData.smoothness = _Smoothness;
surfaceData.alpha = albedo.a;
return UniversalFragmentPBR(inputData, surfaceData);
}
ENDHLSL
}
}
}

4. Shader Graph
Shader Graph is the node-based visual shader editor. Preferred for:
- Artists who don't write HLSL
- Rapid prototyping of visual effects
- Shaders that need frequent iteration
Key nodes:
| Node | Purpose |
|---|---|
| Sample Texture 2D | Read a texture at UV coordinates |
| Fresnel Effect | Edge glow / rim lighting |
| Noise (Gradient/Simple/Voronoi) | Procedural patterns |
| Lerp | Blend between two values |
| Time | Animate properties |
| UV | Access/modify texture coordinates |
| Normal Vector | Surface normal in world/object/tangent space |
| Custom Function | Embed raw HLSL for operations Shader Graph can't express |
Custom Function node pattern:
// Create a .hlsl file, reference it in Custom Function node
void MyCustomFunction_float(float3 In, out float3 Out)
{
Out = In * 0.5 + 0.5; // remap -1..1 to 0..1
}

5. SRP Batcher Compatibility
The SRP Batcher reduces draw call overhead by batching materials that share the same shader variant. To be compatible:
- All per-material properties must be inside CBUFFER_START(UnityPerMaterial)
- All per-object built-in properties must be inside CBUFFER_START(UnityPerDraw)
- Do not use MaterialPropertyBlock (it breaks SRP Batcher for that renderer)
Check compatibility in Frame Debugger > SRP Batcher column.
6. GPU Instancing
For rendering many copies of the same mesh (trees, rocks, grass):
#pragma multi_compile_instancing
// In vertex shader
UNITY_SETUP_INSTANCE_ID(IN);
UNITY_TRANSFER_INSTANCE_ID(IN, OUT);
// Per-instance properties
UNITY_INSTANCING_BUFFER_START(Props)
UNITY_DEFINE_INSTANCED_PROP(float4, _Color)
UNITY_INSTANCING_BUFFER_END(Props)
// Access in fragment
half4 col = UNITY_ACCESS_INSTANCED_PROP(Props, _Color);

When to use instancing vs SRP Batcher:
- SRP Batcher: different materials, same shader variant (default, always prefer)
- GPU Instancing: same material + mesh, per-instance property variation
- Both: can't coexist on the same draw call; SRP Batcher takes priority
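On the C# side, instanced rendering can be sketched like this — `mesh` and `material` are assumed references, and the material's shader must include the instancing pragma above:

```csharp
// Sketch: render many copies of one mesh with GPU instancing.
// DrawMeshInstanced accepts up to 1023 matrices per call.
Matrix4x4[] matrices = new Matrix4x4[500];
for (int i = 0; i < matrices.Length; i++)
    matrices[i] = Matrix4x4.TRS(
        new Vector3(i % 25, 0f, i / 25), Quaternion.identity, Vector3.one);

material.enableInstancing = true; // also a checkbox on the material asset
Graphics.DrawMeshInstanced(mesh, 0, material, matrices);
```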
7. Common Shader Techniques
Dissolve Effect
// In Properties
_DissolveAmount ("Dissolve", Range(0,1)) = 0
_DissolveTex ("Dissolve Noise", 2D) = "white" {}
// In fragment
half noise = SAMPLE_TEXTURE2D(_DissolveTex, sampler_DissolveTex, IN.uv).r;
clip(noise - _DissolveAmount); // discard pixel if below threshold

Scrolling UV (Water, Lava)
float2 scrolledUV = IN.uv + _Time.y * _ScrollSpeed;
half4 tex = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, scrolledUV);

Rim Lighting / Fresnel
float3 viewDir = normalize(_WorldSpaceCameraPos - IN.positionWS);
float rim = 1.0 - saturate(dot(viewDir, IN.normalWS));
rim = pow(rim, _RimPower);
half3 rimColor = rim * _RimColor.rgb;

Vertex Displacement (Wind)
// In vertex shader
float wave = sin(_Time.y * _WindSpeed + IN.positionOS.x * _WindFrequency);
OUT.positionCS = TransformObjectToHClip(
    IN.positionOS.xyz + float3(wave * _WindStrength, 0, 0));

8. Debugging Shaders
| Tool | Purpose |
|---|---|
| Frame Debugger | Inspect every draw call, see shader state |
| RenderDoc | GPU-level debugging, shader stepping |
| Shader compilation errors | Check Console; line numbers reference the HLSL block |
| #pragma enable_d3d11_debug_symbols | Add debug info for RenderDoc |
| Output solid color | Replace frag return with return half4(1,0,0,1) to verify the shader runs |
| Visualize normals | return half4(IN.normalWS * 0.5 + 0.5, 1) |
ui-toolkit.md
UI Toolkit Reference
1. Architecture Overview
UI Toolkit is Unity's retained-mode UI system inspired by web technologies.
| Concept | Web Equivalent | Unity Name |
|---|---|---|
| HTML | DOM elements | UXML (VisualElement tree) |
| CSS | Stylesheets | USS (Unity Style Sheets) |
| JavaScript | Event handlers | C# (UQuery + event callbacks) |
| Shadow DOM | Component encapsulation | Custom VisualElements |
When to use UI Toolkit vs UGUI:
- UI Toolkit: Editor extensions, HUD overlays, menus, settings screens, new projects
- UGUI: World-space UI in 3D scenes, projects already using UGUI extensively, features not yet in UI Toolkit (e.g., some advanced text effects)
2. UXML Layout
UXML defines the visual hierarchy. Think of it as HTML for Unity.
<ui:UXML xmlns:ui="UnityEngine.UIElements" xmlns:uie="UnityEditor.UIElements">
<ui:VisualElement name="root" class="container">
<ui:Label text="Player Stats" class="header" />
<ui:VisualElement class="stat-row">
<ui:Label text="Health" class="stat-label" />
<ui:ProgressBar name="health-bar" value="75" high-value="100" />
</ui:VisualElement>
<ui:Button name="heal-btn" text="Heal" class="action-btn" />
<ui:ScrollView name="inventory-scroll">
<ui:ListView name="item-list" />
</ui:ScrollView>
</ui:VisualElement>
</ui:UXML>

Built-in elements:
| Element | Use for |
|---|---|
| VisualElement | Generic container (like <div>) |
| Label | Text display |
| Button | Clickable actions |
| TextField | Text input |
| Toggle | Boolean checkbox |
| Slider / SliderInt | Numeric range input |
| ProgressBar | Value display bar |
| ScrollView | Scrollable container |
| ListView | Virtualized list (handles 10k+ items) |
| Foldout | Collapsible section |
| DropdownField | Select from options |
| RadioButton / RadioButtonGroup | Exclusive selection |
3. USS Styling
USS follows CSS syntax with Unity-specific properties prefixed with -unity-.
/* Base container */
.container {
flex-grow: 1;
padding: 16px;
background-color: rgba(0, 0, 0, 0.8);
}
/* Flexbox layout (default is column) */
.stat-row {
flex-direction: row;
justify-content: space-between;
align-items: center;
margin-bottom: 8px;
}
/* Typography */
.header {
font-size: 24px;
-unity-font-style: bold;
-unity-text-align: middle-center;
color: rgb(255, 220, 100);
margin-bottom: 16px;
}
/* Buttons with hover state */
.action-btn {
height: 40px;
border-radius: 6px;
background-color: rgb(60, 120, 200);
color: white;
-unity-font-style: bold;
transition: background-color 0.2s ease;
}
.action-btn:hover {
background-color: rgb(80, 150, 240);
}
.action-btn:active {
background-color: rgb(40, 90, 160);
}

Key differences from CSS:
- No px units in USS - all numeric values are unitless (interpreted as pixels) or use %
- Use the -unity- prefix for Unity-specific properties
- Flexbox is the only layout model (no grid, no float)
- Default flex-direction is column (not row)
- No media queries - use C# to adapt to screen size
- Selectors: .class, #name, Type, :hover, :active, :focus, :checked
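Since there are no media queries, screen-size adaptation is done from C# — a sketch that toggles a hypothetical "compact" USS class when the root element's resolved width changes:

```csharp
// Sketch: media-query substitute. "compact" is an assumed USS class.
root.RegisterCallback<GeometryChangedEvent>(evt =>
{
    if (evt.newRect.width < 600f)
        root.AddToClassList("compact");
    else
        root.RemoveFromClassList("compact");
});
```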
4. C# Bindings and Events
Query elements and wire up logic in C#.
public class StatsUI : MonoBehaviour
{
[SerializeField] private UIDocument uiDocument;
[SerializeField] private StyleSheet additionalStyles;
private ProgressBar _healthBar;
private Button _healBtn;
private ListView _itemList;
private void OnEnable()
{
var root = uiDocument.rootVisualElement;
// Optional: add stylesheet at runtime
root.styleSheets.Add(additionalStyles);
// Query by name (# selector equivalent)
_healthBar = root.Q<ProgressBar>("health-bar");
_healBtn = root.Q<Button>("heal-btn");
_itemList = root.Q<ListView>("item-list");
// Query by class (. selector equivalent)
var allLabels = root.Query<Label>(className: "stat-label").ToList();
// Register events
_healBtn.clicked += OnHealClicked;
// Generic event registration
_healBtn.RegisterCallback<PointerEnterEvent>(OnHoverStart);
_healBtn.RegisterCallback<PointerLeaveEvent>(OnHoverEnd);
}
private void OnDisable()
{
_healBtn.clicked -= OnHealClicked;
_healBtn.UnregisterCallback<PointerEnterEvent>(OnHoverStart);
_healBtn.UnregisterCallback<PointerLeaveEvent>(OnHoverEnd);
}
private void OnHealClicked() => Debug.Log("Heal!");
private void OnHoverStart(PointerEnterEvent evt) => Debug.Log("Hover");
private void OnHoverEnd(PointerLeaveEvent evt) => Debug.Log("Leave");
public void UpdateHealth(int current, int max)
{
_healthBar.value = current;
_healthBar.highValue = max;
_healthBar.title = $"{current}/{max}";
}
}

5. ListView (Virtualized)
ListView only creates VisualElements for visible rows. Essential for large lists.
public class InventoryUI : MonoBehaviour
{
[SerializeField] private UIDocument uiDocument;
[SerializeField] private VisualTreeAsset itemTemplate; // UXML for one row
private List<ItemData> _items;
private ListView _listView;
private void OnEnable()
{
_listView = uiDocument.rootVisualElement.Q<ListView>("item-list");
_listView.makeItem = () => itemTemplate.Instantiate();
_listView.bindItem = (element, index) =>
{
var item = _items[index];
element.Q<Label>("item-name").text = item.Name;
element.Q<Label>("item-count").text = $"x{item.Count}";
};
_listView.itemsSource = _items;
_listView.fixedItemHeight = 40; // required for virtualization
_listView.selectionType = SelectionType.Single;
_listView.selectionChanged += OnSelectionChanged;
}
private void OnSelectionChanged(IEnumerable<object> selection)
{
foreach (ItemData item in selection)
Debug.Log($"Selected: {item.Name}");
}
public void RefreshList()
{
_listView.RefreshItems(); // call after data changes
}
}

6. Custom Controls
Create reusable UI components by extending VisualElement.
// Custom control with UXML attribute support
[UxmlElement]
public partial class HealthBar : VisualElement
{
[UxmlAttribute]
public float MaxHealth { get; set; } = 100f;
[UxmlAttribute]
public float CurrentHealth { get; set; } = 100f;
private VisualElement _fill;
private Label _label;
public HealthBar()
{
// Build internal structure
var container = new VisualElement();
container.AddToClassList("health-container");
_fill = new VisualElement();
_fill.AddToClassList("health-fill");
container.Add(_fill);
_label = new Label();
_label.AddToClassList("health-label");
container.Add(_label);
Add(container);
RegisterCallback<AttachToPanelEvent>(OnAttach);
}
private void OnAttach(AttachToPanelEvent evt) => Refresh();
public void SetHealth(float current)
{
CurrentHealth = Mathf.Clamp(current, 0, MaxHealth);
Refresh();
}
private void Refresh()
{
float pct = MaxHealth > 0 ? CurrentHealth / MaxHealth * 100f : 0f;
_fill.style.width = new Length(pct, LengthUnit.Percent);
_label.text = $"{CurrentHealth:F0}/{MaxHealth:F0}";
}
}

Use in UXML after the custom control is defined:
<HealthBar max-health="200" current-health="150" />

7. Transitions and Animations
USS supports transitions for smooth property changes:
.panel {
translate: -100% 0;
opacity: 0;
transition: translate 0.3s ease-out, opacity 0.3s ease-out;
}
.panel.visible {
translate: 0 0;
opacity: 1;
}

// Trigger transition by toggling class
panel.AddToClassList("visible"); // slides in
panel.RemoveFromClassList("visible"); // slides out

For complex animations, use schedule.Execute or VisualElement.experimental.animation:
// Delayed execution
element.schedule.Execute(() => element.AddToClassList("visible"))
.StartingIn(500); // 500ms delay
// Value animation
element.experimental.animation
.Start(0f, 1f, 300, (e, val) => e.style.opacity = val)
    .Ease(Easing.OutCubic);

8. Performance Tips
- Cache Q() results - string lookups are not free; query once in OnEnable
- Use ListView for lists > 20 items - it virtualizes, ScrollView does not
- Minimize style recalculations - batch class changes, avoid toggling styles per frame
- Use USS transitions - they run on the UI thread efficiently vs manual animation
- Avoid VisualElement allocation in Update - create elements once, show/hide with display: none
- Set pickingMode = PickingMode.Ignore on decorative elements to skip hit testing
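The last two tips can be sketched together — `element` and `decorativeBackground` are assumed references queried earlier:

```csharp
// Sketch: hide instead of destroying, and skip hit testing on decoration.
element.style.display = DisplayStyle.None; // removed from layout, cheap to re-show
element.style.display = DisplayStyle.Flex; // show again

decorativeBackground.pickingMode = PickingMode.Ignore; // no pointer hit testing
```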