Creating UI in Unity
Figma > Unity plugin
What Is the Input System Event System?
References for UI design in XR
Meta XR SDKs
Meta XR SDKs are individual SDK packages that provide functionality for XR applications built with virtual reality or mixed reality components for Meta Quest devices. These SDKs offer features that enable you to create immersive user experiences, facilitate social connection, and optimize display hardware.
Meta XR Development Tools
Meta XR Development Tools are software packages that enable you to develop, build, and test XR applications more quickly.
Creating a slider UI in Unity
Reference: https://www.youtube.com/watch?v=yhB921bDLYA&t=208s&ab_channel=ValemTutorials
The tutorial above shows how to create a slider UI and add interaction in Unity.
1. Setup: Canvas and Event System
Step 1. Create a Canvas:
- Right-click in the Hierarchy → UI > Canvas.
- Unity automatically also creates an Event System (you'll need this later).
Step 2. Change the Canvas Render Mode:
- Select the Canvas.
- Set Render Mode to World Space (not Screen Space Overlay!).
- This lets you move, scale, and rotate the UI like any 3D object.
Step 3. Adjust the Canvas:
- Scale it down (e.g., 0.001 or smaller) so it fits naturally into the VR scene.
- Position it somewhere easy to access, like in front of the player.
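Steps 1-3 are normally done in the Editor, but the same setup can be sketched in code. A minimal, hedged example — the component name, the `canvas`/`player` fields, and the exact scale and distance values are illustrative assumptions, not from the tutorial:

```csharp
using UnityEngine;

// Sketch: configures an existing Canvas as a World Space VR panel at startup.
public class WorldSpaceCanvasSetup : MonoBehaviour
{
    [SerializeField] Canvas canvas;      // the UI Canvas from Step 1
    [SerializeField] Transform player;   // e.g. the XR Origin / camera rig

    void Start()
    {
        canvas.renderMode = RenderMode.WorldSpace;              // Step 2
        canvas.transform.localScale = Vector3.one * 0.001f;     // Step 3: shrink to scene scale
        // Place it within easy reach, in front of the player.
        canvas.transform.position = player.position + player.forward * 1.5f;
    }
}
```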
2. Build Your UI Elements
Step 4. Add UI components to the Canvas:
- Right-click on Canvas → UI > [Button, Slider, Text, Dropdown, Panel, etc.]
- Use TextMeshPro for better quality text:
- If asked, import TextMeshPro Essentials.
Step 5. Arrange elements:
- Use Unity’s Rect Tool (shortcut T) to resize/move UI parts.
- Example: Add a title (Text), a slider (for options like brightness), and a dropdown (for mode switching).
Step 6. Optional: Add background panels:
- Right-click Canvas → UI > Panel.
- Adjust panel color (e.g., black with transparency) for better readability.
3. Make UI Interactable in VR
Step 7. Update the Canvas component:
- Add Tracked Device Graphic Raycaster to the Canvas (replace the regular Graphic Raycaster).
Step 8. Update the Event System:
- Delete any existing Input Module.
- Add XR UI Input Module to the Event System.
Now the UI will recognize rays from VR controllers!
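Steps 7-8 are Inspector changes, but for reference they can also be sketched in code. This is a hedged sketch assuming the XR Interaction Toolkit is installed; `canvas` and `eventSystem` are illustrative fields you would assign yourself:

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Sketch: swaps a Canvas and Event System over to XR-aware components.
public class XRCanvasConverter : MonoBehaviour
{
    [SerializeField] Canvas canvas;
    [SerializeField] EventSystem eventSystem;

    void Start()
    {
        // Step 7: replace the regular Graphic Raycaster with the XR-aware one.
        var oldRaycaster = canvas.GetComponent<GraphicRaycaster>();
        if (oldRaycaster != null) Destroy(oldRaycaster);
        if (canvas.GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            canvas.gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();

        // Step 8: the Event System should end up with only the XR UI Input Module
        // (remove any leftover input module in the Inspector).
        if (eventSystem.GetComponent<XRUIInputModule>() == null)
            eventSystem.gameObject.AddComponent<XRUIInputModule>();
    }
}
```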
4. Fix Teleportation and Interaction Bugs
Step 9. Prevent teleportation when interacting with UI:
- When using ray teleportation, sometimes clicking UI also triggers teleport.
- In the Teleportation Ray components:
- Uncheck "Enable Interaction with UI GameObjects".
Optional Advanced: Use the ray interactor's TryGetHitInfo() to programmatically block teleportation while hovering over UI elements.
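The advanced option could look like the sketch below. It assumes XR Interaction Toolkit's XRRayInteractor; the `TeleportUIGuard` component name and the `uiRayInteractor`/`teleportRay` fields are illustrative, and the exact meaning of `isValidTarget` may vary by toolkit version:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: disables the teleport ray whenever the UI ray is pointing at a valid target.
public class TeleportUIGuard : MonoBehaviour
{
    [SerializeField] XRRayInteractor uiRayInteractor;  // the ray used for UI
    [SerializeField] GameObject teleportRay;           // the teleport interactor GameObject

    void Update()
    {
        bool hoveringUI = uiRayInteractor.TryGetHitInfo(
            out _, out _, out _, out bool isValidTarget) && isValidTarget;

        // Hide the teleport ray while hovering UI so a click can't teleport.
        teleportRay.SetActive(!hoveringUI);
    }
}
```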
5. Create Interactive UI Logic
Step 10. Hook up UI Elements:
- Example: Make a dropdown change your movement type:
- Use the dropdown's OnValueChanged() event to trigger a custom function.
- Example: Slider could adjust volume or brightness.
// Assumed fields: the turn providers from the XR Interaction Toolkit,
// assigned in the Inspector.
[SerializeField] ActionBasedContinuousTurnProvider continuousTurn;
[SerializeField] ActionBasedSnapTurnProvider snapTurn;

// Called by the dropdown's OnValueChanged(int); the index order
// matches the dropdown options.
public void SetTypeFromIndex(int index)
{
    continuousTurn.enabled = index == 0;  // Continuous Turn
    snapTurn.enabled = index == 1;        // Snap Turn
}
Hook this function into your UI dropdown inside the Inspector.
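The slider example works the same way; a minimal hedged sketch (the `VolumeControl` name and `audioSource` field are assumptions — wire SetVolume into the slider's On Value Changed in the Inspector):

```csharp
using UnityEngine;

// Sketch: drives an AudioSource's volume from a UI Slider (range 0-1).
public class VolumeControl : MonoBehaviour
{
    [SerializeField] AudioSource audioSource;

    // Hook into the Slider's OnValueChanged(float) in the Inspector.
    public void SetVolume(float value)
    {
        audioSource.volume = Mathf.Clamp01(value);
    }
}
```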
6. (Optional) Create a Floating VR Menu
Step 11. Create a Menu Manager script:
- Spawn the menu in front of the player with a button press (e.g., Left Controller Menu Button).
- Example logic:
// Assumed fields: showButton (InputActionProperty), menu (GameObject),
// head (Transform of the main camera), spawnDistance (float).
if (showButton.action.WasPressedThisFrame())
{
    menu.SetActive(!menu.activeSelf);
    menu.transform.position = head.position + head.forward * spawnDistance;
    menu.transform.LookAt(new Vector3(head.position.x, menu.transform.position.y, head.position.z));
}
This makes the menu:
- Appear in front of the player
- Face the player wherever they move
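The example logic can be fleshed out into a small Menu Manager component. A hedged sketch — the class name and serialized fields are assumptions based on the snippet, and the final forward-flip is an extra touch for canvases that appear mirrored:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: toggles a floating menu in front of the player and keeps it facing them.
public class MenuManager : MonoBehaviour
{
    [SerializeField] InputActionProperty showButton;  // e.g. the left controller's Menu button
    [SerializeField] GameObject menu;
    [SerializeField] Transform head;                  // the HMD / main camera transform
    [SerializeField] float spawnDistance = 2f;

    void Update()
    {
        if (showButton.action.WasPressedThisFrame())
        {
            menu.SetActive(!menu.activeSelf);
            menu.transform.position = head.position + head.forward * spawnDistance;
        }

        // Turn toward the player every frame, rotating only around the vertical axis.
        menu.transform.LookAt(new Vector3(head.position.x, menu.transform.position.y, head.position.z));
        menu.transform.forward *= -1; // flip if the canvas text appears mirrored
    }
}
```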
Figma > Unity plugin
https://github.com/Volorf/figma-ui-image
https://www.youtube.com/watch?v=AS7gq9zMKYo&ab_channel=OlegFrolov
There is a plugin you can use to import a design file from Figma.
Follow the instructions in the GitHub link.
*While following the instructions, find FigmaUIImage.cs, open it, and replace its contents with the code below.
using System;
using System.Collections;
using System.Collections.Specialized;
using System.Web;
using UnityEngine;
using UnityEngine.Networking;
using SimpleJSON;
using UnityEditor;
using UnityEngine.Events;
using UnityEngine.UI;

namespace Volorf.FigmaUIImage
{
    [ExecuteInEditMode]
    [RequireComponent(typeof(RawImage))]
    [AddComponentMenu("Volorf/Figma UI Image")]
    public class FigmaUIImage : MonoBehaviour, IFigmaImageUpdatable
    {
        public FigmaUIImageEvent OnUiImageUpdated = new FigmaUIImageEvent();
        public UnityEvent OnUploadingFailed = new UnityEvent();

        [SerializeField] float imageScale = 2f;
        [SerializeField] FigmaUIData figmaUIData;

        const string MainFigmaLinkPart = "https://www.figma.com/design/";
        const string BaseFigmaImageUrl = "https://api.figma.com/v1/images/";
        const string BaseFigmaDocumentUrl = "https://api.figma.com/v1/files/";

        string _figmaFileKey;
        bool _isLinkValid;

        public Texture texture
        {
            get { return _texture; }
            set
            {
                _texture = value;
                SetRawImage(value);
            }
        }

        Texture _texture = default;
        Texture _loadingTexture = default;
        Texture _defaultTexture = default;
        RawImage _rawImage;
        float _textureRatio;
        Vector2 _textureSize;
        string _figmaSelectionName;

        public FigmaUIData GetFigmaUIData() => figmaUIData;

        public void UpdateFigmaImage()
        {
            if (figmaUIData == null)
            {
                Debug.LogError("Add a Figma UI Data to the Figma UI Image");
                return;
            }

            if (figmaUIData.token.Length <= 0)
            {
                Debug.LogError("Add a Figma token to the Figma UI Data");
                return;
            }

            if (figmaUIData.figmaLink.Length <= 0)
            {
                Debug.LogError("Add a Figma Link to the Figma UI Data");
                return;
            }

#if UNITY_EDITOR
            texture = GetPreview("FigImagePlaceholder");
#endif
            _isLinkValid = true;
            SetImageFromFigma();
        }

        void SetFigmageName(string name)
        {
            transform.name = name;
        }

        public float GetScale() => imageScale;

        void Awake()
        {
            _rawImage = GetComponent<RawImage>();
#if UNITY_EDITOR
            _defaultTexture = GetPreview("FigImagePlaceholder");
            _loadingTexture = GetPreview("FigImageLoading");
#endif
            texture = _rawImage.texture == null ? _defaultTexture : _rawImage.texture;
        }

        void Start()
        {
            if (_rawImage.texture == null)
            {
                UpdateFigmaImage();
            }
        }

        public RawImage GetRawImage() => _rawImage;
        public Texture GetLoadingTexture() => _loadingTexture;
        public Texture GetDefaultTexture() => _defaultTexture;

        void SetImageFromFigma()
        {
            if (_rawImage == null) _rawImage = GetComponent<RawImage>();

            string fileKey = GetFileKey(figmaUIData.figmaLink);

            if (_isLinkValid)
            {
                string nodeId = GetNodeId(figmaUIData.figmaLink);

                string finalImageUrl = CombineImageUrl(BaseFigmaImageUrl, fileKey, nodeId, imageScale);
                StartCoroutine(RequestImageLinkFromFigma(finalImageUrl));

                string finalDocUrl = CombineDocumentUrl(BaseFigmaDocumentUrl, fileKey, nodeId);
                StartCoroutine(RequestDocumentFromFigma(finalDocUrl));
            }
            else
            {
#if UNITY_EDITOR
                texture = GetPreview("FigImagePlaceholder");
#endif
            }
        }

        // Builds https://api.figma.com/v1/images/<fileKey>?ids=<nodeId>&scale=<scale>
        string CombineImageUrl(string baseURL, string fileKey, string nodeId, float scale)
        {
            NameValueCollection parsedParams = HttpUtility.ParseQueryString(String.Empty);
            parsedParams.Add("ids", nodeId);
            parsedParams.Add("scale", scale.ToString());
            return baseURL + fileKey + "?" + parsedParams.ToString();
        }

        // Builds https://api.figma.com/v1/files/<fileKey>/nodes?ids=<nodeId>
        string CombineDocumentUrl(string baseURL, string fileKey, string nodeId)
        {
            NameValueCollection parsedParams = HttpUtility.ParseQueryString(String.Empty);
            parsedParams.Add("ids", nodeId);
            return baseURL + fileKey + "/nodes?" + parsedParams.ToString();
        }

        // The file key is the path segment right after figma.com/design/.
        string GetFileKey(string link)
        {
            string cutFirstPartFigmaLink = link.Replace(MainFigmaLinkPart, "");
            int removeIndex = cutFirstPartFigmaLink.IndexOf("/");

            if (removeIndex < 0)
            {
                Debug.LogError("Got an invalid link. Can't parse it.");
                _isLinkValid = false;
                return link;
            }

            return cutFirstPartFigmaLink.Substring(0, removeIndex);
        }

        string GetNodeId(string link)
        {
            Uri uri = new Uri(link);
            return HttpUtility.ParseQueryString(uri.Query).Get("node-id");
        }

        IEnumerator RequestImageLinkFromFigma(string url)
        {
            using (UnityWebRequest webRequest = UnityWebRequest.Get(url))
            {
                webRequest.SetRequestHeader("X-FIGMA-TOKEN", figmaUIData.token?.Trim());
                yield return webRequest.SendWebRequest();

                switch (webRequest.result)
                {
                    case UnityWebRequest.Result.ConnectionError:
                        OnUploadingFailed.Invoke();
                        break;
                    case UnityWebRequest.Result.DataProcessingError:
                        OnUploadingFailed.Invoke();
                        Debug.LogError("Error: " + webRequest.error);
                        break;
                    case UnityWebRequest.Result.ProtocolError:
                        OnUploadingFailed.Invoke();
                        Debug.LogError("HTTP Error: " + webRequest.error);
                        break;
                    case UnityWebRequest.Result.Success:
                        string js = webRequest.downloadHandler.text;
                        JSONNode info = JSON.Parse(js);
                        // info[1] is the "images" map; [0] is the URL of the requested node.
                        string linkToImage = info[1][0];
                        StartCoroutine(RequestImage(linkToImage));
                        break;
                }
            }
        }

        IEnumerator RequestDocumentFromFigma(string url)
        {
            using (UnityWebRequest webRequest = UnityWebRequest.Get(url))
            {
                webRequest.SetRequestHeader("X-FIGMA-TOKEN", figmaUIData.token?.Trim());
                yield return webRequest.SendWebRequest();

                switch (webRequest.result)
                {
                    case UnityWebRequest.Result.ConnectionError:
                        OnUploadingFailed.Invoke();
                        break;
                    case UnityWebRequest.Result.DataProcessingError:
                        OnUploadingFailed.Invoke();
                        Debug.LogError("Error: " + webRequest.error);
                        break;
                    case UnityWebRequest.Result.ProtocolError:
                        OnUploadingFailed.Invoke();
                        Debug.LogError("HTTP Error: " + webRequest.error);
                        break;
                    case UnityWebRequest.Result.Success:
                        string js = webRequest.downloadHandler.text;
                        JSONNode info = JSON.Parse(js);
                        // info[7] is "nodes"; drill down to the selected node's name.
                        _figmaSelectionName = info[7][0][0][1];
                        SetFigmageName(_figmaSelectionName);
                        break;
                }
            }
        }

        void SetRawImage(Texture t)
        {
            _rawImage.rectTransform.sizeDelta = new Vector2(t.width / imageScale, t.height / imageScale);
            _rawImage.texture = t;
        }

        IEnumerator RequestImage(string url)
        {
            UnityWebRequest request = UnityWebRequestTexture.GetTexture(url);
            yield return request.SendWebRequest();

            switch (request.result)
            {
                case UnityWebRequest.Result.InProgress:
                    break;
                case UnityWebRequest.Result.Success:
                    Texture tempTex = DownloadHandlerTexture.GetContent(request);
                    tempTex.filterMode = FilterMode.Bilinear;
                    texture = tempTex;
                    SetRawImage(texture);
                    FigmaUIImageData figmaUiImageData = new FigmaUIImageData(texture, imageScale, GetCurrentDateTime());
                    OnUiImageUpdated.Invoke(figmaUiImageData);
                    Debug.Log($"{_figmaSelectionName} has been updated.");
                    break;
                case UnityWebRequest.Result.ConnectionError:
                    OnUploadingFailed.Invoke();
                    Debug.LogError("Connection Error");
                    break;
                case UnityWebRequest.Result.ProtocolError:
                    OnUploadingFailed.Invoke();
                    Debug.LogError("Protocol Error");
                    break;
                case UnityWebRequest.Result.DataProcessingError:
                    OnUploadingFailed.Invoke();
                    Debug.LogError("Data Processing Error");
                    break;
                default:
                    throw new ArgumentOutOfRangeException();
            }
        }

        static Texture GetPreview(string assetName)
        {
            Texture preview = null;
#if UNITY_EDITOR
            string[] links = AssetDatabase.FindAssets(assetName, null);
            if (links != null)
            {
                string path = AssetDatabase.GUIDToAssetPath(links[0]);
                preview = (Texture)AssetDatabase.LoadAssetAtPath(path, typeof(Texture));
            }
#endif
            return preview;
        }

        public static string GetCurrentDateTime()
        {
            DateTime curDT = DateTime.Now;
            string strD = $"{curDT.Year}.{curDT.Month}.{curDT.Day}";
            string strT = $"{curDT.Hour}:{curDT.Minute}:{curDT.Second}";
            return $"{strD} {strT}";
        }
    }
}
What Is the Input System Event System?
In Unity, the Event System is like the brain that listens to input and figures out: “What UI element is being clicked, dragged, or selected?”
There are two major input systems in Unity:
- Legacy Input Manager: old; uses Input.GetAxis; works for keyboard, mouse, and basic joysticks
- Input System (new): more powerful; works with VR, gamepads, and XR controllers; supports action maps and modern devices
Why It Matters for VR and Meta Quest
On Meta Quest 3, you don’t have a mouse to click buttons or sliders. Instead, you use:
- Controller rays (laser pointers)
- Hand rays (if using hand tracking)
- Joysticks or button input
To make any of those work with Unity UI (like sliders), you need:
- An Event System
- The right Input Module that understands VR controller input
- A Canvas set to World Space
The New Input System + Event System Setup
1. Event System GameObject
This is a Unity system object that listens for input and routes it to UI.
You can create it by right-clicking in the Hierarchy → UI > Event System.
2. Input System UI Input Module
This is the component that tells Unity "Hey, I’m using the new Input System. Let’s hook up gamepads, XR controllers, or hand rays."
It replaces the old system that only worked with mouse + keyboard.
This module supports:
- XR controller buttons
- Triggers
- Thumbsticks
- Hand pinch input (if mapped)
3. Canvas Setup
For VR, your UI needs to exist in 3D space so users can aim at it. That means:
- Canvas.renderMode = World Space
- Set its scale to something small, like 0.002 (because 1 unit = 1 meter)
- Add a Graphic Raycaster (added automatically when you make a Canvas; for XR rays, swap it for a Tracked Device Graphic Raycaster as in Step 7)
This lets Unity shoot a ray at your UI and detect collisions.
How This All Comes Together in VR
1. You create UI like a Slider or Button on a World Space Canvas
2. You have an Event System with the Input System UI Input Module
3. You use a ray (from your controller or hand) that points at the Canvas
4. Unity sees that ray hit a Slider
5. When you pull the trigger or pinch, Unity calls OnValueChanged on the slider
This works just like a mouse click in 2D UI — but with VR input.
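The same hookup can be made in code instead of the Inspector. A minimal hedged sketch (the `SliderHookup` name and `slider` field are assumptions):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: subscribes to a slider's value changes at runtime.
public class SliderHookup : MonoBehaviour
{
    [SerializeField] Slider slider;

    void Start()
    {
        // Unity invokes this whenever the VR ray drags the slider handle,
        // exactly as it would for a mouse drag in 2D UI.
        slider.onValueChanged.AddListener(value => Debug.Log($"Slider: {value}"));
    }
}
```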
References
- VR Development for Oculus Quest: UI
- Creative Core: UI - Unity Learn
- How to Make a VR Game in Unity 2022 - PART 7 - User Interface
- How to Prototype AR and VR UI/UX with FIGMA and UNITY! (Tutorial) - YouTube
- Figma to AR/VR Finally! - Real Spatial Design Like Apple Vision Pro! | Figma + Bezel
- A content strategist's journey into social VR — Facebook Design
Meta Horizon OS UI set for developers
The Meta Horizon OS UI set Figma file contains an essential set of user components designed for immersive experiences.
For Unity implementation guides, refer to the Meta Interaction SDK (Unity). To start using these components in Unity, download the Interaction SDK Samples.
Added by : Eunjin Hong 2025/04/27