The Unity-PassthroughCameraAPISamples project helps Unity developers access Quest camera data via Unity's standard WebCamTexture and the Android Camera2 API. It provides:
Two helper classes: WebCamTextureManager, which handles permissions and initialization, and PassthroughCameraUtils, which retrieves camera metadata and converts 2D image coordinates into 3D space.
Five sample implementations, including a basic camera feed, brightness estimation, object detection with Unity Sentis, and shader-based effects.
This page covers:
A beginner-friendly, step-by-step guide to downloading and trying out the samples
A breakdown of the Brightness Estimation script
Background on how human vision and digital imaging/video standards weight color information
References:
https://www.youtube.com/watch?v=lhnuP6lJ_yY&ab_channel=DilmerValecillos
https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
STEP 1: Open Terminal and install Git
> Mac
Copy and paste the following into Terminal:
git --version
If it says something like git version 2.38.1, you're good!
If it says Git is not installed, just type:
xcode-select --install
That will install the developer tools including Git.
> Windows
Download Git here: https://git-scm.com/downloads
Install it with default settings. Once done, open Git Bash or Command Prompt again and check:
git --version
STEP 2: Install Git LFS (Large File Support)
> Mac
Copy and paste the following into Terminal:
brew install git-lfs
git lfs install
If you don’t have Homebrew installed, you can install it here: https://brew.sh
> Windows
Download and install Git LFS from here: https://git-lfs.github.com/
Then open Command Prompt and type:
git lfs install
STEP 3: Download Meta’s Project
In Terminal or Command Prompt, type:
git clone https://github.com/oculus-samples/Unity-PassthroughCameraApiSamples
This will create a folder called:
Unity-PassthroughCameraApiSamples
Inside that folder is the Unity project you’ll open next.
STEP 4: Open the Project in Unity
Open Unity Hub
Click "Open project"
Browse to the folder you just downloaded:
Unity-PassthroughCameraApiSamples
Select it and wait for it to load (it may take a minute!)
Once it's open:
Go to Assets > Samples > Scenes > BrightnessEstimation
Open the BrightnessEstimation.unity scene
BrightnessEstimation: Illustrates brightness estimation and how it can be used to adapt the experience to the user’s environment.
Here is a breakdown of the Brightness Estimation Manager script and how to use it.
What is this script for?
This script runs on a Meta Quest headset using Unity. It estimates the brightness level of the real world based on the image from the passthrough camera feed.
It gives two main things:
A live numeric brightness estimate
A UI text update to show brightness or permission status
Key Components Breakdown
1. Variables (top of script)
[SerializeField] private WebCamTextureManager m_webCamTextureManager;
Connects to the camera feed from Quest’s passthrough.
[SerializeField] private float m_refreshTime = 0.05f;
Updates brightness every 0.05 seconds (50ms).
[SerializeField][Range(1, 100)] private int m_bufferSize = 10;
Keeps the last 10 brightness values to smooth out sudden changes.
[SerializeField] private UnityEvent<float> m_onBrightnessChange;
Unity event you can hook into — e.g., change a material or light brightness when this changes.
[SerializeField] private UnityEngine.UI.Text m_debugger;
A text field in your scene where the brightness info will be displayed.
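Because m_onBrightnessChange is a UnityEvent&lt;float&gt;, any of your own scripts can react to brightness changes without touching the manager's code. Here is a minimal sketch of such a listener; the class name RoomLightMatcher and the 0-255 input range (derived from the averaging math shown later on this page) are assumptions, not part of the sample:

```csharp
using UnityEngine;

// Hypothetical listener: makes a scene light roughly track room brightness.
public class RoomLightMatcher : MonoBehaviour
{
    [SerializeField] private Light m_sceneLight;

    // Wire this method into the manager's On Brightness Change event
    // in the Inspector (+ button > drag this component > pick the method).
    public void OnBrightnessChanged(float brightness)
    {
        // The manager reports an average 8-bit luminance (0-255);
        // map it to a 0-1 light intensity.
        m_sceneLight.intensity = Mathf.Clamp01(brightness / 255f);
    }
}
```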
2. Update() Loop (runs every frame)
var webCamTexture = m_webCamTextureManager.WebCamTexture;
Gets the current camera feed.
If the camera feed is active:
if (!IsWaiting())
Checks if it's time to update based on m_refreshTime.
m_debugger.text = GetRoomAmbientLight(webCamTexture);
Calculates brightness and shows it on screen.
m_onBrightnessChange?.Invoke(GetGlobalBrigthnessLevel());
Notifies other scripts that the brightness changed.
If the camera isn't available:
m_debugger.text = "No permission granted.";
Displays a message when camera permission is missing.
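Putting the fragments above together, the Update() loop looks roughly like this (a sketch of the control flow described above; the sample's actual code may differ in details):

```csharp
private void Update()
{
    var webCamTexture = m_webCamTextureManager.WebCamTexture;
    if (webCamTexture != null)
    {
        // Only refresh once every m_refreshTime seconds.
        if (!IsWaiting())
        {
            m_debugger.text = GetRoomAmbientLight(webCamTexture);
            m_onBrightnessChange?.Invoke(GetGlobalBrigthnessLevel());
        }
    }
    else
    {
        m_debugger.text = "No permission granted.";
    }
}
```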
3. Brightness Calculation (GetRoomAmbientLight)
How it works: This is the main method for estimating brightness.
Steps:
Checks if camera is running.
Gets the width and height of the camera texture.
Pulls all pixel color values from the camera image into m_pixelsBuffer:
_ = webCamTexture.GetPixels32(m_pixelsBuffer);
float colorSum = 0;
for (...) {
colorSum += 0.2126 * R + 0.7152 * G + 0.0722 * B;
}
Calculates luminance of each pixel using human perception weighting:
Green is weighted most (because our eyes are more sensitive to it). Check the end of the page for more information.
var brightnessVals = Mathf.Floor(colorSum / (w * h));
Averages all pixel brightnesses → single brightness value
m_brightnessVals.Add(brightnessVals);
Adds it to a list of recent brightness values
if (m_brightnessVals.Count > m_bufferSize) ...
Keeps the list size at m_bufferSize (e.g., 10) by removing the oldest
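Assembled from the steps above, the method looks roughly like this. This is a sketch, not a copy of the sample's code: the exact return string, the isPlaying guard, and the buffer (re)allocation are assumptions, and m_pixelsBuffer / m_brightnessVals are assumed to be a Color32[] field and a List&lt;float&gt; field on the class:

```csharp
private string GetRoomAmbientLight(WebCamTexture webCamTexture)
{
    if (!webCamTexture.isPlaying) return "Camera not ready";

    var w = webCamTexture.width;
    var h = webCamTexture.height;
    if (m_pixelsBuffer == null || m_pixelsBuffer.Length != w * h)
    {
        m_pixelsBuffer = new Color32[w * h];
    }
    _ = webCamTexture.GetPixels32(m_pixelsBuffer);

    float colorSum = 0;
    for (var i = 0; i < m_pixelsBuffer.Length; i++)
    {
        // Rec. 709 luma weighting (see the end of this page).
        colorSum += 0.2126f * m_pixelsBuffer[i].r
                  + 0.7152f * m_pixelsBuffer[i].g
                  + 0.0722f * m_pixelsBuffer[i].b;
    }

    // Average over all pixels -> one 0-255 brightness value for this frame.
    var brightnessVals = Mathf.Floor(colorSum / (w * h));
    m_brightnessVals.Add(brightnessVals);
    if (m_brightnessVals.Count > m_bufferSize)
    {
        m_brightnessVals.RemoveAt(0); // drop the oldest sample
    }

    return $"Brightness: {GetGlobalBrigthnessLevel()}";
}
```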
4. Buffer Averaging (GetGlobalBrigthnessLevel)
This averages the last N brightness values so the result doesn’t bounce around too much.
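A sketch of that averaging (assuming m_brightnessVals is a List&lt;float&gt;, as above):

```csharp
private float GetGlobalBrigthnessLevel()
{
    if (m_brightnessVals.Count == 0) return 0f;

    var sum = 0f;
    foreach (var value in m_brightnessVals)
    {
        sum += value;
    }
    // Mean of the last N samples smooths out frame-to-frame jitter.
    return sum / m_brightnessVals.Count;
}
```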
5. Refresh Timer (IsWaiting())
Prevents brightness from updating too often. Only returns true when enough time has passed since the last update.
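A minimal sketch of such a throttle, assuming a m_lastUpdateTime field that the sample may implement differently:

```csharp
private float m_lastUpdateTime;

// Returns true ("still waiting") until m_refreshTime seconds have
// elapsed since the last update, then resets the timer.
private bool IsWaiting()
{
    if (Time.time - m_lastUpdateTime < m_refreshTime)
    {
        return true;
    }
    m_lastUpdateTime = Time.time;
    return false;
}
```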
With the Brightness Estimation Manager, you can:
Display real-time brightness (lux-like) in your UI
Use the m_onBrightnessChange event to change scene lighting or visual effects
Combine with object detection or AR overlays that respond to room lighting
Why is Green Weighted Most?
Human Eye Sensitivity
Our eyes have three types of color-sensitive cells (cones):
L-cones (long wavelengths): most sensitive to reddish light
M-cones (medium wavelengths): most sensitive to greenish light
S-cones (short wavelengths): most sensitive to bluish light
➡️ Our overall brightness sensitivity peaks in the green part of the spectrum, which is why green contributes most to our sense of brightness.
The Luminance Formula You See:
luminance = 0.2126 * R + 0.7152 * G + 0.0722 * B;
This formula comes from the Rec. 709 standard, which is used in:
HDTV
sRGB color space
Video encoding (like H.264)
Image brightness calculations
It reflects how much each RGB channel contributes to perceived brightness based on experiments with human vision.
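You can see the effect of these weights with a quick calculation (plain C#, no Unity needed). At the same 8-bit channel value, pure green reads far brighter than pure blue:

```csharp
// Rec. 709 luma of an 8-bit RGB color.
static float Luma(byte r, byte g, byte b) =>
    0.2126f * r + 0.7152f * g + 0.0722f * b;

// Luma(0, 255, 0)  -> ~182.4  (green)
// Luma(255, 0, 0)  -> ~54.2   (red)
// Luma(0, 0, 255)  -> ~18.4   (blue)
```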
Reference Sources
ITU-R BT.709 (Rec. 709)
The official standard defining this formula.
https://www.itu.int/rec/R-REC-BT.709-6-201506-I/en
sRGB Specification (used in most screens)
Based on the Rec. 709 color space.
"Digital Image Processing" by Gonzalez & Woods
Common academic reference explaining luminance and perceptual weighting.
Added by : Eunjin Hong 2025/04/06