Eunjin Hong's Journal

PROPOSALS

Project 1 Proposal 

Presentation for Project 1 Proposal

End Presentation for Project 1


Project 2 Proposal

Presentation for Project 2 Proposal <ADD LINK>

Poster <ADD LINK>

In-class Activity <ADD LINK>

Public Demo <ADD LINK>

ALL OTHER WIKI CONTRIBUTIONS

CONTRIBUTION 1 : Replaced a link with another link on the VR in the performing arts page.

CONTRIBUTION 2 : Added Artivive as a platform-for-creatives example on the VR in the performing arts page.

CONTRIBUTION 3 : Added Cinema4D to VR modeling software page.

CONTRIBUTION 4 : Added AR for real world sensing page.

CONTRIBUTION 5 : Added Comparison page. (OpenXR VS Meta XR SDK Package Comparison)

CONTRIBUTION 6 : Added Capture Passthrough Camera Frames in Unity page.

CONTRIBUTION 7 : Added Passthrough Brightness Controller Tutorial page.

CONTRIBUTION 8 : Added the Passthrough Brightness control by creating lighting spheres page. (Step-by-step tutorial, resources, how to make the brightness layer look more natural, what 0.5f converts to in lux light intensity, and the user experience of lighting modification using lighting spheres in AR.)

CONTRIBUTION 9 : Added URP vs. Standard Render Pipeline to the Comparison page.



LEARNING GOALS 

| Before | After | Goal |
| ------ | ----- | ---- |
| 1 | 4 | Goal 1: articulate AR/VR visualization software tool goals, requirements, and capabilities |
| 3 | 4 | Goal 2: construct meaningful evaluation strategies for software libraries, frameworks, and applications; strategies include surveys, interviews, comparative use, case studies, and web research |
| 2 | 4 | Goal 3: execute tool evaluation strategies |
| 1 | 3 | Goal 4: build visualization software packages |
| 1 | 3 | Goal 5: comparatively analyze software tools based on evaluation |
| 1 | 4 | Goal 6: be familiar with a number of AR/VR software tools and hardware |
| 1 | 4 | Goal 7: think critically about software |
| 4 | 5 | Goal 8: communicate ideas more clearly |
| 1 | 4 | Goal 9: know when/where VR & AR data visualization would be effective |

HOURS SUMMARY

Total: 76.5 hours (as of 3/19)

HOURS journal

3/19/25 -   1.5 hours

3/17/25 -   2.5 hours


Project Evaluation

I can use...

1. Understanding Environments and Work Practices (UWP) 

2. Evaluating User Performance (UP)

3. Evaluating User Experience (UE)

4. Evaluating Visualization Algorithms (VA) 

3/16/25 -   2 hours

3/14/25 -   1 hour

3/12/25 -   5.5 hours

3/10/25 -   1 hour


3/9/25 -   4 hours


2/27/25 -   7 hours


2/27/25 -   3 hours


2/26/25 -   2 hours


2/24/25 -   3.5 hours


2/21/25 -   4.5 hours

2/20/25 -   3.5 hours


2/19/25 -  2 hours


2/18/25 -  5 hours


2/17/25 -  5 hours

Found out that it reported the sample pixel color and light intensity only once, even though the frame capture runs every second.

This could be happening because the Texture2D object is not refreshing as expected, or because of how the pixel data is being read back into and accessed from the texture. A sketch of a capture loop that re-reads the pixels each second is below.
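
A minimal sketch of how the once-a-second capture loop could re-read the pixels so the Texture2D actually refreshes. The class and field names here (FrameSampler, captureCamera, renderTex, frameTex) are illustrative, not the actual LightEstimator.cs.

using System.Collections;
using UnityEngine;

// Illustrative sketch, not the actual LightEstimator.cs: renders the camera into a
// RenderTexture once per second, re-reads the pixels into the same Texture2D, and
// calls Apply() so the CPU-side pixel data is actually refreshed before sampling.
public class FrameSampler : MonoBehaviour
{
    public Camera captureCamera;   // assumed: the camera whose view is being sampled

    private RenderTexture renderTex;
    private Texture2D frameTex;

    void Start()
    {
        renderTex = new RenderTexture(256, 256, 24);
        frameTex = new Texture2D(256, 256, TextureFormat.RGB24, false);
        StartCoroutine(SampleEverySecond());
    }

    IEnumerator SampleEverySecond()
    {
        while (true)
        {
            captureCamera.targetTexture = renderTex;
            captureCamera.Render();

            RenderTexture prev = RenderTexture.active;
            RenderTexture.active = renderTex;
            frameTex.ReadPixels(new Rect(0, 0, renderTex.width, renderTex.height), 0, 0);
            frameTex.Apply();            // without Apply() the texture keeps stale pixel data
            RenderTexture.active = prev; // unbind before the render texture is touched again

            captureCamera.targetTexture = null;

            Color sample = frameTex.GetPixel(128, 128);
            Debug.Log($"Sample pixel: {sample}, grayscale intensity: {sample.grayscale}");

            yield return new WaitForSeconds(1f);
        }
    }
}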


Also, the Quest headset kept giving me errors saying there was not enough memory to run my project???



--> Releasing render texture that is set to be RenderTexture.active!
    UnityEngine.RenderTexture:Release ()
    LightEstimator:CaptureCameraFrame (UnityEngine.Camera) (at Assets/LightEstimator.cs:87)
    LightEstimator/<CaptureLightIntensity>d__4:MoveNext () (at Assets/LightEstimator.cs:36)
    UnityEngine.SetupCoroutine:InvokeMoveNext (System.Collections.IEnumerator,intptr)

The order of operations was wrong: the render texture was being released while it was still bound as RenderTexture.active.
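
The warning points at the cause: Release() was called while the texture was still the active render texture. A sketch of a teardown order that avoids it, reusing the illustrative names from the sketch above:

// Illustrative ordering only: unbind the render texture before releasing it,
// otherwise Unity logs "Releasing render texture that is set to be RenderTexture.active!".
RenderTexture.active = null;        // or restore the previously active render texture
captureCamera.targetTexture = null;
renderTex.Release();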


(Also, after switching to the Standard Render Pipeline, the box turned pink.)
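
In Unity, a pink object usually means its material's shader is not working under the active render pipeline, e.g. a URP/Lit material left on the object after moving back to the Built-in pipeline. The usual fix is reassigning the material's shader in the editor; a hypothetical script-side equivalent would look like this:

using UnityEngine;

// Hypothetical fix-up: swap the object's material to a Built-in pipeline shader
// so it stops rendering pink after the switch away from URP.
public class FixPinkMaterial : MonoBehaviour
{
    void Start()
    {
        GetComponent<Renderer>().material.shader = Shader.Find("Standard");
    }
}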

2/16/25 -  2 hours

2/12/25 -  7 hours

Journal activities are explicitly and clearly related to course deliverables - 5

deliverables are described and attributed in wiki - 3 (new deliverables still need to be organized and uploaded)

report states total amount of time - 5

total time is appropriate - 5

Enable Passthrough API in Meta Quest
Capture Passthrough Camera Frames in Unity
(Before switching to Standard Render Pipeline)

2/10/25 -  4 hours

2/09/25 -  2 hours

2/05/25 -  2 hours

2/03/25 -  5.5 hours

1. Tool for detecting lighting conditions and recommending optimized lighting; Part 1: detect light information and display a UI over passthrough (a rough sketch of this step follows below this list).

2. Tool for detecting lighting conditions and recommending optimized lighting; Part 2: recommend lighting and simulate it.
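
A minimal sketch of the Part 1 "detect light information and display it" step, assuming a world-space UI Text already exists over passthrough; the class name LightInfoDisplay and the readout field are illustrative. Note that averaging camera-frame pixel values gives a relative 0-1 brightness, not a calibrated lux reading.

using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: averages the relative luminance of a captured frame
// and writes the result to a UI Text shown over passthrough.
public class LightInfoDisplay : MonoBehaviour
{
    public Text readout;   // assumed: a Text element on a world-space canvas

    public static float AverageLuminance(Texture2D frame)
    {
        Color[] pixels = frame.GetPixels();
        float sum = 0f;
        foreach (Color c in pixels)
            sum += 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;  // Rec. 709 luma weights
        return sum / pixels.Length;   // 0 (dark) .. 1 (bright), relative, not lux
    }

    public void ShowReading(Texture2D frame)
    {
        readout.text = $"Brightness: {AverageLuminance(frame):F2}";
    }
}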

Three things you will do during the project:
1. Visualize heart rate data in an AR environment (in real time if possible).
2. Design different haptic patterns for different heart rates.
3. Map haptic patterns to heart rate data and display them (a rough mapping sketch follows below this list).
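
A hedged sketch of the item 3 heart-rate-to-haptics mapping, assuming the Meta XR / Oculus Integration OVRInput API is available; the BPM bands and vibration values are made up for illustration.

using UnityEngine;

// Illustrative mapping: higher (hypothetical) heart rate values get a stronger,
// higher-frequency vibration on the right Touch controller.
public class HeartRateHaptics : MonoBehaviour
{
    public void PulseForHeartRate(float bpm)
    {
        // Made-up bands purely for illustration.
        float amplitude = bpm < 80f ? 0.2f : bpm < 120f ? 0.5f : 0.9f;
        float frequency = bpm < 80f ? 0.3f : bpm < 120f ? 0.6f : 1.0f;

        // Per Meta's docs the vibration is short-lived, so this would typically be
        // re-called regularly (e.g. from Update or a coroutine) while data streams in.
        OVRInput.SetControllerVibration(frequency, amplitude, OVRInput.Controller.RTouch);
    }

    public void StopPulse()
    {
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}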


2/01/25 -  1 hour

Google Earth VR

Home on campus

Undergrad-gate

Undergrad

Sci-Li

Google Earth Web

Home on campus

Undergrad-gate

Undergrad

Sci-Li

1/29/25 -  5 hours

1/27/25 -  1 hour

1/26/25 - 2 hours

1/24/25 - 2 hours