GOALS
before | after | goal
------ | ----- | ----
1 | 5 | Goal 0: example goal showing "novice" score before and "expert mentor" after
2 | 4 | Goal 1: articulate VR visualization software tool goals, requirements, and capabilities;
3 | 4 | Goal 2: construct meaningful evaluation strategies for software libraries, frameworks, and applications; strategies include surveys, interviews, comparative use, case studies, and web research;
2 | 4 | Goal 3: execute tool evaluation strategies;
3 | 5 | Goal 4: build visualization software packages;
3 | 4 | Goal 5: comparatively analyze software tools based on evaluation;
3 | 4 | Goal 6: be familiar with a number of VR software tools and hardware;
2 | 4 | Goal 7: think critically about software;
3 | 4 | Goal 8: communicate ideas more clearly;
2 | 4 | Goal 9: be familiar with different modes of data visualization in VR;
Project 1 Proposal: Project Proposal 1 Link (this proposal can also be found in my journal entry for 2/12/23)
Update Presentation for Project 1: Update Presentation for Project 1 Link
End Presentation for Project 1: Presentation for Project 1 Link
Git Repository for Project 1: ashleykwon/PrimateDataVisualization: Project for Brown University's CSCI 1951T (Spring 2023) (github.com)
APK file for Project 1: APK file for Project 1
In-class activity for Project 1: Ashley's in-class activity
In-class activity results for Project 1: https://docs.google.com/spreadsheets/d/1sxgy0RRHWNpc-l6kDxvIfzjMTwgDW8m3NTL88gZWi-M/edit?usp=sharing
Project 2 Proposal Project 2 Proposal Link
Update Presentation for Project 2: Update Presentation for Project 2 Link
End Presentation for Project 2: Presentation for Project 2 Link
In-class activity Ashley's in-class activity (Project 2)
Git Repository for Project 2: ashleykwon/VisionSimulator (github.com)
APK file for Project 2: APK file for Project 2
In-class activity results for Project 2: https://docs.google.com/spreadsheets/d/19QxQeLy5TBIKIO-RwmbWrKQafzHEdG4bEtHWQCv1BSI/edit?usp=sharing
Poster: Poster Link
Flash Talk: Flash Talk Link
Public Demo: Public Demo Link
CONTRIBUTION 1 [how to load a 360 image into a Unity VR project for Oculus Quest] Visualizing a 360 degree image in VR
CONTRIBUTION 2 [how to debug a Unity project using Android Debug Bridge] Debugging with Android Debug Bridge
CONTRIBUTION 3 [how to load JSON files in a Unity project] Loading JSON Files in Unity
CONTRIBUTION 4 [how to make a custom terrain in Unity] Making a custom terrain in Unity
CONTRIBUTION 5 [how to convert a CSV file into a JSON file in Python] Converting CSV to JSON in Python
CONTRIBUTION 6 [how to implement passthrough in Unity] Unity Oculus Integration Passthrough
CONTRIBUTION 7 [how to build 2D UI for Oculus] UI for Oculus
CONTRIBUTION 8 [how to write a custom shader program in Unity] Changing 3D object colors with Unity Shader
Total: 143 hours
1/29/23 - 4 Hours
Created this journal page
Joined the Slack channel and introduced myself
Read Kenny Gruchalla's bio and wrote questions to him
Reviewed the course homepage
Completed one of the 10-minute changes (the one that says COMPLETED)
Here are some changes I suggest for the VR Software Wiki page:
Changes that require ~10 minutes to complete
On the Home page, in the Where do I start? section, there should be another sub-section that is dedicated solely to AR. There are some devices that do not support both AR and VR. So there should be a separate section, possibly titled "I want to learn more about AR and its differences with VR", where existing tabs such as "AR vs VR" and "List of AR software" should belong.
COMPLETED - In the section titled What is VR for?, some subpages such as VR in the Military and VR in Engineering lack outlines. These outlines should be added as in other subpages to provide summaries of VR technologies in different fields.
In the Contents section on the Home page, "VR scientific visualization software" should be updated to "VR Visualization Software" to match the title on the menu on the left-hand side and to prevent confusion about whether the section is only about scientific visualization or visualization in general.
Changes that require ~1 hour to complete
The VR Hardware page should be updated to include information about AR/VR devices such as Meta Quest Pro and Magic Leap 2 that were recently released
The Related Technology page, especially in its sub-pages that explain differences between AR and VR, can mention passthrough, which allows users to see their surroundings as a video while wearing a VR headset. This technology allows users to implement AR using a VR headset.
The VR Development Software page, especially its section that describes Unity 3D, needs more detailed explanations about supported OS's and VR hardware; some Unity packages, which need to be installed to build and run VR games, cannot be installed on certain computers such as an Intel-based MacBook Pro.
Changes that require ~10 hours to complete
On the Course Home Page, Quest 2 Practice Tutorial should include a section about common errors that students encountered while trying to run Google Earth VR on Oculus and how to debug them.
The Student Research section should be updated to include more recent research projects, such as those that students did in 2022.
Each page should have information about when it was created and last updated since AR/VR technologies develop quickly, and after a short amount of time, contents on a page may not be relevant.
Collaborative VR software: Masterpiece VR
Masterpiece VR allows users to sculpt and paint objects with controllers while wearing a VR headset
Based on the product's demo video, users also seem to be able to copy-paste ready-made objects that exist by default within the app and use them as parts of their sculptures
The app's functionality called "Motion" allows users to adjust the position of an individual part of a figure they created such as the figure's arm
Users can also invite other users to a "virtual room" to view and interact with their artwork
Oculus website: https://www.oculus.com/experiences/rift/361221470668584/
Product website: https://masterpiecestudio.com/product
Read about Passthrough over Link for the new Meta Quest Pro headset, which lets users see their surroundings as a video in the headset while running a Unity project on a desktop with passthrough-enabled settings
1/31/23 - 5 Hours
2-5 potential pieces of software to explore and evaluate for my research project
Unity: Researched how RenderTexture and Texture2D objects work, and how I can use Graphics.Blit to render content onto a camera frame
Unreal Engine: Researched how it differs from Unity and what it's useful for
Flow Immersive: Allows data visualization with maps in VR
2/1/23 - 3 Hours
Questions
What qualifies as scientific data? What academic disciplines should the data be related to?
Set up my Quest 2 headset
I can't see without my glasses (I don't wear contacts), and it was somewhat inconvenient to wear the headset on top of my glasses...
I already had the desktop and the smartphone versions of the Oculus app. I added my new headset to those apps.
Downloaded SteamVR and tried running Google Earth VR. "Visited" a few landmarks and CIT while running the game
Project ideas
1. Choose a few species of primates to create data visualization(s) about primate habitat distribution with functionalities that allow users to sort visualized contents based on species, diets, body type, and habitat (use Unity (or Unreal Engine) and Flow Immersive)
Use this dataset: https://zenodo.org/record/1342459#.Y9slPi-B1QI
2. Create a visualization with a timeline and a map (or some other elements that can represent countries) that shows how the amount of carbon emissions in each country changed over time (use Unity (or Unreal Engine), GeoJson, and Flow Immersive)
Use this dataset: https://www.statista.com/statistics/264699/worldwide-co2-emissions/
3. Create a visualization with a timeline and a map of Providence that shows the average number of tweets from different areas in the city at a selected time period (use Unity, GeoJson, and Flow Immersive)
Use this dataset: https://www.microsoft.com/en-us/download/details.aspx?id=57387
2/6/23 - 5 hours
Google Earth VR
Neighborhood I grew up in
Brown apartment (this image needs an update on Google's database)
Place that's not famous but significant to me
Google Earth VR (Video)
If the video below doesn't display, please use this link to view it: https://drive.google.com/file/d/1tqW6mlFLsIoG8-NBSmGjLHBSMmb4tF8C/view?usp=sharing
Google Earth Web
Neighborhood I grew up in
Brown apartment (this image needs an update on Google's database)
Place that's not famous but significant to me
Google Earth Web (Video)
If the video below doesn't display, please use this link to view it: https://drive.google.com/file/d/150oBpspzZUwTg5IF2K-SoQx2t7qbXv6_/view?usp=sharing
Project ideas
1. Choose a few species of primates to create data visualization(s) about primate habitat distribution with functionalities that allow users to sort visualized contents based on species, diets, body mass, and habitat (use Unity (or Unreal Engine) and Flow Immersive)
3 things I'll do during the project
Plot primate habitat distributions based on species, diet, and body mass on a map
Visualize each habitat type in VR
Allow users to compare their plots
Class activity
Students explore different types of habitats together
Potential deliverables
Add in the VR Visualization Software section an entry explaining functionalities of Flow Immersive
Add in the VR Development Software section an entry explaining how to load GeoJson in Unity (possibly by using Unity libraries like Unity Web Request)
2. Create a visualization with a timeline and a map (or some other elements that can represent countries) that shows how the amount of carbon emissions in each country changed over time (use Unity (or Unreal Engine), GeoJson, and Flow Immersive)
3 things I'll do during the project
Generate a 2D map in which each country has a different color based on the amount of carbon emission. These colors change as the user interacts with the timeline
Allow users to compare their maps, each showing the amounts of emission at different time periods
Get 360 degree images of chosen locations at different time periods and visualize their changes in VR with an interactive timeline (e.g., an area that is a forest in 2010 becomes a factory in 2020)
Class activity
Students can visualize changes at the chosen location mentioned above together
Potential deliverables
Add in the VR Development Software section an entry about building interactive UI elements with Unity
Add in the VR Development Software section an entry where I compare Unity and Unreal Engine, especially in terms of learnability and visualization quality
3. Create visualizations of different 3D color spaces (e.g., CIELAB, HSV, RGB) and see whether the distance between two chosen colors in those spaces matches perceptual differences between those colors
3 things I'll do during the project
Display on a 2D UI a list of color spaces the user can explore, and when the user chooses a color space, display the color space in VR
Allow the user to pan, zoom, and rotate the chosen color space in VR
Write a program that randomly selects a color in the chosen color space and asks the user to find a color within the same color space that seems to have the largest contrast (in terms of hue, lightness, etc.) with the randomly selected color. Once the user chooses a color, the program can show the difference between the user-chosen color and the color that actually has the largest distance from the randomly selected color.
Class activity
Students can try choosing the largest contrasting color mentioned above as a group and learn about the difference between color spaces that are perceptually uniform (the perceptual distance between two colors matches their numerical distance) and those that are not
Potential deliverables
Add in the VR Visualization Software section a tutorial about integrating VR Point Cloud Editor with custom UI elements (e.g., a display showing the color of a chosen point)
Add in the VR Modeling Software section an entry explaining how to generate coordinates and other characteristics (e.g., colors) for points in a custom point cloud
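The "largest contrast" search from the third idea can be sketched in Python, assuming colors are points in a 3D space compared with Euclidean distance (which, as noted above, only matches perception in uniform spaces like CIELAB; the RGB palette here is just a toy example):

```python
import math
import random

def distance(c1, c2):
    # Euclidean distance between two colors treated as 3D points
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

def farthest_color(target, palette):
    # The color in the palette with the largest distance from the target;
    # this is what the program would compare the user's guess against
    return max(palette, key=lambda c: distance(target, c))

# Toy palette: the 8 corners of the RGB cube
palette = [(r, g, b) for r in (0, 255) for g in (0, 255) for b in (0, 255)]
target = random.choice(palette)
best = farthest_color(target, palette)
print(target, best, distance(target, best))
```

In a non-uniform space the numerically farthest color may not look the most different, which is exactly the contrast the class activity is meant to surface.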
Software evaluation metrics brainstorming
What is the highest image resolution that the software can visualize without lagging?
How easy is this software to learn?
How many tutorials about this software can I find on YouTube?
When was the most recent version of this software released?
How long has this software existed?
Which operating systems is this software compatible with?
Potential software
VR Point Cloud Editor: for visualizing a 3D color space with each color as a point (many color spaces only use integer values to represent colors)
3D Slicer
Misc.
Installed DinoVR
2/7/23 - 3 hours
Took screenshots in DinoVR
Commented on another student's project idea on activity board
2/8/23 - 2 hours
Software to take a look at:
VR collaboration plug-in for Unity: https://www.photonengine.com/pun
Project plan: Primate habitat visualization
2/9 - 2/14
Make wireframes of all displays to be included in the visualization
Make a document that explains functionalities of all UI elements that will be added to the app
Make a document that contains all software I'll need to implement the displays and functionalities described above
2/15 - 2/16
Clean the primate dataset I found on 2/1 and convert it into a csv file so that I can load it into Unity using Unity Web Request
Modify future plans if necessary
2/17 - 2/23
Make a sample Unity project where I visualize a 3D globe in VR and allow users to rotate it and take a look at it from different angles
Plot on the globe one of the three types of data I want to visualize (e.g., primate population distribution by body mass)
Implement a collaborative viewing mode in the sample project
Make a 2D version of the globe described above using JavaScript
2/24 - 2/28
Using the sample I made for the previous milestone, build a user interface in a new Unity project where users can visualize different types of data (including primate population distributions by body mass, diet, and habitat type) with a UI element such as a dropdown menu
This interface will also have the collaborative functionality implemented for the sample project
3/1 - 3/2
Debug any issues that the visualization model from the previous milestone has
Modify future plans if necessary
3/3 - 3/7
Find 360 degree images (preferably copyright-free ones) of different primate habitat types
Link the 360 degree images found above to specific locations in each map and allow users to load a 360 image upon interacting with a UI element at a chosen location
Add a functionality that allows users to explore each habitat together
Make a 2D viewing environment for each of the 360 degree images described above
3/8 - 3/9
Debug any issues that the visualization model from the previous milestone has
Modify future plans if necessary
3/10 - 3/13
Test the model made from all the previous milestones with 1-2 other users and see if there's anything I need to change
Implement the changes suggested from the testing sessions described above
In-class activity: Ask students to compare the VR visualizations with their 2D versions. Then, ask them to discuss pros/cons of each version in comparison with each other.
2/12/23 - 3 hours
Please note that I ended up completely changing the modes of visualization I'm trying to implement in this project
Project plan (updated)
In-class activity: Compare desktop vs. VR versions of the app and evaluate the following aspects:
Which version would you use to understand characteristics of different primate habitats?
Which version would you use to visualize and compare primate body mass?
Which version would you use to visualize diet distributions of primates that live in a certain habitat?
Wiki contributions/deliverables: Tutorials about VR Development Software:
How to use Unity Web Request (C#) and FastAPI (Python) to display different data upon each UI interaction
How to make interactive objects for Oculus controllers in Unity
How to load a 360 degree panoramic image in Unity
Collaborative functionalities:
Users can invite other users to collaboratively explore a habitat
In a primate body mass visualization, each circle's size corresponds to the real-life body mass of a primate. Users move each of the circles and place them at different places in the habitat. When a user moves a circle, all other users in the habitat can see the change in the body mass visualization.
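One plausible way to size each circle from body mass (a sketch of an assumption, not necessarily what the app does): scale with the cube root of mass, so that volume rather than radius tracks the real-life body mass:

```python
def sphere_scale(body_mass_kg, reference_mass_kg=1.0):
    """Map a primate's body mass to a uniform scale factor for its circle/sphere.

    Assuming roughly equal density, volume grows linearly with mass,
    so the radius (scale) grows with the cube root of mass.
    """
    return (body_mass_kg / reference_mass_kg) ** (1.0 / 3.0)

print(sphere_scale(8.0))  # a primate 8x the reference mass gets 2x the radius
```

Linear radius scaling would exaggerate heavy species, since a sphere's volume grows with the cube of its radius.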
Scientific Data
Data to use
CSV files showing habitat types, average body masses, and diets of 500+ primate species
Along with information about the three categories described above, the files also contain information about family, genus, common name, and species of primates
Software to process data and render data visualization
I'll first clean the dataset using Numbers (e.g., by deleting species whose data are missing)
I'll merge the CSV files and make 2D visualizations of data using Tableau
I'll use Unity Web Request (C#) and FastAPI (Python) with a server running on my desktop (or somewhere else) to fetch data upon user request
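The filtering logic such a server would perform can be sketched in stdlib-only Python. The column names (`species`, `habitat`, `diet`, `body_mass`) are assumptions about the dataset; in the real project this function would sit behind a FastAPI endpoint and be called from Unity via Unity Web Request:

```python
import csv
import io
import json

def query_primates(csv_text, habitat, mode):
    """Filter primate rows by habitat and return the requested field
    ('body_mass' or 'diet') as JSON for the Unity client.

    Column names here are assumed, not taken from the actual dataset.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = [r for r in reader if r["habitat"] == habitat]
    payload = [{"species": r["species"], mode: r[mode]} for r in rows]
    return json.dumps(payload)

# Toy data using the assumed column names
sample = "species,habitat,diet,body_mass\nA,forest,fruit,3.2\nB,savanna,leaves,5.1\n"
print(query_primates(sample, "forest", "diet"))
```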
Schedule: Modify future plans if necessary at each milestone
2/14
Finish wireframes of all displays to be included in the visualization
Make a document that explains functionalities of all UI elements that will be added to the app
Make a document of all software that I’ll need to build the app
2/16
Clean the primate dataset I currently have and extract data about the following:
Number of primate species that live in 3 selected habitats
Diet type distribution (in percentage) in each of the 3 habitats
Average body mass of selected primate species that live in each of the 3 habitats
2/23
Find 360 degree panoramic images representing each of the 3 habitats
Make a Unity project where I can do the following:
Select one of the 3 habitats on the home screen
Visualize the selected habitat as a 360 degree panoramic image
2/28
Write the VR Software Wiki entry about loading 360 degree panoramic images into Unity and debugging a Unity project using ADB
Superimpose a user interface on each of the 360 degree images and make sure this interface is visible regardless of the user’s head position
Using Python and C#, write programs that load csv data into Unity when users specify a habitat and visualization mode (body mass or diet) in the user interface
Link the "Back to Home" button in the interface to the starting screen of the app
3/2
Debug any issues in the Unity project
Write the VR Software Wiki entry about how to load JSON data upon a user command
3/7
Create the Savanna habitat with Unity assets
Make separate viewing spaces for data
Modify visualization mode so that there's only one visualization that shows primate body mass and diet type together
Find a similar software that visualizes habitats in VR
3/9
Debug any issues in the Unity project
Do the in-class activity described above
3/16
Implement some of the suggested changes during in-class activity (sphere visualization in habitat, axis labels)
2/13/23 - 1 hour
Modified project plan again
Project plan evaluation
The proposed project clearly identifies deliverable additions to our VR Software Wiki
4 (Loading a 360 degree image into Unity may turn out to be a bit too simple. This may be modified later.)
The proposed project involves collaboration in VR
5
The proposed project involves large scientific data visualization along the lines of the "Scientific Data" wiki page and identifies the specific data type and software that it will use
4 (I need to see if multiple users of my Unity app can connect to a server running on my desktop when fetching specific data)
The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class
4 (I may have too many things to do in the last week of February. I'll need to adjust the schedule accordingly)
The proposed project explicitly evaluates VR software, preferably in comparison to related software
5
The proposed project includes an in-class activity, which can be formative (early in the project) or evaluative (later in the project)
5
The proposed project has resources available with sufficient documentation
3 (I still need to figure out how VR collaboration involving Unity's 3D objects works)
2/15/23 - 2 hours
Cleaned data in csv files so that species that have no data or do not live in any of the 3 chosen habitats (forest, savanna, wetland) are deleted
Modified the wireframe for diet visualization
All materials I have regarding Project 1 can be found here: https://drive.google.com/drive/folders/1e9hgKex9VMfXQ4uRkRhXcFZPF2d6HSwz?usp=sharing
Journal evaluation:
Journal activities are explicitly and clearly related to course deliverables: 3
deliverables are described and attributed in wiki: 3
report states total amount of time: 5
total time is appropriate: 4
2/23/23 - 5 Hours
Still looking for suitable 360 images (preferably in JPEG format) to use
This website seems to have a lot of free 360 degree images: 360° Hdri Panorama Overview | Openfootage
Followed the tutorial below and loaded a 360 degree image into a sample project in Unity: How To Make a 360 Image Into a Skybox In Unity - YouTube
Made a UI for Oculus by following this tutorial: Create a Canvas Pointer in Unity for Oculus - Bing video
Screenshot of the UI
Linked a button in the home screen to a scene where a 360 degree image of a forest is loaded. Then, I added a button in the forest scene that takes the user back to the home screen. The UIs look like these in VR (blue lines are from my controller).
I referenced this tutorial for scene management: Mini Unity Tutorial - How To Use Scene Manager - Beginners Tutorial - YouTube
2/26/23 - 4 hours
Modified schedule slightly (see the red highlighted text in the schedule from 2/12), because loading a 360 degree image into Unity turned out to be easier than I thought
Completed the 360 degree image + UI integration in my VR app for all 3 images
2/27/23 - 4 hours
Made presentation for tomorrow
Implemented a pipeline in which a Python program extracts specific data from csv files on my computer upon user command in my VR app and sends the data back to the app
Pushed the current version of my code to this repository: ashleykwon/PrimateDataVisualization: Project for Brown University's CSCI 1951T (Spring 2023) (github.com)
I'm still trying to figure out whether I need to modify my visualization type to be better suited for VR
Will decide what type of shape to spawn once I figure out suitable visualization types
3/3/23 - 5 hours
Decided to update my VR app's UI. Here are my new wireframes and the functionalities I'm trying to implement
I wasn't sure how to establish multi-user connection to a server running on my desktop computer. So I decided to save JSON files along with my app data and load them using C# code, instead of loading them from a server.
I wrote a program that does the tasks described above, but it's still buggy, because I can't see spheres being spawned when I run the app in my headset.
3/5/23 - 3 hours
Still trying to fix the data point visibility issue.
3/6/23 - 6 hours
Figured out what was causing the issue I faced on 3/3 and 3/5 where I couldn't see data points generated as spheres in Unity. I had to move my JSON files into the Assets/Resources/ folder and use Resources.Load to load them.
Will write a Wiki contribution about how to load JSON files into a Unity project
3/7/23 - 6 hours
Added a script to the right joystick that moves the player.
Added a script in each visualization scene that spawns spheres with different colors whenever the user accesses the scene
3/8/23 - 7 hours
Combined the 2 modes of visualization, each for diet and body mass, into one visualization. I initially thought of visualizing body mass as spheres lined up horizontally, but when I tested the visualization in my headset, it took too long to get to the last sphere, especially when there were 300+ spheres in the scene
I decided to stack spheres into 3-6 separate vertical lines with each line representing a diet group
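The stacked layout can be sketched as follows (the spacing values and dictionary shape are made up for illustration; in Unity each returned position would become a sphere's transform position):

```python
def column_layout(diet_groups, spacing_x=1.5, spacing_y=0.5):
    """Assign each species' sphere a position: one vertical column per
    diet group, with spheres stacked upward within a column.

    diet_groups maps a diet name to a list of species in that group.
    """
    positions = {}
    for col, (diet, species_list) in enumerate(sorted(diet_groups.items())):
        for row, species in enumerate(species_list):
            positions[species] = (col * spacing_x, row * spacing_y, 0.0)
    return positions

groups = {"fruit": ["A", "B"], "leaves": ["C"]}
print(column_layout(groups))
```

Stacking vertically keeps the whole visualization within view, which avoids the long traversal problem the horizontal line-up had in the headset.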
3/10/23 - 2 hours
Wrote wiki contributions about debugging a Unity project and loading JSON files in Unity
3/15/23 - 5 hours
Filled out self evaluations
Please refer to this repository for all programs I wrote + scene setups: ashleykwon/PrimateDataVisualization: Project for Brown University's CSCI 1951T (Spring 2023) (github.com)
Please note that you'll need to download TerrainTexturesPackFree from the Asset Store and load it into the project for the Savanna visualization to work
Made the following changes to my project based on suggestions from my in-class activity's survey results:
Fixed the camera clipping issue in savanna
Added labels to the x-axes of all visualizations
Changed the mode of visualization for forest:
I combined the habitat visualization with the sphere visualization as in the photo below:
Here are some videos from my final product, which uses the following three modes of habitat + primate data visualization:
Forest: data points integrated into a scene with 360 degree images of a forest
Savanna: data points visualized in a separate scene + habitat made with Unity assets
Wetland: data points visualized in a separate scene + habitat made with a 360 degree image
Main interface
Savanna
Wetland
3/16/23 - 3 hours
Added 2 more wiki contributions about making a custom terrain in Unity and converting CSV files into JSON files in Python
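The CSV-to-JSON conversion can be done with Python's standard library alone; a minimal sketch (file paths are placeholders):

```python
import csv
import json

def csv_to_json(csv_path, json_path):
    """Convert a CSV file into a JSON array of row objects keyed by the header.

    Returns the number of rows converted.
    """
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)
    return len(rows)
```

Note that `csv.DictReader` reads every value as a string, so numeric columns like body mass would need an explicit conversion if the consumer expects numbers.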
3/20/23 - 0.5 hours
Project 2 idea: Visualization of color spaces
Although there are many different color spaces, including perceptually uniform ones (where the distance between two colors in the space matches the difference human eyes perceive between them), color values need to be converted to RGB values to be shown on a display (e.g., in Unity apps)
In this project, I aim to visualize different color spaces such as CIELAB, HSI, and HSV as 3D point clouds and see how a color value in one space maps to the RGB space
Users will be able to choose a color value in one space and visualize what it looks like when it’s mapped to an RGB space
For my in class collaborative activity, I aim to create a passthrough environment in which participants have a collaborative viewing experience of a color space
Deliverables may include wiki contributions about implementing passthrough, creating interactive components in Unity VR … etc.
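For HSV, the mapping to RGB described above is already in Python's standard library; a sketch of plotting one HSV value as a point in the RGB cube (CIELAB would need its own conversion formulas, which are not included here):

```python
import colorsys

def hsv_point_to_rgb(h, s, v):
    """Map an HSV color (h, s, v all in [0, 1]) to an RGB point in
    [0, 255]^3, e.g. for placing it inside an RGB-cube point cloud."""
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return (round(r * 255), round(g * 255), round(b * 255))

print(hsv_point_to_rgb(0.0, 1.0, 1.0))  # pure red -> (255, 0, 0)
```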
3/22/23 - 2 hours
I decided to change my project idea:
Overview: simulate butterfly, dichromatic primate (red and green), and trichromatic primate vision with 3D objects in VR + passthrough.
Wiki contributions
Implementing passthrough in Oculus
Creating a multiplayer viewing environment in Oculus
Integrating VR object into a passthrough scene
Milestones (Modify future schedule at each milestone):
4/04
Find five 3D objects to visualize
Make wireframes for the application
Make a document that lists all functionalities I need in the application and software I need to implement them
4/06
Implement passthrough layer
4/11
Add color changing functionalities to the 3D objects I found on 4/04
4/13
Make a document listing resources about how to implement collaborative viewing functionalities for 3D objects with passthrough
In collaborative mode, users can see the same object from different perspectives depending on where they are located in a room
4/18
Implement collaborative viewing functionalities
4/20 to 4/25
Test the collaborative viewing functionality with 1 to 2 other people and debug if necessary
4/27
Write wiki contributions about passthrough + collaborative viewing
5/2
In class activity: visualize the same 3D object and view it from various angles, while changing its colors depending on the selected vision mode
5/4
Make changes to the application based on feedback from the in class activity
4/3/23 - 0.5 hours
Went over the seven scenarios paper again
These are questions that my VR system can answer:
Collaborative data analysis: evaluate whether users can view a 3D point cloud collaboratively from multiple angles and analyze its characteristics
Communication through visualization: evaluate whether my VR system is useful as a learning tool about different types of animal vision
User performance: evaluate (possibly through measuring user response time) how well my visualization tool performs in helping users understand its functionalities compared to another tool with similar functionalities
4/9/23 3 hours
Made wireframes for the VR animal vision app, which can be found here: https://www.figma.com/file/WrjJf3exNySVdiUEYIsWXZ/Untitled?node-id=0%3A1&t=RYaF11qwj71QW5c9-1
Looked for possible software/packages I can use
Oculus integration passthrough tutorial: https://www.youtube.com/watch?v=3H-KUyKwVD4
Photon
Google VR Multiviewer: https://developers.google.com/vr/reference/gvr-ndk-rendering
4/10/23 - 5 hours
Learned about shader programming by following this tutorial (Unity doesn't use GLSL; it uses HLSL within its ShaderLab syntax): https://inspirnathan.com/posts/47-shadertoy-tutorial-part-1/
I can possibly use shaders to change point cloud colors in Unity
4/20/23 - 5 hours
Implemented Passthrough + Multiplayer mode in Unity and pushed all changes to git.
Decided to modify my wireframes as in the image below, based on functionalities that Photon allowed for a Unity project:
4/23/23 - 3 hours
Looked for academic articles about different types of vision. I managed to find ones about color blindness in humans that described trichromatic-to-dichromatic vision conversion, but I couldn't find any for animal vision. I'll probably change the project title to "Vision Simulator" and build collaborative learning environments about color blindness.
These are the articles I found whose formulas worked well with C# code
Protanopia, Deuteranopia: 1711.10662.pdf (arxiv.org)
Tritanopia: Martin Krzywinski - Designing for Color blindness - Canada's Michael Smith Genome Sciences Centre (bcgsc.ca)
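The conversions in these articles boil down to a 3x3 linear transform applied to each color value. A sketch of that transform step in Python — the identity matrix below is only a placeholder standing in for normal vision; the actual protanopia/deuteranopia/tritanopia matrices must be taken from the papers linked above:

```python
def simulate_vision(rgb, matrix):
    """Apply a 3x3 color-vision-deficiency matrix to an RGB triple,
    clamping each channel to [0, 255].

    `matrix` is a placeholder argument here; real CVD matrices come
    from the cited papers, not from this sketch.
    """
    return tuple(
        min(255, max(0, round(sum(m * c for m, c in zip(row, rgb)))))
        for row in matrix
    )

IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # placeholder: normal vision
print(simulate_vision((120, 200, 40), IDENTITY))  # -> (120, 200, 40)
```

The same per-pixel transform is what the Unity shader version of this computes on the GPU.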
4/29/23 - 8 hours
I couldn't get interactive UI elements working in my VR/AR interface, so I changed the design to have 2 collaborative viewing rooms:
One with passthrough
One without passthrough but with web connection through Photon
Each room has 4 RGB spaces visualized as cubes: one each for protanopia, deuteranopia, tritanopia, and trichromatic vision.
5/1/23 - 8 hours
Trying to solve these issues
The app keeps crashing -> Resolved by putting fewer points in each viewing room
Can't get UI elements to interact with the controllers -> Resolved by adding the OVRRaycaster script to Canvas and changing the canvas' layer to Default based on this video: Create a Canvas Pointer in Unity for Oculus - Bing video
Player hand looks too large in the collaborative viewing room without passthrough
Made the in-class activity handout
5/2/23 - 2 hours
Made the final presentation for project 2 and completed in-class activities
5/3/23 - 3 hours
Made a Wiki contribution about implementing passthrough in Oculus
Made the final presentation for Project 2
5/4/23 - 4 hours
Fixed the issue where the app with the shader couldn't be installed in my headset without crashing (my disk was too full)
Downloaded an apple prefab from Unity's Asset Store and added a shader to it. Still have 2 vision modes (blue blind, red blind) to implement.
5/8/23 - 8 hours
Implemented the remaining 2 vision modes
Changed the app functionality based on these suggestions from people during my in-class activity:
Difficult to tell which RGB space represents what type of color blindness.
Would have been better if there had been a 3D object that shows an example of what the world looks like with a certain type of color blindness.
Changed my Git repository name since the app is no longer about animal vision.
Updated Project 2 presentation
These are photos and videos from the vision simulator app:
Main Interface
Photon viewing environment
Passthrough viewing environment (Please note that it's not possible to capture passthrough layer with Oculus' internal cameras)
5/10/23 - 3 hours
Wrote wiki contributions and updated journal
5/12/23 - 3 hours
Made final presentation poster
5/15/23 - 4 hours
Edited the final poster I made on 5/12 and linked it at the top of this journal
5/16/23 - 2 hours
Made a wiki contribution about Unity shader
5/17/23 - 1 hour
Filled out self evaluation and completed public demo instructions