Colby Rees' Journal
Course Learning Goals
Before | After | Goal
------ | ----- | ----
1 | 4 | Goal 1: articulate AR/VR visualization software tool goals, requirements, and capabilities
1 | 4 | Goal 2: construct meaningful evaluation strategies for software libraries, frameworks, and applications; strategies include surveys, interviews, comparative use, case studies, and web research
1 | 4 | Goal 3: execute tool evaluation strategies
1 | 4 | Goal 4: build visualization software packages
1 | 4 | Goal 5: comparatively analyze software tools based on evaluation
1 | 4 | Goal 6: be familiar with a number of AR/VR software tools and hardware
2 | 5 | Goal 7: think critically about software
3 | 5 | Goal 8: communicate ideas more clearly
1 | 4 | Goal 9: understand when to use Mixed Reality and passthrough capabilities to enhance data visualization
PROPOSALS
Project 1 Proposal: Project 1 Proposal Plan
Presentation for Project 1 Proposal: Link to presentation
Presentation for Project 1 Update: Link to presentation
In-Class Activity: Link to activity (activity includes instructions on how to download my spatial anchor and grab and scale implementations)
End Presentation for Project 1: Link to presentation
Project 2 Proposal: Project 2 Proposal Plan
Presentation for Project 2 Proposal: Link to presentation
Poster <ADD LINK>
In-class Activity <ADD LINK>
Public Demo <ADD LINK>
ALL OTHER WIKI CONTRIBUTIONS
CONTRIBUTION 1 Spatial Anchor page linked here (created)
CONTRIBUTION 2 Added Smithsonian Open Access to Scientific Data page linked here (updated page)
CONTRIBUTION 3 Spatial Anchors in Unity tutorial page linked here (created)
CONTRIBUTION 4 Meta Building Blocks in Unity page linked here (created)
CONTRIBUTION 5 Grab Interactions in Unity page linked here (created)
CONTRIBUTION 6 AR Visualization Comparisons page linked here (created)
CONTRIBUTION 7 Sideloading APKs Using SideQuest page linked here (updated a photo)
Google Earth VR
House where I grew up
Brown Apartment
Significant Location

Recording of journey from penultimate location to the final location on Google Earth VR
Google Earth Web
House where I grew up
Brown Apartment
Significant Location

Recording of journey from penultimate location to the final location on Google Earth Web
HOURS SUMMARY
Total: 84.67 hours
HOURS JOURNAL
1/24/25 - 1.5 Hours
Set up my individual journal.
Joined the class slack and introduced myself.
Reviewed the course homepage and some previous course projects.
1/25/25 - 3.5 Hours
Read through the wiki and identified nine potential changes
Three changes should each require ~10 minutes to complete:
On the VTK page, remove the VTK Mailing List link (it no longer points to an active webpage) and replace it with the VTK Community Support page. Also, add a link to the VTK “Getting Started” webpage for the installation guide. COMPLETED
Add an outline/description to the VR in Psychology page and the VR in Medicine page.
Under Applications of VR, the list of applications links to the respective pages but does not include all of the subsections. Update the list so it reflects every Applications of VR page.
Three changes should each require ~1 hour to complete:
On the VR Development Software page, provide an overview of what VR development software is. The page currently goes directly into examples.
The VR Modeling Software page also goes directly into examples. To give readers a better sense of what they will find and why VR modeling software is useful, add an overview and update the Comparison Table.
On the What is VR page, include more papers and conference information.
The final three changes should each require ~10 hours to complete:
On the VR Visualization Software page, update the Features and Time Taken to Complete Tasks charts to cover all of the visualization software included in the section. Currently, these comparison charts only represent five tools.
Under Applications of VR, add a page on VR in Agriculture. While doing online research, I came across multiple scientific research projects in agriculture that are using VR.
Interesting links I found discussing VR in agriculture:
Add specific information about passthrough capabilities for different software tools. This would require reviewing all of the tools already listed in the Wiki and then checking whether each has had updates related to passthrough.
Completed the first 10 minute task (marked completed above).
Looked further through the Wiki to understand software use and looked through past students' projects.
Began researching collaborative VR software.
1/26/25 - 1.5 Hours
Read Kenny Gruchalla's bio and added questions to the board
Researched passthrough technology and Mixed Reality
Researched collaborative VR software
Sentio VR focuses on architecture and design - allows for collaboration in real-time to review designs, annotation, and measurements
It seems more like the users have the ability to comment on designs in real time instead of actually designing collaboratively in real time.
Vizible is a VR collaboration software that allows users to create their own spaces, gather in those spaces, and replay interactive sessions to give users the ability to edit
1/28/25 - 0.5 Hour
Set up my Meta Quest 3, up to the step of connecting to the virtual machine
1/29/25 - 4.5 Hours
1.5 hours
Finished setting up my Meta Quest 3
Read through multiple previous student projects, including:
Added the course learning goals to my journal with values for where I am now on each goal
3 hours
Explored potential software to use for my project
I spent the most time on this step because I was trying to find software with passthrough capabilities. I kept finding that the available software with passthrough and mixed reality mainly focuses on virtual meeting spaces, gaming, and design. I also spent time looking at data visualization software on the Wiki and doing other online research to get a better idea of the capabilities. For most of the software below, I am still unsure if or how passthrough could be used, but I think this will be interesting to look into further.
Software I noted:
Project ideas:
Represent predicted sea level rise over time (data for NYC here: https://data.cityofnewyork.us/Environment/Sea-Level-Rise-Maps-2050s-100-year-Floodplain-/hbw8-2bah) and use augmented reality to show how high the water level would be in the room the user is in (see the sketch after this list).
Compare two data visualization tools on their ability to interact with items in your physical space (depicting airflow, heat, or sunlight and how the walls of the room change its behavior).
Compare data visualization tools on their ability to let multiple users collaborate on the same visualization at the same time (this needs a data set where it makes sense for changes to be occurring; I will brainstorm this idea further. I am also interested in transportation data, which could be incorporated here).
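As a rough illustration of the first idea, here is a minimal Unity (C#) sketch that spawns a translucent plane at a given flood height in the user's room. The FloodPlane class and the floodHeightMeters value are hypothetical placeholders, not taken from the dataset, and a transparency-capable material would need to be assigned in the editor.

```csharp
using UnityEngine;

// Hypothetical sketch: place a translucent horizontal plane at the
// predicted flood height so a passthrough user can see how high the
// water would reach in their room. floodHeightMeters is an assumed
// placeholder for a value read from the NYC sea level dataset.
public class FloodPlane : MonoBehaviour
{
    [SerializeField] private float floodHeightMeters = 0.9f; // assumed value

    private void Start()
    {
        GameObject plane = GameObject.CreatePrimitive(PrimitiveType.Plane);
        plane.transform.position = new Vector3(0f, floodHeightMeters, 0f);
        plane.transform.localScale = new Vector3(2f, 1f, 2f); // default plane is 10x10 m, so ~20x20 m

        // Tint the plane watery blue; a shader that supports transparency
        // must be assigned for the alpha value to take effect.
        plane.GetComponent<Renderer>().material.color = new Color(0.2f, 0.4f, 1f, 0.35f);
    }
}
```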
2/1/25 - 2.5 Hours
1 hour
Completed Google Earth activity, updated my journal with the VR and Web photos, and completed comparison form
Further explored Google Earth VR (this was very cool!)
30 min
Completed Bezi Lab
1 hour
Installed DinoVR on my Paperspace machine
Read DinoVR paper
I found this paper very interesting and important for considering the user experience in VR. While not directly the topic of the paper, it got me thinking more about accessibility within VR scenes and how understanding different people's visual abilities helps in developing accessible applications with options that accommodate more users.
How can text be best configured when using augmented reality? I am wondering how text could adapt for good readability while a user has passthrough enabled.
2/3/25 - 4.5 Hours
Looked through more previous students' projects to get inspiration for my project
Researched, on the wiki and other internet sources, data visualization software that has passthrough / augmented reality capabilities
Many of the software tools I came across were expensive or still in development. This helped me learn about the capabilities that are available, but I need to continue searching for software that is more accessible for my project. I think a tool's accessibility is an important feature to note on the wiki.
Software / tools I found interesting:
BadVR - While I do not seem able to get my hands on this, I still think it is useful to add some information about it to the wiki, and I plan to do so this week
Looked into potential datasets:
Also looked at data broadly via the links on this site: https://www.societyforscience.org/research-at-home/large-data-sets/
Project Idea 1: Compare two applications on their ability to let users collaborate on a data visualization while in the same space, using passthrough to see the other headset wearers and placing the data representation in the room with them.
Things I will do during the project:
Investigate the best tools that allow for passthrough capabilities and understand what kind of data could benefit from this representation (so far the software I have encountered is costly, so I will try to find software that is affordable or free)
Implement data visualization in the software
Experiment with collaboration capabilities
Class activity for the project: Have classmates pair up and try collaborating in each of the two applications. They will fill out a form after their experience.
Potential deliverables: Under the VR Visualization Software tab of the Wiki, there should be a passthrough-specific tab. There I will add a Wiki page comparing the two software tools/packages, including general information as well as the difficulty of collaboration and quality-of-experience reports. I will also add the data I use, and its ease of use and quality, to the scientific data page.
Project Idea 2: Use a static physical object in a room as the place where the data visualization always appears, and have users move around the object. Compare this visualization experience with an experience without a physical object.
Things I will do during the project:
Investigate the VR headset's ability to recognize a physical object on which to place a data visualization
Create data visualizations for the comparison
Conduct a comparison of the physical object scenario vs the data visualization alone
Class activity for the project: The class will experience both situations: the data visualization on the physical object and the visualization alone. They will then fill out a form on what they preferred in each scenario.
Potential deliverables: Under the VR Visualization Software tab of the Wiki, there should be a passthrough-specific tab. There I will add a Wiki page comparing the two scenarios, including analysis of the class activity feedback, plus a Wiki contribution on how to use physical objects in data visualization and the tools to do so. I will also add the data I use, and its ease of use and quality, to the scientific data page.
Project Idea 3: Compare whether users prefer to view data in VR with or without passthrough. Motivated by the question of which data representations benefit from being placed in augmented reality and when it is not effective.
Things I will do during the project:
Investigate software options for data visualization with passthrough (comparing costs and capabilities could be a Wiki contribution)
Create data visualizations for the comparison
Conduct a comparison of passthrough visualization vs just data visualization alone
Class activity for the project: Have classmates view data with and without passthrough/augmented reality. They will then fill out a form on what they liked and disliked about both experiences: when did they feel most immersed, and when was the data most clear to them?
Potential deliverables: Under the VR Visualization Software tab of the Wiki, there should be a passthrough-specific tab. There I will add a Wiki page comparing data visualizations with and without passthrough, including the scenarios and capabilities each is more useful for. I will also add the data I use, and its ease of use and quality, to the scientific data page.
All three of my ideas are similar; each focuses on a different way to combine passthrough capabilities with data visualization. I am still trying to figure out which software would be best for my project, so I will likely attend this week's TA hours to gain clarity on this component. For the data set, since the projects center on capability rather than a specific feature of the data, I should have some flexibility in my choice, but I need to think about what kind of data would benefit most from these scenarios.
Brainstorming software evaluation metrics:
Cost
Ease of installation
How long to get started
Code vs low code vs no code
Data it works best with
Existing resources to help
2/5/25 - 2.5 Hours
Completed the DinoVR feedback form
Attended TA hours to get a better idea on my project idea and how to best build upon the idea
I was initially struggling to land on a project idea because I was thinking about a dataset first and then trying to figure out how that dataset would work within AR. I have decided to take a research approach to how humans interact with AR by focusing on QR codes as spatial anchors. I am asking if and when it is easier for someone to navigate around a physical object than to use controllers, and how this could be used in a collaborative space where everyone better understands what the others are looking at and from what angle.
Updated my project concept in the activity board.
Provided feedback on other students' project ideas (Eunjin and Connor).
Developed outline for project plan:
2/11: Identified software / tools where QR codes are used as spatial anchors and identified data representations to be used on the spatial anchors
2/13: Research on the use of spatial anchors and where they are currently being used
Start working with identified software / tools and data representations
2/20: Wiki page created on spatial anchors’ current uses
2/25: Created QR code as spatial anchor with 3D representation
2/27: In class activity ready for class to experience using the spatial anchors and gain feedback
3/04: Make adjustments to improve the human experience of this scenario
3/06: Wiki page created on the pros and cons of using QR codes as spatial anchors and how to use QR codes as spatial anchors
Started researching spatial anchors
Melvin shared this helpful link: https://developers.meta.com/horizon/documentation/unity/unity-spatial-anchors-overview/
2/8/25 - 3.25 Hours
15 min - Added my DinoVR screenshots to the activity board
3 hours
Researched spatial anchors in AR for the Meta Quest and added the resources and references to my project plan
Created a Wiki page for my project 1 proposal plan and linked it at the top of my journal
Through my research, I found that while other headsets may allow QR code scanning while in an app, the Meta Quest does not because of privacy concerns. Meta only allows QR code scanning for setting up WiFi, as noted at the bottom of this link: https://www.meta.com/help/quest/1503826183789419/
I also read forums discussing this capability and why it is not currently possible because of privacy
Updated my project plan to be using spatial anchors that the user selects, as detailed here: https://developers.meta.com/horizon/documentation/unity/unity-spatial-anchors-overview/
Added further detail on my in-class activity, milestones, Wiki contributions, and deliverables
Created a plan for tomorrow which includes solidifying my data to be used in my visualization
2/9/25 - 2.75 Hours
1 hour - Researched data to be used in my project and found the Smithsonian Open Access, which contains open-access 3D models of a variety of museum items including bones, plants, and aircraft, as well as models of space, including exploded stars
Identified the three models I would like to use in my project. Beyond comparing spatial anchors in AR with the Smithsonian mobile AR feature, I plan to determine whether a particular type of data works better in each scenario
Added the Smithsonian Open Access to the scientific data page on the Wiki
1 hour - Further developed and refined my project plan, including adding scientific data information, making sure the 3D models will work in Unity by checking the file type, and adjusting my milestones
Completed self-feedback rubric:
The proposed project clearly identifies deliverable additions to our VR Software Wiki
Rating: 4
Notes: My project proposal wiki page contains a table that includes the wiki pages I plan to create and when I plan to do so.
Involves passthrough or “augmented” in VR
Rating: 5
Notes: I am using spatial anchors and passthrough is needed for this to function as planned.
The proposed project involves large scientific data visualization along the lines of the "Scientific Data" wiki page and identifies the specific data type and software that it will use
Rating: 4
Notes: I identified my data as three 3D models from the Smithsonian Open Access library that range in scientific disciplines. I will be using Unity in order to initiate spatial anchors and my data is in a Unity compatible file format (.obj). I have identified Unity tutorials that I linked as resources to my project plan and I plan to follow them to begin using spatial anchors with my data visualizations.
The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class
Rating: 4
Notes: If development takes longer than anticipated, my timeline will slip, but I have built in time to mitigate this situation. My project plan includes specific milestones as well as links to tutorials I will follow on schedule. I also allow time to analyze my observations so I can contribute meaningful analysis to the Wiki.
The proposed project explicitly evaluates VR software, preferably in comparison to related software
Rating: 4
Notes: I am evaluating spatial anchors in AR using Unity, and I will be comparing this with the Smithsonian mobile AR feature. I am also comparing 3D models from different disciplines to understand what type of data works better in each situation.
The proposed project includes an in-class activity, which can be formative (early in the project) or evaluative (later in the project)
Rating: 5
Notes: The in-class activity is included at the top of my project plan page. It will be important later in my project for understanding my classmates’ experiences in each scenario. I will use this feedback, as well as my own analysis, to create a comparison table for the Wiki.
The proposed project has resources available with sufficient documentation
Rating: 4
Notes: In my project plan, I include many resources on the software I will be using, tutorials to understand the software, and information on the capabilities when using the Meta Quest 3. I have documentation for passthrough, spatial anchors, shared spatial anchors (which is a potential thought for project 2), and Unity.
45 min - Created the presentation for Tuesday's class and practiced keeping it to 3 minutes
2/12/25 - 2.75 Hours
15 min - Journal evaluation
Journal activities are explicitly and clearly related to course deliverables
Rating: 5
Notes: Journal activities relate to the homework assignments for each upcoming class. Now that project 1 has officially started, I have made sure to state which activities are for my project.
Deliverables are described and attributed in wiki
Rating: 4
Notes: I have links at the top of my journal for specific project deliverables. For homework assignments, I make sure to note which assignment I completed as well as where the information is stored (if the task has a location). As I make more wiki contributions over the course of my project, I will update the “All other wiki contributions” section to reflect them.
Report states total amount of time
Rating: 5
Notes: Total time is reported, daily time is reported, and where appropriate, time is broken down by activity
Total time is appropriate
Rating: 5
Notes: I have gone through to check and my total time adds up correctly.
30 min - Updated project plan to include milestones up to 3/13 for the final presentation (originally I thought the last day of the project was 3/6)
Updated class activity data on class timeline to 3/4
2 hours - Worked to accomplish milestones for 2/13
Researched spatial anchors for Meta Quest and implementation (included important links to resources in my project plan page linked at the top of my journal)
Tested Smithsonian visualizations on their mobile website in AR mode to understand the mobile capabilities
This also helped me think more clearly about how my implementation with spatial anchors can help with understanding the data, as the mobile version is not very immersive
Created a Wiki page for Spatial Anchors here (no information added yet; content will be added in future weeks)
2/17/25 - 4.25 Hours
1 hour
Set up Unity for XR development using the following tutorial as guidance: https://developers.meta.com/horizon/documentation/unity/unity-project-setup
It took some time for everything to download but I was able to get Unity set up for my next steps
2.5 hours
Followed a passthrough tutorial for Unity
I initially planned to use this tutorial https://developers.meta.com/horizon/documentation/unity/unity-passthrough-tutorial but I ran into issues towards the end, where I kept getting different errors
I then did further research and recalled Unity Building Blocks. I followed a tutorial for setting up passthrough in Unity using the Building Blocks, and this was more straightforward (https://developers.meta.com/horizon/documentation/unity/unity-passthrough-tutorial-with-blocks/)
I was able to render a sphere in the center of the room with passthrough activated. I used the Meta XR Simulator, as suggested by Meta, to see what I created in a simulated space on my desktop instead of having to take my headset on and off. Then, after some trial and error, I was able to view the sphere I placed in my scene through my headset with passthrough activated (a minimal sketch of the passthrough setup follows below). This YouTube video was helpful for understanding Building Blocks further: https://www.youtube.com/watch?v=PmDPQe-mnnI.
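For reference, here is a minimal sketch of roughly what the passthrough Building Block wires up, assuming the Meta XR Core SDK's OVRManager and OVRPassthroughLayer components are in the project. The PassthroughSetup class name is mine; the Building Block normally does this configuration for you in the editor.

```csharp
using UnityEngine;

// A rough code equivalent of the passthrough Building Block's setup,
// assuming the Meta XR Core SDK. Passthrough Support must also be
// enabled on the OVRManager component (Quest Features section), and
// the camera's background should clear to transparent black.
public class PassthroughSetup : MonoBehaviour
{
    private void Start()
    {
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay; // room feed renders behind virtual content
        layer.textureOpacity = 1.0f;                         // camera feed fully visible
    }
}
```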
45 min
Updated Spatial Anchors page to include a description and information on use in Unity. I will continue to build this content as I learn more through my research. I also included necessary resources and references.
Created the Spatial Anchors in Unity Tutorial page. I have not added content yet, but I will update it to include the methods and steps I go through to implement spatial anchors.
2/18/25 - 25 min
25 min
Completed installation steps for ParaView and Ben's Volume Rendering applications
2/22/25 - 4 hours
1 hour
Completed in class activity for Ben's Volume Renderer
I had a lot of difficulty before I was able to see the visualization within VR. It took many attempts of re-entering Paperspace, entering and exiting SteamVR, and trying to open the visualizations. I'm not exactly sure what was failing, but I was finally able to see the visualization in the end. I updated the activity board with my screenshot and completed the feedback form.
3 hours
Worked on project milestones
As I go through each milestone, especially now that I am entering Unity development, I am taking frequent photos and videos so I am well prepared for future presentations and have a reference point for myself.
In Unity, I have now gotten spatial anchors working in my test through the Spatial Anchor Building Blocks. I was able to set spatial anchors in my room, have an item appear at each anchor, and exit the program. When I re-enter the program, these anchors are remembered (a minimal sketch of this create-and-save flow appears at the end of this entry). I have found it particularly interesting to place the same item at different angles so I can compare each of them at once. I also like that I can walk around the object.
My next goal is to import my 3D models into Unity so I can test them in the space. So far, I have only been able to get one visualization to appear at a time, so I will work towards getting all the representations anchored.
As I have been working with the Unity Building Blocks, I think adding a Building Blocks page to the Wiki will be a good next step as I continue my project
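Below is a minimal sketch of the create-and-save flow that the Spatial Anchor Building Block wraps, assuming the Meta XR Core SDK's OVRSpatialAnchor component. The AnchorPlacer class and the PlayerPrefs key are hypothetical, and the exact save method name varies between SDK versions.

```csharp
using System.Collections;
using UnityEngine;

// Illustrative only: spawn a model at a controller pose, attach a
// spatial anchor, and persist it so it is remembered across sessions.
public class AnchorPlacer : MonoBehaviour
{
    [SerializeField] private GameObject modelPrefab; // e.g., one of the Smithsonian models

    // Call with the controller's pose when the spawn button is pressed.
    public void PlaceAnchor(Vector3 position, Quaternion rotation)
    {
        GameObject instance = Instantiate(modelPrefab, position, rotation);
        var anchor = instance.AddComponent<OVRSpatialAnchor>();
        StartCoroutine(SaveWhenReady(anchor));
    }

    private IEnumerator SaveWhenReady(OVRSpatialAnchor anchor)
    {
        // Anchor creation is asynchronous; wait until the runtime confirms it.
        yield return new WaitUntil(() => anchor.Created);

        // Persist the anchor (recent SDKs expose SaveAnchorAsync(); older
        // versions use Save()), then remember its UUID for the next launch.
        anchor.SaveAnchorAsync();
        PlayerPrefs.SetString("savedAnchorUuid", anchor.Uuid.ToString());
    }
}
```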
2/23/25 - 4.75 hours
2 hours
Updated Wiki pages
Updated the Spatial Anchors wiki page with a description of spatial anchors, their use in Unity specifically, overall capabilities, and collaborative spatial anchors
As I have been developing spatial anchors, I have been using the Unity Building Blocks. I have updated my project plan to include creating a Wiki page on building blocks and plan to do so by next week
Updated the Spatial Anchors Tutorial wiki page with an overview, requirements, passthrough tutorials, and spatial anchor steps. I included links to the material I reference and provided other tutorials for users who are not using Building Blocks. I will continue to add to this page as I continue developing.
45 min
Created a presentation to update the class on my project 1 progress. The presentation is linked here as well as at the top of this page under project information
2 hours
Worked on project development in Unity. I imported the Smithsonian models I plan to use and was able to have all three models appear at once, each at a different anchor. This took me a while to figure out; for some time I could only get one model to show, but now I can display all three. The build does not recognize my controllers every time, which leads to glitches, so I will work on fixing this if possible. I have taken screenshots and videos to include in my in-class presentation.
I need to learn how to export my finished project in a way that can be used for my in-class activity.
2/25/25 - 1.5 hours
1.5 hours
Created a Meta Building Blocks in Unity Wiki page that includes details on the current available building blocks, their given descriptions, and how to use the blocks within Unity
When creating this page, I encountered even more Building Blocks and now better understand the many options they give Unity developers. If the project timeline allows, I am going to try implementing a few more of the Building Blocks in my project to see if they can enhance it.
2/26/25 - 2.5 hours
1 hour
Researched different ways to export my Unity project (the APK file) so the class can use it during the in-class activity
Read through past in-class activities and found multiple students using SideQuest: Sideloading APKs Using SideQuest
Messaged Melvin on Slack about this and he helped confirm that this was an effective method
1.5 hours
Continued developing in Unity
I was experiencing some controller glitches, but after updating the buttons used to spawn the anchors, things have been smoother
I tried incorporating some other Building Blocks to enhance my project, but the ones for collaborative spatial anchors require a Meta organization ID that I do not think I have. Given the project timeline and that this was an extra idea, I will save it for future implementation (potentially project 2)
2/28/25 - 2 hours
1.5 hours
Added controller instructions to my Unity project so users will have the controller commands visible while using it
Researched the Grabbable Building Block and realized that comparing spatial anchors with grabbing and placing items would be useful for understanding user experience with different data models
Started to implement the Grabbable Building Block
0.5 hour
I had a friend who has not used VR before try out my spatial anchor project and used her feedback to adjust the scale. This also helped me learn how to instruct someone and confirm the controller commands were clear.
3/1/25 - 5.5 hours
0.5 hour
Refined my spatial anchor visualization, which included scaling, and did multiple tests of placing and removing the three data models
3.5 hours
Developed and implemented an additional scene in my Unity project that allows users to grab and place two of the models in their space. I incorporated the ability to scale and rotate the models (a minimal sketch of the scaling interaction follows below).
This took me a long time to get working correctly. I tried it with all three of my models, but that looked too busy and led to visual glitches, so I decided to use only two of the models.
I think it will be useful for my in-class activity to have half the class do the grabbing and scaling and half do the spatial anchors to compare usability, mental strain, and overall functionality.
I will add a new Wiki page on the Grabbable Building Block that includes a tutorial. I plan to add this within the next few days as I figure out the best steps to include.
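Here is a minimal sketch of the kind of thumbstick-driven scaling layered on top of the grab interaction, assuming the Meta XR SDK's OVRInput. The ThumbstickScaler class name and the parameter values are mine, not from a tutorial.

```csharp
using UnityEngine;

// Illustrative only: push the right thumbstick up/down to grow or
// shrink the model this script is attached to.
public class ThumbstickScaler : MonoBehaviour
{
    [SerializeField] private float scaleSpeed = 0.5f; // scale change per second at full deflection
    [SerializeField] private float minScale = 0.1f;
    [SerializeField] private float maxScale = 3.0f;

    private void Update()
    {
        float input = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick).y;
        float factor = 1f + input * scaleSpeed * Time.deltaTime;

        float next = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
        transform.localScale = Vector3.one * next; // uniform scale to avoid distortion
    }
}
```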
1.5 hours
Added a Grab Interactions in Unity Wiki page - will need to update it with a tutorial and more descriptive information
Updated my existing Wiki pages with new information I have learned and added clarity
3/2/25 - 6.5 hours
0.5 hour
Created a Spatial Anchor Comparison wiki page to compare using spatial anchors with using grab interaction and with using the Smithsonian mobile AR feature
4.5 hours
In order to have both my spatial anchor scene and my grab interaction scene tested in class, I needed to put them in separate Unity projects. This took a long time of loading and testing to make sure all elements transferred over properly.
Updated a screenshot in the Sideloading APKs Using SideQuest Wiki page to reflect the updated Meta interface
1.5 hours
Prepared my project for the in-class activity by making sure SideQuest works with my files and writing instructions for the class to follow (especially because steps will need to be completed before class)
Created my survey for the class. I currently think it may be too long, so I will test and review it before my activity on Thursday to make sure I get good results for analysis without taking up too much class time
Planned to meet with Aarav to review and test my activity before Thursday
3/3/25 - 2.5 hours
1.5 hours
Updated survey and instructions for my in-class activity
Added two tutorials to the Grab Interactions in Unity page
1 hour
Met with Aarav to test run both of our in-class activities
Updated my instructions for more clarity
3/4/25 - 2 hours
1 hour
Updated survey and instructions for in-class activity to include the mobile AR website activity
My activity and survey are long, but I think most students should be able to complete at least two of the three activities in class, which should provide a good amount of information for analysis
1 hour
Reviewed and tested my activity and survey
Posted the pre-class steps onto the Wiki timeline for the class to complete
3/9/25 - 3 hours
1 hour
Analyzed class activity feedback and reflected on the notes I took during the activity
2 hours
Used my own analysis and the class feedback from my activity to add to my AR Visualization Comparisons wiki page
I created tables that compiled the survey results and documented a comparison of features
3/11/25 - 3.5 hours
1.5 hours
Reviewed the Wiki pages I created, updated them for clarity, and made sure my final data is represented in my comparison charts
2 hours
Created my final presentation
Went back through my implementations and took photos and videos to add to the presentation
Reflected on the work I did for this project to understand the challenges I went through and my final takeaways
Added my project 2 ideas to my presentation
Sent presentation to Melvin and David
3/12/25 - 3 hours
1.5 hours
Reviewed all project materials, making sure Wiki pages are complete, project materials are linked where needed, my journal is updated, and the activity board has all of my content from class activities
1.5 hours
Practiced my presentation - made sure what I have in my presentation is accurately reflected in my Wiki pages
Completed self-evaluation and sent to David and Melvin
Used the self-evaluation to think further about project 2 ideas, especially the AR Interaction Toolkit idea I am still brainstorming
3/16/25 - 4.5 hours
2 hours
Read the Seven Scenarios paper, gained an understanding of the seven scenarios, thought about how I could have incorporated this method into my project 1, and considered how the scenarios will work best for my project 2 design
I am still unsure about my project 2, but I have two quite different ideas: one focuses more on user actions in AR, while the other focuses on large data, and I am trying to figure out the best way to incorporate AR
The first idea is an AR interaction toolkit (a more enhanced version of my project 1) where users have a menu of actions they can perform on a data model in AR - grabbing, scaling, pointing, tapping, throwing - to better understand it (potentially giving the model a realistic weight so that when thrown, this is better understood)
For this idea I could use existing 3D models as I did for project 1, but I am also considering other data I might have to build a model for, though I am cautious of the project timeline
For my second idea, I am very interested in bird migration data; I would represent the migration patterns of 1 - 3 bird species in AR
When looking at the 2D online maps, the migration patterns were clear to see, but I wish the visualization gave me a better idea of how many birds were traveling at once
I think using AR would add depth and density to the visualization while letting you place a map in front of you to immerse yourself in the data
2.5 hours
I am still deciding which idea to move forward with, but as I try to more fully develop my second idea, I had it in mind when reading the seven scenarios paper and drafting the project plan below:
Overall idea: Using bird migration data (linked above), create an AR visualization showing the migration of 1 to 3 bird species over a calendar year. The visualization would include a map you could place in front of you within AR (or, if I use Unity, I would like to try the window Building Block, which places a visualization window in your scene using passthrough and could be effective). I want to show how a bird species migrates over the four seasons and whether viewing in AR helps detect patterns not seen in 2D. A minimal sketch of mapping migration coordinates onto such a map follows below.
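Here is a minimal sketch of how migration coordinates could be mapped onto a tabletop map in Unity, assuming a simple equirectangular projection. The MigrationPlotter class, the geographic bounds, and the marker prefab are placeholders rather than details from a chosen dataset.

```csharp
using UnityEngine;

// Illustrative only: convert a latitude/longitude record into a local
// position on a map plane placed in the user's room.
public class MigrationPlotter : MonoBehaviour
{
    [SerializeField] private Transform mapPlane;    // the AR map in the scene
    [SerializeField] private GameObject birdMarker; // marker prefab for one sighting

    // Geographic bounds the map covers (assumed values for the Americas).
    private const float MinLat = 7f, MaxLat = 70f;     // degrees
    private const float MinLon = -170f, MaxLon = -50f; // degrees

    public void Plot(float latitude, float longitude)
    {
        // Normalize lat/lon into [0,1] across the map, then into the
        // plane's local space (Unity's default plane spans 10x10 units).
        float u = Mathf.InverseLerp(MinLon, MaxLon, longitude);
        float v = Mathf.InverseLerp(MinLat, MaxLat, latitude);
        Vector3 local = new Vector3((u - 0.5f) * 10f, 0.02f, (v - 0.5f) * 10f);

        Instantiate(birdMarker, mapPlane.TransformPoint(local),
                    Quaternion.identity, mapPlane);
    }
}
```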
When reading the seven scenarios papers, these were my takeaways and questions corresponding with my project idea:
Understanding Environments and Work Practices (UWP)
It would be important for me to understand who this tool is built for, what they need, what kinds of visualizations aid in understanding this data, and what widely used resources already exist. These questions will help me contribute a study to the wiki on understanding the material before picking software.
Evaluating visual data analysis and reasoning (VDAR)
This scenario would likely lead to a software comparison for the wiki, where I look into visualization tools and analyze how they support my data and planned tasks. By understanding the tools, I can move forward knowing my capabilities.
Evaluating communication through visualization (CTV)
When visualizing bird migration patterns, it is important to understand whether the method of visualization is effective and what is being communicated to the user. It is also important to understand how this varies between a 2D web version and AR.
Evaluating collaborative data analysis (CDA)
Even if what I create is not explicitly collaborative, I need to ask whether the visualization can be used as a tool for collaborative analysis. With the goal of a “joint conclusion or discovery”, how can the visualization help reach this goal?
Evaluating user performance (UP)
This scenario has me thinking about my project 1, where I considered the different features and how they led to different user performance, from spatial anchors to grab and scale
In project 2, I could give one prototype some features and another prototype different features; by comparing the user performance results, I would get a better understanding of the features
Evaluating user experience (UE)
This scenario is one of the goals I had for my project 1 in-class activity. Through written surveys and verbal feedback during the discussion, I was able to gain the class’ feedback on their experience and include it in my AR visualization comparisons for the wiki.
In project 2, this scenario would help in gathering people’s feedback on expectations and overall experience which would help inform a wiki contribution that includes usability testing results.
Evaluating visualization algorithms (VA)
This does not fit directly into my project idea.
My project 2 plan includes both ideas; once I pick an idea on Tuesday, I will solidify the milestones
3/17/25 - 1.5 hours
30 min
Reviewed my project 1 presentation to prepare for class
1 hour
Brainstormed for project 2 which included looking further into bird migration data
Considered how I could use bird migration data to create an effective AR visualization
3/18/25 - 3 hours
3 hours
Created my project 2 proposal presentation and sent to Melvin and David (presentation linked in project section of journal)
Developed project plan with proposed wiki pages and deliverables
Planned out goals for in-class activity
Evaluated my project plan with the below rubric:
Clearly identifies deliverable additions to our VR Software Wiki
Score: 5
how/why? Identified multiple wiki contributions that focus on AR implementations, general overviews, tutorials, and analysis
Involves passthrough or “augmented” in VR
Score: 5
how/why? The two visualizations I plan to implement will be in AR, and AR will be essential for the user interactions
Involves large scientific data visualization along the lines of the "Scientific Data" wiki page and identifies the specific data type and software that it will use
Score: 3
how/why? Identified bird migration data sets that include migration location (longitude and latitude), but I still need to confirm which bird species to focus on and how well these data sets work with my software (planning on Unity)
Has a realistic schedule with explicit and measurable milestones at least each week and mostly every class
Score: 4
how/why? I have my milestones planned out, but I am cautious that I may not have time to develop both visualizations if one is particularly time consuming. I hope my previous Unity experience will help with speed.
Explicitly evaluates VR software, preferably in comparison to related software
Score: 3
how/why? I am planning on using Unity, but in my initial research I will look at other software for map visualization. If I find substantial information, I will be able to produce comparison results. I will compare my two visualizations, but they will (as of now) use the same software.
Includes an in-class activity, which can be formative (early in the project) or evaluative (later in the project)
Score: 5
how/why? In-class activity is planned with underlying questions outlined.
Has resources available with sufficient documentation
Score: 3
how/why? I have identified multiple data sets, so I will need to finalize this selection. I have existing Unity documentation, but I still need to confirm that what I would like to do is possible in Unity, which will require further documentation.