Goal 1: articulate VR visualization software tool goals, requirements, and capabilities;
Goal 2: construct meaningful evaluation strategies for software libraries, frameworks, and applications; strategies include surveys, interviews, comparative use, case studies, and web research;
Goal 3: execute tool evaluation strategies;
Goal 4: build visualization software packages;
Goal 5: comparatively analyze software tools based on evaluation;
Goal 6: be familiar with a number of VR software tools and hardware;
Goal 7: think critically about software;
Goal 8: communicate ideas more clearly;
Goal 9: demonstrate VR tools to non-users;
Project 1 Proposal <Project 1 Plan>
Presentation for Project 1 Proposal <Project 1 Proposal Presentation>
In-class Activity <2D and 3D network>
End Presentation for Project 1 <Project 1 Final Presentation: Visualizing Community Network Graphs in 2D vs. 3D>
Project 2 Proposal <ADD LINK>
Presentation for Project 2 Proposal <ADD LINK>
Poster <ADD LINK>
In-class Activity <ADD LINK>
Public Demo <ADD LINK>
ALL OTHER WIKI CONTRIBUTIONS
CONTRIBUTION 1 [2023 Updates on Twitter API] <Twitter API Update Contribution>
CONTRIBUTION 2 [Redirected all the children pages under Student research page] <Student Research>
CONTRIBUTION 3 [wrote a tutorial/introduction page on 2D network visualization using Gephi] <2D Network Visualization>
CONTRIBUTION 4 [Wrote an informative wiki page on community network visualization in Unity] <Network Visualization in VR>
CONTRIBUTION 5 [Wrote a project 1 report reviewing in-class activity feedback/VR application evaluation] <Project 1 Final Report & Software evaluation>
CONTRIBUTION N [short description] <ADD LINK>
Total: 74 hours
1/29/23 - 4 Hours
Set up student journal, joined the course Slack, and reviewed and explored the wiki as well as the course homepage
9 Changes to Wiki
Add recent VR hardware devices to the list of VR hardware on the wiki homepage
On the VR development software page, there are #todo items that have not been completed.
Since there is a dedicated in-depth page for VR development software, the "VR development software" page under Home > "I want to decide which VR tools to use for a specific use case" should have a distinct title to avoid confusion regarding the page hierarchy. [Completed] Changed the page title
1 hour to complete
Under "I want to decide which VR tools to use for a specific use case," the Unreal Engine 4 section is empty.
Currently, the VR Development Software comparison page only gives users access to the sections within the page about each software. It would be a better user workflow to also link to each software's own page.
On the WebVR: Overview page, the wiki displays links to WebVR software. Some links lead to the software homepages while others lead to our wiki's review/analysis pages. We can add links to both the wiki pages and the company websites to fix these inconsistencies.
10 hours to complete
There is a page for low-budget VR headsets. However, it would be great to have a page dedicated to comparing low- to high-budget VR headsets, with each headset linking back to its existing page. This would take a while, since comparing currently available headsets can take more than a couple of hours.
It would be great to have a page on Meta's Quest Pro under VR hardware. As the functionality of the Quest Pro increases, users can perform a variety of tasks; the wiki could add critical research and potential use cases for it.
Under the Applications of VR page, VR in Design/Art can be added. Immersive contemporary art keeps growing as VR and AR technologies become readily available to the public. It would be a great addition to the list of applications on our wiki.
Read an article on Meta Quest Pro's mixed reality system
2/1/23 - 2 Hours
Followed Quest 2 Setup Guide, Paperspace setup
Explored Oculus and VR earth
Attended Jakobi's office hour to ask questions regarding Virtual Desktop
2/2/23 - 1.5 hours
Pick 2-5 potential pieces of software to explore and evaluate for your research project
Unreal Engine, Unity3D
Potential Project Ideas
Collaborative 3D architecture model visualization & editing (structure, environment)
Visualizing Twitter community detection results with different factors (that multiple users can view)
Immersive tour of our solar system
Blue whale experience VR – their anatomy and behavior analysis
2/5/23 - 3hrs
Finished in-class Google Earth VR Activities and Objectives (for both VR and Web) + Google Form.
Learned how to format using Google Site for journals
Screenshots and recordings of three landmarks (via VR): 1st: home, 2nd: Metcalf Hall, 3rd: video from Metcalf to Bethel, Maine
Finished downloading DinoVR on the Paperspace machine. Shared the experience with colleagues in CIT for 7-10 min
2/6/23 - 3hrs
Read some background papers
Read "The History of Digital Realities" , "VR in pop culture" pages on the wiki
Read "TiltBrush, Virtual Music, and Spotify for Audio Visualization in AR" page on the wiki
Tilt Brush: paint feature with audio reactive features
Spotify Web API
Spotify's audio analysis data -> audio manipulation feature
Looked at potential collaborative VR software on the wiki
Revisited Previous Student Projects on the wiki
Immersion Analytics, class activity (paired activities: experience, evaluate, and discuss)
Revisited Manhattan Traffic Data in Unity
Used Manhattan traffic data in Unity
FIESTA: A Free Roaming Collaborative Immersive Analytics System: for Team-Based analysis
Free roaming immersive environment to support team-based analysis
Developed in Unity3D
Key: users to author/generate visualizations and place them in space around them
Visualization generated using Immersive Analytics Toolkit (IATK)
IATK supports 3D visualization. However, the authors moved it back to 2D for efficiency (learning time and confusion)
Shared Surfaces and Spaces: Collaborative Data Visualization in a co-located immersive environment
More on the FIESTA system
How users "collaborate when given complete freedom to organize their workspace"
Related Work key points
Physical environment: three types of territories were implicitly created
personal, shared, and storage territories
Users would arrange the contents and analytics egocentrically, in a body-fixed fashion, while working
Linear or circular layout
10 groups of 3 participants, in 2 conditions (A: 2D visualizations; B: virtual table, 3D visualizations)
Presence of 3D visualization
participants use egocentric layouts to organize the data visualizations
2D and 3D -> difference in user workflow and organization
VR collaborative tabletop metaphor: participants saw no benefit in using the table in an environment where visualizations can be placed anywhere.
2/6/23 - 3 hrs
HW due 2/7
MRI Imaging Data
Twitter Community Detection Data
3-5 Project Ideas
Utilizing the Covid travel data from the wiki, the project evaluates software and compares the UX of VR-Viz and other data visualization software.
3 things I'll do during the project
Besides the given Covid dataset, work with Covid case number trends and counts
Visualize given covid travel data in VR with a map
Make the viewing and analyzing experience collaborative (2+ users)
Generate 2D or web data plots as well as a VR demo (sample data) to compare and evaluate the difference.
What defines a good type of dataset for VR/immersive data visualization?
Add the international Covid Travel dataset to the dataset section
Find similar datasets to work with
Comparing and evaluating collaborative 3D architecture modeling (structure, interior) visualization software (visi draft, Unreal Engine, arki, and potentially Gravity Sketch as well)
3 things I'll do during the project
Use and evaluate at least 2 software tools and review how well each can represent spatial elements
Review and read VR user experience research to evaluate egocentric data/workflows as well as the collaborative aspect
Review how these software tools (supporting more refined visualizations as well as UI experiences) could potentially be used to better provide a collaborative data visualization space for scientific research purposes
Students get paired up in groups of two. After seeing a provided sample shape, each group tries to make the same/similar shape using different software options.
Software Analysis and comparisons for VR architecture under VR for architecture section
Generating a 3D visualization of Twitter community detection results depending on selected communities. (Not sure about datasets. )
3 things I'll do during the project
Get or make a community detection algorithm-produced dataset (had a link to a public project but the dataset link seems to be invalid)
Generate a 3D version of the result (if possible, different versions depending on the topics of interest)
Make the dataset viewing and analyzing experience collaborative with the Photon engine (research needed for details)
Using a simple collaborative workspace, grouping and creating Twitter communities collaboratively (this can be as simple as 30 snapshots, potentially evaluating which visualization software suits best)
Addition to the Twitter API/dataset section of the wiki, as well as a page on Twitter community detection algorithms
(Potential project idea: VR version of NASA's Eyes on Earth project) using one of
Software Evaluation Metrics
How intuitive is it to learn and use?
How easy is it to collaborate with another user (editing the same virtual object, communication, space)?
How easy is it to learn the software's advanced features online?
Is it compatible with several operating systems and devices?
How good is the quality of the rendering of the space (frames per second, color saturation, user feedback, etc.)?
2/8/23 - 3.5 hrs
Finished DinoVR activities and survey
Tried DinoVR with different features enabled
Watched a course video on Community detection
Girvan-Newman algorithm, edge betweenness values; an edge with high betweenness is likely a bridge holding communities together.
How does information flow through the network? Social networks: how do people find out about new jobs? Mark Granovetter's work in the 1960s.
Contacts were often acquaintances rather than closer friends
Two perspectives on friendships: 1. structural -- spanning different parts of the network 2. Interpersonal -- can be strong or weak
Strength and structure of the friendships. Information flow: long-range edges allow you to gather information from different parts of the network and get a job. -> access from different communities
Social networks evolve to naturally form communities -> triadic closure = high clustering coefficient
Empirical study by Bearman and Moody: Teenage girls with low clustering coefficient are more likely to contemplate suicide
Suggests the natural tendency and dependency of a person to be deeply embedded in the network/clusters
However, Granovetter's theory was not tested at the time. -> In 2007 this was evaluated based on data covering 20% of an EU country's population
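The Girvan-Newman idea in these notes can be sketched in plain Python: for every pair of nodes, find all shortest paths and credit each edge they pass through; bridge edges between communities accumulate the highest betweenness and are removed first. This is a brute-force illustration for tiny graphs only (the toy graph and node names are made up); real tools use Brandes' much faster algorithm.

```python
from collections import deque
from itertools import combinations

def shortest_paths(adj, s, t):
    """All shortest paths from s to t via breadth-first search over paths."""
    paths, best = [], None
    queue = deque([[s]])
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            break                         # all remaining paths are longer
        node = path[-1]
        if node == t:
            best = len(path)
            paths.append(path)
            continue
        for nxt in adj[node]:
            if nxt not in path:           # simple paths only
                queue.append(path + [nxt])
    return paths

def edge_betweenness(adj):
    """Fraction of pairwise shortest paths passing through each edge."""
    scores = {}
    for s, t in combinations(adj, 2):
        paths = shortest_paths(adj, s, t)
        for p in paths:
            for u, v in zip(p, p[1:]):
                edge = tuple(sorted((u, v)))
                scores[edge] = scores.get(edge, 0) + 1 / len(paths)
    return scores

# Two triangles joined by one bridge (C, D): the bridge carries every
# cross-community shortest path, so it scores highest.
adj = {
    "A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"],
    "D": ["C", "E", "F"], "E": ["D", "F"], "F": ["D", "E"],
}
scores = edge_betweenness(adj)
bridge = max(scores, key=scores.get)
print(bridge)   # -> ('C', 'D')
```

Removing that top-scoring bridge splits the graph into its two triangle communities, which is exactly one round of Girvan-Newman.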
2/11/23 - 4hrs
read articles on Community detection algorithms and networks
Watched tutorials on Gephi and Force Atlas 3d for 3d visualization of the network
Force Directed Graph
Gephi's modularity-based clustering algorithm seems to be the most widely used for community detection on social media accounts and similar datasets
Found a step-by-step tutorial on Sample datasets(2)
Have not found a way to get the resulting file from Gephi into Unity yet.
Photon seems to be the most reliable collaboration tool.
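Gephi's modularity clustering is built around Newman's modularity score Q, which rewards partitions whose communities contain more internal edges than a degree-matched random graph would predict. A minimal pure-Python sketch of Q for an undirected graph (the two-triangle toy graph and both partitions are invented for illustration):

```python
def modularity(adj, communities):
    """Newman modularity Q for an undirected graph and a node -> community map."""
    two_m = sum(len(nbrs) for nbrs in adj.values())   # 2 * number of edges
    q = 0.0
    for i in adj:
        for j in adj:
            if communities[i] != communities[j]:
                continue                              # delta(c_i, c_j) term
            a_ij = 1 if j in adj[i] else 0
            q += a_ij - len(adj[i]) * len(adj[j]) / two_m
    return q / two_m

# Toy graph: two triangles joined by a single bridge edge C-D.
adj = {
    "A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"],
    "D": ["C", "E", "F"], "E": ["D", "F"], "F": ["D", "E"],
}
good = {"A": 0, "B": 0, "C": 0, "D": 1, "E": 1, "F": 1}   # the two triangles
bad = {"A": 0, "B": 1, "C": 0, "D": 1, "E": 0, "F": 1}    # arbitrary split
print(round(modularity(adj, good), 3))   # -> 0.357
print(round(modularity(adj, bad), 3))    # -> -0.214
```

Gephi's Louvain-style optimizer greedily searches for the partition that maximizes this score; here the natural two-triangle split clearly beats an arbitrary one.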
2/12/23 - 3.5hrs
Was stuck finding ways to make Gephi graphs 3D and turn them into a VR experience in Unity
Jakobi redirected me to check out the force-directed approach, which is another way of displaying closeness among nodes
Unity has a physics engine as well as tools to develop force-directed graph representation in 3D
Downloaded Unity Hub
Found some other GitHub repos for 3D force-directed graphs
Changed from Twitter social network community detection to general community detection
Twitter just shut down its public API (now for full access, we need to pay?)
Other datasets were also available
Focusing on compare/contrast between Gephi's 2D community detection software vs. Unity Force-directed graph in VR
In-class activity: compare the community visualization produced by Gephi (2D) vs. a force-directed graph of the network (VR)
Which method visualized the separation of communities (or clusters) better, and why do you think that method did a better job visualizing the individual groups of nodes?
What are the differences between the two methods?
Which method allowed better visualization and understanding of the links between groups or nodes? What aspect of the method do you think enabled such visualization?
Wiki Contributions/deliverables: Tutorials on Gephi and Force Directed Graphs
How to use Gephi and upload data(node and edge files) to develop a community detection network.
Introduction to Modularity and 4 main algorithms in Gephi
This page will go under the Related Technologies Section of the wiki
Tutorial and background summary: what is a force-directed graph and how to make one using Unity.
Users (2) can touch each node and apply force to the graph
Users (2) can be in a synchronized 3D virtual space in which they can interact with, discuss, and analyze patterns of rendered community networks.
Worked on 3 min presentation
2/13/2023 - 3 hrs
Found wiki tutorial for Unity Photon
HW for 2/14
Evaluation of Project Plan
Here is an evaluation rubric for projects:
The proposed project clearly identifies deliverable additions to our VR Software Wiki: 4
The proposed project involves collaboration in VR: 4
The proposed project involves large scientific data visualization along the lines of the "Scientific Data" wiki page and identifies the specific data type and software that it will use: 5
The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class: 3
The proposed project explicitly evaluates VR software, preferably in comparison to related software: 4
The proposed project includes an in-class activity, which can be formative (early in the project) or evaluative (later in the project): 3
The proposed project has resources available with sufficient documentation: 4
First Milestone for Feb. 14th
Generate one small network graph and community network using Gephi
Followed this tutorial
An example dataset is people sending emails in the EU
Data: 1000 nodes, 14,116 edges (1-mode, directed)
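As a sketch of the node and edge files the Gephi tutorial deliverable describes, the snippet below writes the two CSVs in the column layout Gephi's spreadsheet importer recognizes (Id/Label for nodes; Source/Target/Type for edges). The toy names are placeholders, not rows from the real EU email dataset.

```python
import csv

# Hypothetical toy data; a real project would export these from the dataset.
nodes = [("n1", "Alice"), ("n2", "Bob"), ("n3", "Carol")]
edges = [("n1", "n2"), ("n2", "n3")]

# Node table: Gephi matches the "Id" and "Label" column headers.
with open("nodes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Id", "Label"])
    writer.writerows(nodes)

# Edge table: "Source"/"Target" reference node Ids; "Type" marks directedness.
with open("edges.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Source", "Target", "Type"])
    for src, dst in edges:
        writer.writerow([src, dst, "Undirected"])
```

Importing nodes.csv first and then edges.csv through File > Import Spreadsheet reproduces the graph, after which the modularity and layout tools can be run on it.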
2/17/2023 - 4.5 hours
Lost the original Gephi file for some reason
Found some bugs regarding workspace and data management for Mac
Reinstalled Gephi and recreated the project.
Recreated the community detection from dataset one (1,000 nodes and 14,116 edges)
Also explored Gephi functionalities. Created a 2-mode network from a dataset containing 110 nodes and 142 edges
2/22/2023 - 2.5 hrs
Completed installation steps for both Paraview and Ben's Volume Rendering applications
Played with different functionalities of Paraview (section, contour, color edits, etc.)
2/23/2023 - 2hr
Read articles and documentation for Twitter API v2
Researched the claimed 'pros' of the new paid API v2
Added a summary of the 3 main access levels on the Science Data page
2/26/2023 - 5hr
Worked on Unity and a Unity tutorial to create force-directed graphs
A few tutorials I followed caused errors with the script I was running.
Followed this series of YouTube tutorials for setting up the VR app settings.
Debugging and troubleshooting
Created a small example force-directed graph with 5 example nodes
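The force-directed graphs built in Unity rest on a simple simulation loop: edges act as springs pulling neighbors together while every pair of nodes repels. As a language-neutral sketch of that idea (not the Unity scripts from the tutorials; all constants here are arbitrary), a Fruchterman-Reingold-style 2D layout in plain Python looks like:

```python
import math
import random

def force_directed_layout(nodes, edges, steps=300, k=1.0, dt=0.1):
    """Toy 2D force-directed layout: edges act as springs, all pairs repel."""
    random.seed(1)                            # deterministic starting positions
    pos = {n: [random.random(), random.random()] for n in nodes}
    for _ in range(steps):
        force = {n: [0.0, 0.0] for n in nodes}
        for a in nodes:                       # pairwise repulsion (like charges)
            for b in nodes:
                if a == b:
                    continue
                dx = pos[a][0] - pos[b][0]
                dy = pos[a][1] - pos[b][1]
                d = math.hypot(dx, dy) or 1e-9
                rep = k * k / d
                force[a][0] += rep * dx / d
                force[a][1] += rep * dy / d
        for a, b in edges:                    # spring attraction along edges
            dx = pos[b][0] - pos[a][0]
            dy = pos[b][1] - pos[a][1]
            d = math.hypot(dx, dy) or 1e-9
            att = d * d / k
            fx, fy = att * dx / d, att * dy / d
            force[a][0] += fx; force[a][1] += fy
            force[b][0] -= fx; force[b][1] -= fy
        for n in nodes:                       # move nodes; clamp runaway forces
            fx, fy = force[n]
            mag = math.hypot(fx, fy) or 1e-9
            limited = min(mag, 4.0)
            pos[n][0] += dt * fx / mag * limited
            pos[n][1] += dt * fy / mag * limited
    return pos

# 5-node star, mirroring the small example graph built in Unity.
nodes = ["a", "b", "c", "d", "e"]
edges = [("a", "b"), ("a", "c"), ("a", "d"), ("a", "e")]
pos = force_directed_layout(nodes, edges)
```

In Unity the same effect comes from Rigidbody physics: spring joints or scripted attraction along edges plus a repulsion script, with the engine doing the integration step.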
2/27/2023 - 2.5 hrs
Worked on the Project 1 update presentation for Tuesday.
Found a professional VR network visualization tool, Aviar Graph
Reached out to the CEO and got permission for a one-off software demo
Also researched the current state of network visualization software and technologies (both VR and desktop) and found the Neo4j graph database, which Aviar Graph utilizes to generate its 3D visualization of networks.
3/1/2023 - 1 hr
Finished Melvin's in-class activity
In-class activities prep
Had some issues getting SideQuest working with my personal computer.
3/3/2023 - 4hrs
Debugging the graph problem. At the moment, the graph gets generated; however, it disappears into the void.
Tried to work with the collider as well as the mass and gravity settings for the node prefab.
3/5/2023 -9 hrs
Found the bug. The graph generated when I start the application (from the script) collides with the plane in the scene, causing the nodes and the graph to "bounce" off and disappear at high speed.
Worked with the gravity and graph script settings.
The larger dataset was getting buggy. For the in-class activity on 3/7, decided to work with the dataset from Les Misérables.
Generated a Gephi comparison graph on the same dataset
Generated a 3D VR force-directed graph
Worked on the in-class handout
Generated step-by-step tutorials for generating bigger network community/cluster graphs using Gephi
3/6/2023 - 2hrs
Worked on the in-class activities handout.
Tried out HackMD for the first time (originally tried to make a handout with a Google Doc; however, I saw a HackMD handout and decided to learn how to use it and make one for my in-class activity).
My in-class activity handout can be found here.
The dataset and the apk file I used for the in-class activity can be found here.
3/8/2023 -2.5 hrs
Tried generating a larger graph with the new settings, and the graph seems a bit more stable.
However, the force-directed graph is always moving, in contrast to Gephi's algorithm, which finds stable positions for the nodes
Reflection on the VR software with bigger graphs
I have been trying to implement scaling of the environment (mainly the graph) using the two controllers (similar to what we did during the in-class activities on other VR software). However, some of the scripts I found on the internet have not been working.
One of the examples I found was this script.
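The contrast between the always-moving physics graph in Unity and Gephi's settling layout comes down to damping: a physics engine roughly conserves the motion it creates, while layout algorithms drain energy on every step (a cooling schedule) so node positions converge. A hypothetical one-dimensional sketch of the difference, with a single node on a spring:

```python
def simulate(damping, steps=500, dt=0.05):
    """One node on a spring toward its rest position, with optional damping."""
    x, v = 1.0, 0.0
    for _ in range(steps):
        v += dt * (-x)          # spring force pulls toward x = 0
        v *= damping            # 1.0 = pure physics; < 1.0 drains energy away
        x += dt * v
    return x * x + v * v        # remaining "energy" of the motion

print(simulate(1.0) > 0.5)      # undamped: still oscillating   -> True
print(simulate(0.9) < 1e-6)     # damped: the motion has settled -> True
```

In Unity terms, raising the Rigidbody drag (or scripting a per-frame velocity multiplier below 1) plays the role of the damping factor and should let the graph settle the way Gephi's layout does.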
3/11/2023 - 2.5hrs
Worked on deliverables - the Gephi wiki page as well as the Unity page
Tried to make the nodes responsive to the number of connected nodes. Encountered some errors; making progress.
3/15/2023 - 2hrs
Worked on the Project 1 Reflection wiki page content
Made the Project 1 presentation
Wrote a self-evaluation of my progress
3/19, 20 / 2023 - 2hrs
Fixed some visualizations for the FDG network visualization. Played with gravity and mass since I could not prevent the graph from flying away.
Enabling gravity with a light mass ended up working well
Worked on the revised self-evaluation
3/20/2023 - 4hrs
Updated Project Final Report page
Read HW paper
Worked on Project 2 ideation
3/21/2023 - 2hrs
Understanding environments and work practices (UWP)
Understanding the work, analysis, or information-processing practices of a given group of people, with or without software in use.
Evaluating visual data analysis and reasoning (VDAR)
Assess a visualization tool's ability to support visual analysis and reasoning about data.
Evaluating communication through visualization (CTV)
The goal or purpose is to convey a message to one or more persons, in contrast to targeting focused data exploration or discovery. Effectiveness is usually measured in terms of how effectively such a message is delivered and acquired.
Evaluating collaborative data analysis (CDA)
Evaluations in this group study how an information visualization tool supports collaborative analysis and/or collaborative decision-making processes. Collaborative systems should support both taskwork, the actions required to complete the task, and teamwork, the actions required to complete the task as a group
Evaluating user performance (UP)
Evaluations in the UP group study if and how specific features affect objectively measurable user performance.
Evaluating user experience (UE)
Evaluation of user experience seeks to understand how people react to a visualization either in a short or a long time span.
Evaluating visualization algorithms (VA)
Evaluations in the VA group study the performance and quality of visualization algorithms by judging the generated output quantitatively. A visualization algorithm is broadly defined as a procedure that optimizes the visual display of information according to a given visualization goal.
Started Ideation for second/ final project
Evaluating collaborative workspaces for architecture 3D space modeling
Seven Scenarios in terms of Project 2 idea 1
What is the context of use of visualizations?
A 3D rendering of an architectural model is sometimes not enough to give a sense of scale and perspective.
How does it support processes aimed at seeking information, searching, filtering, and reading and extracting information?
The VR visualization of the architecture project makes extracting spatial information easier, with realistic analysis of light, reflection, and potential material choices.
Do people learn better and/or faster using the visualization tool?
Having a VR space where people/stakeholders can experience the space, instead of 3D renderings/floor plans, will make the process of understanding the space faster.
Does the tool support effective and efficient collaborative data analysis?
If some of the existing software provides a multiplayer feature, yes.
How does one visualization or interaction technique compare to another as measured by human performance?
A comparison of overall structural understanding using a floor plan vs. the VR experience can be done
Is the tool understandable and can it be learned?
Intuitive architecture-rendering VR software should be able to provide an easy way to navigate through the space.
Project 2 idea 2
Visualizing public transportation volume before and after lockdown (major cities).
More research needed.
3/22/2023 - 3.5hr
Improved wiki contributions -- fixed links, page locations, and page titles, based on feedback from the midterm review.
Worked on ideation for Project 2. Looked at several project pages as well as online resources for potential APIs, software, and other tools.
Project 2 Idea: Collaborative VR Workspace User Experience Analysis
Project Name: Collaborative VR Workspace User Experience Analysis
Motivation: Although there are pages on Collaborative VR software and Collaborative workspaces in the wiki, there is no analysis of VR collaboration user experience, nor is there any comparison of collaboration using the Seven Scenario Analysis.
Project Description: Evaluate the current state of Collaborative VR spaces for science and other applications. Potentially conduct in-depth research on collaborative communication and interaction experiences in VR space and propose new techniques to improve the collaborative VR experience.
Similar format to Aakansha Mathur's page.
Collaboration User Experience Wiki Page
Analysis of VR collaboration and factors that influence the quality of collaborative workflows.
Collaborative VR Visualization Page
Collection of tutorials for developing collaborative VR projects.
Collaborative VR App and Software Hub Page
Analysis of existing pages and reorganization to highlight collaboration in VR.
Collaborative User Experience Analysis for Art/Other Areas
Additional Section for VR Collaboration Course Homepage.
April 4: Reorganize previous collaborative VR projects and learn how they implemented collaborative aspects of their projects. Read research papers on collaboration in VR and the factors that affect their quality. Apply Seven Scenarios to the projects.
April 6: Read at least two research papers on collaboration in VR and research collaborative environment applications in VR.
April 11: Buffer and analyze collaborations in scientific data visualization.
April 13: Work on the Collaborative VR Visualization Page and scientific visualization software.
April 18: Buffer and compare and analyze. Finish the Collaborative VR Visualization Page and Collaborative VR App/Software Hub Page.
April 20: Start working on the Collaboration User Experience Analysis.
April 25: Incorporate Quest Pro experience if possible, and research VR vs. AR.
April 27: Buffer and finish setting up analysis metrics and user experience research.
May 2: Try out and analyze collaborative software for art and workspaces.
May 4: Finish up the wiki contributions and finalize the analysis report.