Spring 2023, Dave Song
<Abstract>
The focus of this study was to examine interfaces that allow users to move between AR and VR for “mixed-space collaboration”. Participants were asked to perform a navigational task, which was evaluated afterwards. A second experiment focused more on the interaction and visualization space in the mixed collaboration.
→ with the goal of providing results that will foster “better understanding of how to design interfaces for multi-space and transitional collaboration”
<Introduction>
The Magic Book project introduced the idea of a “transitional augmented reality interface”. Based on the project, the device can provide an AR exocentric viewpoint as well as an egocentric view. One of the examples mentioned in the section was two researchers using the AR viewpoint to view a molecular structure superimposed on the physical environment. Both researchers can also immerse themselves in the structure by switching to VR.
An interest survey and feedback on this transitional concept had already been collected; however, a quantitative evaluation was still needed. Hence, this paper tries to evaluate the collaborative potential of the transitional XR concept. The authors refer to this process as “mixed-space collaboration”.
<Personal Goal>
I was interested in learning and exploring more about mixed reality collaboration systems, since there have been many discrepancies between the real world and the VR environment when it comes to collaboration. My personal ideation and thoughts were pointing towards a combination of AR and VR to address the lack of “realism” of these VR systems while still allowing viewers to see the projected 3D renderings and models.
<Key Terms/ideas from related work>
Users’ viewpoint: frame of reference in 3D applications.
VR: value of a body/user-oriented navigation map for a wayfinding task
exocentric and egocentric control frames of reference
egocentric control frame: superior performance, and an egocentric viewpoint seems to offer a better spatial representation. However, it is less efficient in terms of task completion time.
Impact of frame of reference (FOR) in scientific learning
<Research Conceptual Model>
focus: mixed collaborative setup.
several points examining the implications of collaboration between different spaces, with the results presented so that a large number of applications can use them to further develop VR/AR education, training, and collaboration frameworks.
Transitional Interface and its conceptual model
Four Areas of the Transitional Space
Actual (Real)
Actual-Virtual continuum
AR
Virtual
5 Main Collaboration Types:
Standard Collaboration: collaboration in one environment
Multi-perspective Collaboration: involving multiple viewpoints
Multi-space Collaboration: simultaneous transition of the users
Mixed-space Collaboration: users can work separately in different environments
Transitional Collaboration: the general case; a user can independently switch between environments
→ focus type: Mixed-space collaboration for this paper
goal of evaluating whether the combination of AR and VR viewpoints can be an efficient way to collaborate for a specific task.
= the general comparison was done against a VR-only scenario
<Evaluation>
performance measures: task outcome
subjective measures: user opinion
process measures
<Experiment Design>
Task 1: a wayfinding task in a maze where VR-immersed participants were verbally guided by another participant, depending on the group
Task 1 Groups:
No collaboration: the immersed user performs the task alone
VR collaboration: a VR egocentric user guided by a VR exocentric user
Mixed-space collaboration: two users collaborating from different spaces; the first user has a VR egocentric view, the second user an AR exocentric viewpoint of the scene
Task 2: similar to the first task; however, the exocentric user needs to guide the egocentric user to four specific locations in the maze before exiting.
The intent was to require more 3D viewing of the maze.
<Metrics>
performance measures: time of completion
length of path traveled by the user
path error: passing through the same location multiple times
head movement and velocity of the user (a rough sketch of computing these metrics from logged data follows this list)
level of communication, analyzed through the audio and recorded video
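Not from the paper, just a rough illustration of how these performance measures could be computed from logged data. This is a minimal sketch assuming head positions are logged with timestamps, and "path error" is approximated by counting revisited grid cells; the function and parameter names are hypothetical.

import numpy as np

def path_metrics(positions, timestamps, cell_size=1.0):
    """Rough path metrics from logged head positions (hypothetical log format).

    positions: (N, 3) array of head positions in meters
    timestamps: (N,) array of sample times in seconds
    cell_size: grid resolution used to detect revisited locations
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)

    # Path length: sum of distances between consecutive samples.
    step_lengths = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    path_length = float(step_lengths.sum())

    # Path-error proxy: count ground-plane grid cells entered more than once,
    # i.e. the user passed through the same location multiple times.
    cells = np.floor(positions[:, [0, 2]] / cell_size).astype(int)
    visited, revisits, prev = set(), 0, None
    for cell in map(tuple, cells):
        if cell != prev:              # count cell entries, not dwelling in place
            if cell in visited:
                revisits += 1
            visited.add(cell)
            prev = cell

    # Average head velocity (m/s) over the trial.
    dt = np.diff(timestamps)
    mean_velocity = float(np.nanmean(step_lengths / np.where(dt > 0, dt, np.nan)))

    return {"completion_time": float(timestamps[-1] - timestamps[0]),
            "path_length": path_length,
            "revisits": revisits,
            "mean_head_velocity": mean_velocity}

In practice, completion time and path length would be compared across the three collaboration conditions; the grid-cell revisit count is just one plausible way to operationalize the "path error" measure described above.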
<Subjective evaluation>
survey questions
awareness
ease of communication and task realization
<Key Findings>
High interest in handheld displays during co-located collaborations
The type and dimensionality of the task influence which collaborative environment maximizes efficiency; for tasks without 3D spatial navigation or manipulation:
standard VR with a desktop = efficient
Potential use of AR for co-located collaboration to increase interaction with physical objects and information sources
For user awareness, the standard VR avatar representation “has been sufficient to support the mixed-collaboration”
Mixed-space or multi-perspective collaboration does not significantly affect the efficiency of task performance compared to standard VR collaboration (however, the VR collaboration was designed to present information visualization similar to the exocentric view; both conditions had an exocentric view of the maze, just in VR vs. AR).
<Survey Questionnaire>
based on a Likert scale with 3 main parts: collaboration, task, and awareness (a small aggregation sketch follows the questions below).
I felt the collaboration session overall went great. We had no problems and did not struggle to complete the task.
I could easily understand my partner’s ideas.
I could easily communicate my ideas.
The task was easy to complete.
The task required little effort.
I did not have to concentrate very hard to do the task.
How often did you know where your partner was located?
During the trial, how often did you know what your partner could see?
During the trial, how often did you know what your partner was directly looking at?
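Not from the paper: a minimal sketch of how the nine items above could be grouped into the three questionnaire parts and averaged per participant. The item-to-subscale mapping and the 1-7 rating range are assumptions for illustration only.

from statistics import mean

# Hypothetical mapping of the nine items to the three questionnaire parts.
SUBSCALES = {
    "collaboration": [0, 1, 2],   # session quality, understanding partner, communicating ideas
    "task":          [3, 4, 5],   # ease, effort, concentration
    "awareness":     [6, 7, 8],   # partner location, view, gaze target
}

def subscale_scores(responses):
    """responses: list of 9 Likert ratings (e.g. 1-7) for one participant."""
    return {name: mean(responses[i] for i in items)
            for name, items in SUBSCALES.items()}

# Example: one made-up participant's ratings.
print(subscale_scores([6, 7, 5, 4, 4, 3, 6, 5, 5]))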
<Key Takeaways>
Notion of Transitional Collaboration and Experiment model to evaluate the performance of the proposed collaboration model
Different collaboration types (interesting to see how they explored different types of collaboration depending on different combinations of the systems they are using)
Evaluation process: task completion time, error rate, video and audio recordings to track head movement and other behaviors, as well as a subjective survey.
AR and VR collaboration and the notion of “mixed-collaboration”
Especially with the new Quest Pro, which provides higher-quality passthrough, I wonder what different experiments and research could be done with it (especially since the paper points out that the quality and efficiency of transitional mixed collaboration depends on the quality of the HMD).