Syllabus
In a collaborative group effort, this course will search out, install, test, and critically evaluate VR software that supports data visualization for researchers. We will target several specific types of data, including volumetric data and remote sensing data. We will investigate the capabilities of software for head-mounted displays (HMDs), big-metal displays such as CAVEs and the YURT, and, as a baseline, desktop displays. Software evaluation will include web research, hands-on case studies, and surveying. Results will be documented in a course wiki.
After this course, students will be able to:
Articulate software tool goals, requirements, and capabilities
Construct meaningful evaluation strategies for software libraries, frameworks, and applications; strategies include surveys, interviews, comparative use, case studies, and web research
Execute tool evaluation strategies
Comparatively analyze software tools based on evaluation
Visualize some scientific data in VR
Be familiar with a number of VR software tools
We will begin the semester by taking stock of candidates for software and hardware. Understanding and codifying their claimed capabilities will guide the choice of a subset for closer study. Each student will pick a tool from this subset and design a tutorial that others in the class will subsequently work through. Through both the design and execution of the tutorials, we will gather deeper knowledge of the benefits and costs of the tools. In addition to hands-on evaluation, we will collectively create a survey to circulate on the web to gather evaluative information from VR users, software developers, and hardware developers. At each stage we will document our findings and analysis in a wiki. One goal for the wiki is to help VR developers choose wisely when creating their virtual realities. A second is to identify gaps in available software and thus to nudge the development of future software to fill those gaps. The wiki is live with results from a related 2018 class.
Grading in the class will be as follows:
15% class attendance and participation
10% initial search contributions to wiki
15% project design, creation, data collection, and analysis contributions
20% tutorial quality
10% tutorial results analysis
15% journal of activities and findings
15% overall final wiki contributions
Over 14 weeks, students will spend 3 hours per week in class (42 hours total) and an average of 10 hours per week on homework as described above (140 hours total).