Title: Visualizing CLARITY and Diffusion MRI data
Activities:
Download and process CLARITY image dataset.
Find new diffusion MRI datasets.
Add information about CLARITY and diffusion MRI datasets to the wiki.
Identify and compare software for visualizing these formats.
Learn how to convert CLARITY imaging data into volume renders.
Install Unity and download necessary software to support VR visualization in Unity.
Convert diffusion MRI data to volume renders and visualize in HTC Vive.
Investigate porting CLARITY imaging renders to the YURT.
Compare Paraview, MinVR, and VTK.
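One possible starting point for the conversion activity above: a CLARITY stack can be flattened into a raw binary volume, which Paraview can import through its raw (binary) image reader by entering the dimensions and scalar type in the import dialog. A minimal sketch, assuming float32 intensities and using a tiny synthetic volume in place of real CLARITY data:

```python
import struct

def write_raw_volume(values, dims, path):
    """Write a flat list of voxel intensities as little-endian float32
    raw binary -- a format Paraview can import with its raw (binary)
    reader, supplying dims and scalar type at import time."""
    nx, ny, nz = dims
    assert len(values) == nx * ny * nz, "value count must match dims"
    with open(path, "wb") as f:
        f.write(struct.pack("<%df" % len(values), *values))

# Tiny 2x2x2 synthetic volume standing in for a CLARITY stack.
write_raw_volume([0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7], (2, 2, 2), "clarity.raw")
```

The file name, dimensions, and values here are placeholders; the real pipeline would read the CLARITY dataset's actual voxel grid before writing.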
Milestones:
2/12: Convert CLARITY imaging data to a usable format and render data in Paraview.
2/14: Add volume renders to Unity and build a SteamVR app to visualize data in the HTC Vive.
2/21: Finish VR visualization and get user feedback from fellow classmates.
Deliverables:
New wiki entries on CLARITY imaging and diffusion MRI imaging.
An app that displays a high-resolution volume rendering of CLARITY image data.
A survey on people's reactions to CLARITY imaging.
Comparisons of different neuroscience imaging software in terms of speed and ease of use.
Title: Visualizing Ecology LiDAR data in Virtual Reality and YURTs
Activities:
Examine the Kellner Lab's data and determine possible visualization methods.
Compare and contrast various LiDAR data processing tools and libraries in terms of usability, installation time, and rendering quality.
Explore different mesh rendering techniques for visualizing ecology data.
Document the process of rendering LiDAR data in VR and, if possible, add a tutorial to the wiki.
Explore methods for mapping LiDAR data into a continuous visualization; in other words, determine how to stitch LiDAR data into a cohesive visualization.
Create a tutorial for MinVR and DinoYURT.
Milestones:
2/12/2019, Research Kellner Lab.
2/14/2019, Convert LAS files to a format readable by DinoYURT.
2/21/2019, Finalize the data conversion process, if necessary, and begin the HTC Vive visualization.
2/26/2019, Finish the HTC Vive visualization.
2/28/2019, Visualize preliminary ecology data in YURT.
3/5/2019, Explore new LiDAR ecology samples from dataset.
3/7/2019, Render new ecology samples in YURT.
3/12/2019, Research algorithms for stitching ecology samples together.
3/14/2019, Continue exploring algorithms for stitching samples together.
3/19/2019, Implement a simple algorithm for stitching N ecology samples.
3/21/2019, Render stitched images in the YURT.
4/2/2019, Publish comparison between MinVR and Paraview for rendering LiDAR data.
Deliverables:
A VR visualization of LiDAR ecology data using the HTC Vive.
Comparisons of various LiDAR visualization and processing software.
Tutorials on MinVR and developing software for the YURT.
In-Class Activities:
Exploring volumetric rendering with LiDAR data.
Develop a short activity exploring techniques for rendering and visualizing LiDAR data, most likely using Unity.
Title: Visualizing Ecological LiDAR data in Virtual Reality and YURTs
Objective: Document and explore how MinVR and Paraview can be used to visualize composed ecological LiDAR data.
Remark: A composed LiDAR model is several .las files stitched together based on a metric (e.g., GPS time).
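The composition described in the remark above can be prototyped without any LiDAR-specific machinery. A sketch of the core merge step, assuming each cloud has first been reduced to (gps_time, x, y, z) tuples; with laspy that reduction would be roughly `las = laspy.read(path)` followed by zipping `las.gps_time` with `las.x`/`las.y`/`las.z` (an assumption to verify against the laspy docs):

```python
def compose_by_gps_time(clouds):
    """Stitch several point clouds into one, ordered by GPS time.
    Each cloud is a list of (gps_time, x, y, z) tuples; the result
    is a single list sorted on the first field."""
    merged = [pt for cloud in clouds for pt in cloud]
    merged.sort(key=lambda pt: pt[0])  # GPS time is the stitching metric
    return merged

# Two tiny synthetic "tiles" whose points interleave in time.
tile_a = [(1.0, 0, 0, 5), (3.0, 1, 0, 6)]
tile_b = [(2.0, 0, 1, 4)]
composed = compose_by_gps_time([tile_a, tile_b])
```

A real composition step would likely also need to reconcile coordinate offsets between tiles; this sketch only handles the ordering metric.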
Questions:
How can LiDAR data be meaningfully composed? In other words, what metrics can we use to perform this task?
What software is available for visualizing LiDAR files?
What are the distinctions between MinVR and Paraview for visualizing LiDAR data in VR / YURTs?
What scientific insights can be derived from visualizing ecology data in VR / YURTs?
Activities:
Examine the Kellner Lab's data and determine possible visualization methods.
Compare and contrast various LiDAR data processing tools and libraries in terms of usability, installation time, and rendering quality.
Explore different mesh rendering techniques for visualizing ecology data.
Document the process of rendering LiDAR data in VR and, if possible, add a tutorial to the wiki.
Explore methods for mapping LiDAR data into a continuous visualization; in other words, determine how to stitch LiDAR data into a cohesive visualization.
Add pixel-wise coloring to LiDAR data based on point height and other metrics.
Determine correct normalization factors for ecology data.
Learn more about LiDAR data in order to motivate composing multiple LiDAR files.
Read MinVR documentation and review OpenGL.
Install Paraview LiDAR plugin.
Install MinVR on graphics lab computer.
Read papers on LiDAR stitching.
Create support code for in-class activity.
Download new LiDAR samples from repository.
Create a tutorial on utilizing Laspy for .las data processing.
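The height-based coloring activity above could start from a simple linear color ramp. A sketch, assuming a blue-to-green-to-red mapping over the cloud's elevation range; with laspy, `z_min`/`z_max` would come from `las.z.min()` / `las.z.max()` (again an assumption to check against the laspy docs):

```python
def height_to_rgb(z, z_min, z_max):
    """Map a point's height to an RGB triple: blue at the lowest
    elevation, green mid-range, red at the highest (linear ramp)."""
    t = 0.0 if z_max == z_min else (z - z_min) / (z_max - z_min)
    return (int(255 * t),                     # red grows with height
            int(255 * (1 - abs(2 * t - 1))),  # green peaks mid-range
            int(255 * (1 - t)))               # blue fades with height

# Color three sample heights across a 0-10 m range.
colors = [height_to_rgb(z, 0.0, 10.0) for z in (0.0, 5.0, 10.0)]
```

Other metrics mentioned in the activity (intensity, return number, GPS time) would slot in the same way by swapping the normalized quantity.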
Milestones:
2/12/2019, Research Kellner Lab.
2/14/2019, Convert LAS files to a format readable by DinoYURT.
2/21/2019, Finalize the data conversion process, if necessary, and begin the YURT visualization of the first three LiDAR models.
2/26/2019, Finish the YURT visualization of the first three LiDAR models.
2/28/2019, Download and visualize a new batch of LiDAR ecology models (at least N=5).
3/5/2019, Visualize all of the previous LiDAR models in VR (HTC Vive).
3/7/2019, Create a tutorial for visualizing .out files in the YURT using MinVR.
3/12/2019, Create a composite LiDAR model, composed of three separate .las files, and visualize it in the YURT.
3/14/2019, Create a tutorial on composing/stitching .las files and visualizing the results using Paraview / MinVR.
3/19/2019, Create a script for composing N .las files based on GPS time.
3/21/2019, Compose N > 10 .las files and visualize the results in the YURT.
4/2/2019, Publish an in-depth comparison between MinVR and Paraview for rendering and visualizing LiDAR data in the YURT.
Deliverables:
A VR visualization of LiDAR ecology data using the HTC Vive and tutorials to accompany this process.
Comparisons of Paraview, Lidarview, and MinVR for visualizing and rendering LiDAR data.
Scripts / tutorials on composing .las files, both arbitrarily and based on GPS time.
A YURT visualization of composed LiDAR data.
In-Class Activities:
.las file conversion and visualization:
Everyone downloads and converts one .las file from the repository to a .out file; we then display the results in the YURT.
This will visualize novel .las files, test the robustness of my data conversion pipeline, and provide a comparison between Paraview and MinVR for visualizing LiDAR data.
This might be an interesting collaboration opportunity with Ronald; he could provide a Paraview tutorial, and I would provide a MinVR tutorial.
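The conversion step in this activity could be scripted. A sketch, under a loudly-labeled assumption: I am guessing here that the .out files DinoYURT reads are plain text with one "x y z" point per line, which should be confirmed against the actual DinoYURT format before use. The .las side would be read with laspy (`las = laspy.read(path)`, then `las.x`, `las.y`, `las.z`):

```python
def points_to_out(points):
    """Format (x, y, z) points as plain 'x y z' text lines.
    ASSUMPTION: DinoYURT's .out format is whitespace-separated
    x y z per line; confirm against the real DinoYURT spec."""
    return "\n".join("%.3f %.3f %.3f" % (x, y, z) for x, y, z in points)

# Two placeholder points standing in for a converted .las tile.
out_text = points_to_out([(1.0, 2.0, 3.0), (4.5, 5.5, 6.5)])
```

If the format guess holds, the in-class activity reduces to running this over each participant's .las file and writing the string to disk.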