Project 2 Proposal

Shreya D'Souza

1) Point cloud data

  • Using Open3D to look at ecology data from the Ecology and Evolutionary Biology (EEB) department


Deliverables:

    • Open3D tutorial

    • How to interpret point cloud data

    • Working with .las files (a minimal loading sketch follows below)
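
Since Open3D does not read .las files directly, one likely path is converting through the laspy package. Below is a minimal sketch, assuming the laspy 2.x API; the filename is a placeholder, not the actual EEB file:

    # Minimal sketch: load a .las file with laspy and view it in Open3D.
    # Assumes: pip install open3d laspy ("eeb_scan.las" is a placeholder name).
    import numpy as np
    import laspy
    import open3d as o3d

    las = laspy.read("eeb_scan.las")              # laspy 2.x API
    points = np.vstack((las.x, las.y, las.z)).T   # scaled x/y/z coordinates

    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points)
    o3d.visualization.draw_geometries([pcd])      # interactive desktop viewer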

YURT Functionality

    • Porting from Open3D to the YURT

    • Head-tracking would be ideal given the data type

Data

    • EEB Data from the Data, Examples and Collaborators Page

      • The "more info" page does not show anything, so I need to find out what exactly this data depicts

Class activity

    • Using Open3D, or using a LIDAR point cloud viewer as shown above

  • Schedule (to be developed further)

    • Week 11:

      • Work on installing Open3D (I was running into issues)

    • Week 12:

      • Contact collaborators to find out more about the data; work on viewing the data in Open3D

    • Week 13:

      • Work on class activity

      • Viewing in Open3D should be completed; start working on porting to the YURT

    • Week 14:

      • Porting to the YURT, work on deliverables

    • Week 15:

      • Same as above; final presentation due

Resources

    • Brown professors who sourced the data

    • Documentation already on wiki

2) Extend my first project (preferred, but I'm not sure how to flesh this out)

  • I previously discussed looking at survival data for the different patients. I could use this to create a display of all the brains sorted by days of survival, so that we could see how the size of the tumour is (probably) inversely correlated with survival time.

  • Another thing I could try is annotating the models in Paraview with biological facts, patient data, segmentation data, etc. (see the annotation sketch after this list).


  • Deliverables: Paraview tutorial

  • YURT Functionality: Implement head tracking, get annotations to show up, look at multiple models at the same time

  • Data: BraTS dataset

  • Class activity: Paraview tutorial

  • Schedule:
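
As a starting point for the annotation idea above, here is a hedged sketch of scripted annotation through Paraview's Python interface (pvpython). The file path, label text, and survival value are hypothetical placeholders, not actual BraTS data:

    # Sketch: load a model and overlay a text annotation in Paraview (pvpython).
    # The filename and patient values below are hypothetical placeholders.
    from paraview.simple import OpenDataFile, Text, Show, Render

    model = OpenDataFile("brats_patient_001.vtk")  # placeholder path
    Show(model)

    label = Text()
    label.Text = "Patient 001 - survival: 312 days"  # hypothetical metadata
    Show(label)

    Render()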

(Evaluation by Loudon Cohen)

o The proposed project clearly identifies deliverable additions to our VR Software Wiki (5)

o The proposed project involves previously unavailable Yurt data visualization functionality (5)

Yes, the project would utilize Open3D to visualize ecology data in a new way, or visualize more BraTS data in Paraview.

o The proposed project involves large data visualization along the lines of the "Data Types" wiki page and identifies the specific data and software that it will use (5)

See above.

o The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class (5)

Proposal one's schedule looks good!

o The proposed project includes an in-class activity (5)

Already done!

o The proposed project has resources available with sufficient documentation (5)

Project 2 Evaluation by Brandon 3/30/20

o The proposed project clearly identifies deliverable additions to our VR Software Wiki - Strongly agree

o The proposed project involves previously unavailable Yurt data visualization functionality - Strongly agree

o The proposed project involves large data visualization along the lines of the "Data Types" wiki page and identifies the specific data and software that it will use - Strongly agree

o The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class - Strongly agree

o The proposed project includes an in-class activity - Strongly agree (Already completed)

o The proposed project has resources available with sufficient documentation - Strongly agree


Project 2 eval by David 3/31/20

I fear that extending your first project may not be sufficiently specific or clear to move forward on in the remaining time. On the other hand, I like it as a use of paraview and a way of getting more data going in the yurt. Spencer has been working on paraview+yurt, so that might help. And another student, Ella (not in the class, but on the slack), is also pushing on paraview+yurt. With three folks pushing on it (including you :-), things like the head tracking might come up to speed and make it sufficiently usable without it all being on your plate.

The lidar problem sounds interesting and perhaps more specific. I am not sure how it would get into the yurt, unless there is some way to have it export data in paraview/vtk format? Ross worked with this data last year and got it up and running in the yurt using the "DinoYurt" software. You should probably try to coordinate with him to figure out how you might best move forward, both conceptually and concretely. You could also check in with Brandon who is working on this kind of data for his PhD research. At this point I think Ross probably knows more, but Brandon might have some thoughts.

Nice work!


Review of Project 2 draft by Ross 03/31/20

I agree with David that you would need to add some additional components to your proposal if you wanted to extend your first project. Off the top of my head, there are a couple of paths you could take. If you wanted to keep working with the BraTS dataset, you could try visualizing some of the machine learning aspects of the dataset; in particular, since the BraTS dataset is often used to train convolutional neural networks for brain tumor segmentation, you could try visualizing some of the convolutional filters in these models to, possibly, gain some insight into how these models are making decisions. If you wanted to continue working with brain imaging data, you could try visualizing another brain imaging dataset (e.g. fly connectomes) or another brain imaging modality. However, if you do pursue this path, you would need to flesh out your proposal quite quickly.

Conversely, if you were to pursue the pointcloud visualization route, I would recommend using DinoYURT for your YURT visualizations; while Open3D is a nice library for processing and visualizing pointcloud data, modifying it to work in the YURT would be a herculean effort and likely produce results comparable to DinoYURT. In terms of working with the ecology pointcloud data, there are a couple of different routes you could take. First, one major problem with visualizing pointcloud data in the YURT is that you can only visualize so many points at once (~500k points) using DinoYURT. Hence, visualizing any pointcloud typically involves subsampling. So, you could explore and implement some pointcloud subsampling techniques; this is what I worked on when I took this class. Another approach is to visualize the pointcloud data in bursts; in particular, instead of visualizing a pointcloud all at once (i.e. showing all of the points), visualize the pointcloud in the way it was captured, as bursts of laser beams that propagate through the medium it is capturing. This approach is definitely harder but would be really cool!
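
For the subsampling route, a rough Open3D sketch of trimming a cloud toward DinoYURT's ~500k-point budget might look like the following; the filenames and voxel size are illustrative assumptions to tune per dataset, and random_down_sample assumes Open3D >= 0.12:

    # Sketch: subsample a point cloud toward a ~500k-point budget before export.
    # Filenames and voxel size are illustrative placeholders.
    import open3d as o3d

    pcd = o3d.io.read_point_cloud("eeb_scan.ply")

    target = 500_000
    if len(pcd.points) > target:
        # Voxel downsampling preserves spatial structure
        pcd = pcd.voxel_down_sample(voxel_size=0.1)
        # If still over budget, random downsampling brings the count
        # down to (approximately) the target ratio
        if len(pcd.points) > target:
            pcd = pcd.random_down_sample(target / len(pcd.points))

    o3d.io.write_point_cloud("eeb_scan_small.ply", pcd)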

Good work!