ALL OTHER WIKI CONTRIBUTIONS
CONTRIBUTION GROUP A : VR for NeRF
CONTRIBUTION GROUP B: Reading in VR
Total: 213 hours
total: 3 hours
Nine Separate Changes
Takes 10 minutes:
1.The "HelloWorld Unity tutorial" cannot be found; the old link no longer works. (fixed the link)
2.The "Intro to WebVR" link on the homepage does not respond when clicked
3.We could highlight the 1 or 2 best VR hardware and software options. Normal users would just go find the most popular ones to use.
Takes 1 hour:
1.Make a chart to compare the pros and cons of each software
2.Visualize last year's sales numbers for each VR hardware (we probably want to buy the most popular ones)
3.In the Unity pages, there is "metrics, Accessibility: The estimated time for someone to create Hello World in VR". I don't think this is a good comparison: after some learning, I can create Hello World very quickly while still knowing nothing about Unity....
Takes 10 hours:
1.Find the most useful packages for Unity. Some of them (e.g. Lux Water, which I tried) seem unusable right now(?). I hope we can suggest useful (and cheap) packages that newcomers can use in Unity.
2.A list of useful shortcuts for Unity (organized by platform)
3.Some introduction to Unity's learning curve / expected learning time / learning experience would be helpful
total: 6 hours
Quest 2 Setup:
Finished Quest 2 setup, logged in to the Paperspace virtual machine, and played Google Earth VR (found my home back in Beijing on it)
Read Past Projects:
1.Shreya D'Souza's project on visualising brain tumour progression in response to chemotherapy
2.Beatrice's project on VR tools to aid Historical Artifact Archiving
3.Paul Molnar's project on Underwater 3D Cave Exploration
Read Research Projects:
1.CoVAR: a collaborative virtual and augmented reality system for remote collaboration
2.Multi-User Framework for Collaboration and Co-Creation in Virtual Reality
Both are very popular 3D engines (for game development). However, Unreal Engine is much harder to use, though it produces better graphics. Furthermore, having tried the tutorials for both Unity and Unreal, I think Unity is easier for a newcomer to learn and to quickly start projects with. I would probably use Unity for my projects.
1.Masterpiece layers: many great artworks have multiple layers. I want to separate them one by one to see the effects.
2.Visualize population density change in Manhattan with respect to time.
3.Visualize the change of RGB color in 3D space. For example, the origin represents RGB(0,0,0), and the opposite top corner represents RGB(255,255,255).
4.Implement NeRF (Neural Radiance Fields) in VR. One user can input a set of images, and our system would synthesize and recreate the 3D model in VR.
total: 6 hours
Installed DinoVR and played for 20 mins. (Waiting for class activity next Tuesday)
Solidify Project Ideas:
idea: Implement NeRF (Neural Radiance Fields) in VR. One user can input a set of images, and our system would synthesize and recreate the 3D model in VR so that the other user can play with it
Things to do: 1)Implement the NeRF paper 2)Connect the NeRF output with VR 3)VR visualization of our result
Class Activity: One student can take a couple of photos of something they saw, and our system recreates it in VR so that other students can manipulate it.
Deliverables: VR Development Software -> Xcode -> NeuralNetworks. We are demonstrating how more complicated algorithms can be used in VR and how to connect neural networks with VR.
Metrics: 1) The efficiency of our process 2) How detailed the reconstructed result is
Software: Unity 3D / Python / TensorFlow
Data: input by the user and generated (recreated) by our system.
idea:Many great artworks have multiple layers. I want to separate them one by one to see the effects.
Things to do: 1) Find a dataset of works that contain multiple layers, with data for each layer 2) Separate the layers and create the Z-axis 3) Visualize the layers in 3D in VR
Class Activity: Students can play with the layers of the masterworks and recreate the works. Including some (but not all) layers might reveal special perspectives on the artworks.
Deliverables: It can go under Applications of VR -> VR in Art History. (we don't have this subset yet)
Metrics: 1) Does this new method make the artwork clearer to view? 2) Can we still sense the original work?
Data:Still finding it
idea: Visualize the change of RGB color in 3D space. For example, the origin represents RGB(0,0,0), and the opposite top corner represents RGB(255,255,255).
Things to do: 1) Prepare the dataset with 256^3 values (one per RGB combination) 2) Convert the values to RGB colors in the VR headset 3) Allow user control over color changes
Class Activity: Students can play with the different colors and see their transitions. Furthermore, they can combine different colors and see the resulting color (by simply adding the channel values)
Deliverables: VR visualization software: while we are not using any new software, we are exploring how to write simple algorithms that can be visualized (and adapt to changes from user input)
Metrics: Can we clearly see the transition of colors?
Data: I can produce it myself, but I need a good way to convert values into colors in VR.
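The data for this idea can be generated procedurally. A minimal sketch, assuming Python with NumPy (the step size of 32 is just an illustrative down-sampling so the point count stays manageable in VR; function names are mine, not from any existing codebase):

```python
import numpy as np

def rgb_cube(step=32):
    """Sample the RGB cube every `step` values along each axis.

    Returns an (N, 3) array where each row is both a 3D position
    and its color: (0, 0, 0) sits at the origin and the largest
    sample sits at the opposite corner of the cube.
    """
    axis = np.arange(0, 256, step)
    r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")
    return np.stack([r, g, b], axis=-1).reshape(-1, 3)

def mix(c1, c2):
    """Combine two colors by adding channel values, clamped to 255."""
    return np.minimum(np.asarray(c1) + np.asarray(c2), 255)

points = rgb_cube(step=32)   # 8 samples per axis -> 8^3 = 512 points
print(points.shape)          # (512, 3)
print(mix([200, 100, 0], [100, 100, 100]))  # [255 200 100]
```

Clamping in `mix` matters because naive addition can exceed 255, which would leave the class-activity color combinations outside the displayable range.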
(Temple of Heaven, Beijing)
Screen recording of visiting the Temple of Heaven in Google Earth VR
Screen recording of visiting my home back in Beijing
total: 4 hours
Done playing with DinoVR.
total: 5 hours
total: 27 hours
Works on NeRF (15h)
Investigated three NeRF training methods:
a)Implementation by the original team of the NeRF paper
Among them, I chose Nvidia's Instant NeRF for the following reasons: 1) it can train a new model in seconds, compared to around 4 hours for the other models, which makes the class activity feasible; 2) it has a GUI that allows a VR view
I would like to train NeRF on my own data. My input should be a video or a set of images. To this end, I explored several strategies:
a) using the COLMAP python package. However, as I tried it on three different virtual machines, COLMAP seems to have compatibility issues with their systems...
b) using Record3D, an iOS app. This app is very easy to use; users only need their phone
Works on Data Type Transformation (2h)
Explored Blender following this guide
My goal is to export the NeRF-generated model to Blender and then transform it into point cloud data readable by Unity
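As a rough illustration of that last conversion step, assuming the Blender export is a Wavefront OBJ file, keeping only its `v` (vertex) lines already yields a raw point cloud. This is a toy sketch, not the actual pipeline, and the sample OBJ text is made up:

```python
def obj_to_points(obj_text):
    """Extract vertex positions from Wavefront OBJ text.

    OBJ meshes (e.g. exported from Blender) list vertices as
    lines of the form `v x y z`; dropping faces and keeping only
    those lines turns the mesh into a raw point cloud.
    """
    points = []
    for line in obj_text.splitlines():
        parts = line.split()
        if parts and parts[0] == "v":
            points.append(tuple(float(p) for p in parts[1:4]))
    return points

sample = """\
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""
print(obj_to_points(sample))  # [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
```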
Works on Unity (10h)
Learnt basic Unity tutorial on creating a VR game
Learnt Unity locomotion and continuous movements following these videos.
Learnt Unity Mesh usage following this tutorial
HW 2/23 - HW 3/7
total: 20 hours
Works on NeRF (2h)
Prepared data; trained 3D objects to be visualized in Unity; exported meshes
Works on Photogrammetry (5h)
Explore Apple Photogrammetry
Works on Unity (13h)
Create an art gallery in Unity to visualize results
Unity techniques including: locomotion, mesh importation, 3D asset manipulation
Get the App here!
Tutorial for the App here!
HW 3/9 - HW 3/14
total: 13 hours
Testing on Texture Mapping (3h)
Learnt how to do texture mapping, so the meshes can have color :)
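The actual mapping was done in Unity, but the core computation can be illustrated outside it. As a hypothetical example (not the method I used), a planar projection assigns each vertex a (u, v) texture coordinate by normalizing its XZ position; sketched in Python/NumPy:

```python
import numpy as np

def planar_uvs(vertices):
    """Project vertices onto the XZ plane and normalize the result
    to [0, 1] texture coordinates: u comes from x, v from z."""
    v = np.asarray(vertices, dtype=float)
    lo = v.min(axis=0)
    span = np.ptp(v, axis=0)             # extent along each axis
    span = np.where(span == 0, 1, span)  # avoid division by zero on flat axes
    norm = (v - lo) / span
    return norm[:, [0, 2]]               # keep (u, v) per vertex

# A flat 2x4 quad in the XZ plane maps onto the four texture corners.
quad = [(0, 0, 0), (2, 0, 0), (2, 0, 4), (0, 0, 4)]
print(planar_uvs(quad))
```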
Contributing to Wiki (10h)
Writing 5 different wiki pages including:
total: 5 hours
HW 3/21 - 3/23
total: 6 hours
HW 3/23 - 4/04
total: 12 hours
Preparations of Project 2
(1)Narrow down the question: (3h)
(2)Find the dataset: (4h)
(3)Similarity Matrix (5h)
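The log doesn't record how the similarity matrix was computed; one common approach for text data (a sketch only, not necessarily the method used here) is cosine similarity over bag-of-words vectors, where the sample texts below are placeholders:

```python
import numpy as np
from collections import Counter

def cosine_similarity_matrix(texts):
    """Pairwise cosine similarity between documents under a
    bag-of-words model: count words, normalize each count
    vector to unit length, then take all dot products at once."""
    vocab = sorted({w for t in texts for w in t.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    vecs = np.zeros((len(texts), len(vocab)))
    for row, text in enumerate(texts):
        for word, count in Counter(text.lower().split()).items():
            vecs[row, index[word]] = count
    unit = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    return unit @ unit.T

poems = ["the moon rises", "the moon sets", "rivers run deep"]
sim = cosine_similarity_matrix(poems)
print(np.round(sim, 2))
```

Each diagonal entry is 1 (every text matches itself), and texts sharing no words score 0.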
HW 4/04 - 4/13
total: 8 hours
key word: dataset preparation and coding
Preparations of the dataset: (2h)
Emotion analysis on the dataset: (6h)
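The wiki doesn't say which emotion-analysis method was applied; as one simple possibility, a lexicon-based valence score could look like this sketch (the lexicon entries and their scores are hypothetical placeholders, not a real dataset):

```python
# Hypothetical valence lexicon: positive words score toward +1,
# negative words toward -1. A real analysis would use a full
# lexicon or a trained model instead.
LEXICON = {"joy": 1.0, "bright": 0.5, "dark": -0.5, "grief": -1.0}

def line_valence(line):
    """Average the valence of the known words in a line;
    lines with no lexicon words score a neutral 0.0."""
    scores = [LEXICON[w] for w in line.lower().split() if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(line_valence("a bright morning of joy"))  # 0.75
print(line_valence("dark grief"))               # -0.75
```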
HW 4/13 - 4/20
total: 25 hours
key word: VR fundamentals design
Design and Construct Overall Structure: (8h)
Fonts Editing: (6h)
Poetry Filling: (6h)
Multiplayer Implementation: (5h)
HW 4/20 - 4/27
total: 5 hours
key word: Finishing up
Made small adjustments to VRPoetry (3h)
Prepare for in-class activities & handout: (2h)
HW 4/27 - 5/04
total: 10 hours
HW 5/04 - 5/17
total: 38 hours
total: 10 hours