Yuanbo's Project 1

Project: Recreating Real World Objects in VR

In Class Activity Link

Evaluation Results



Goal:

We want to explore how to recreate real-world objects in VR and manipulate them.


In Class Activity:

Before class (estimated time: 10 mins for students, 3 hrs for me):

Students take various images of a single object from different perspectives and send them to Yuanbo. Yuanbo will then use NeRF to recreate a 3D model, to be shown in VR during class time.


In class (estimated time: 10 mins):

1)Students compare whether the resulting 3D model in VR looks like the original object

2)Students can go inside the object, as predicted in VR. Since we cannot go inside the object in the real world, VR might help us imagine what the inside of the object looks like.


Collaborative Functionalities:

One user can take various images of an object, and another user can view the recreated result. 


Data Format:

The data produced by NeRF should be in Point Cloud format, with each point available for further predictions.
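A point cloud in this sense can be as simple as an ASCII PLY file: a short header followed by one row of XYZ coordinates (plus RGB) per point, which the Unity plugin can then read. A minimal sketch of such an exporter (the function name and file are illustrative, not part of the actual pipeline):

```python
def write_ply(path, points):
    """Write an iterable of (x, y, z, r, g, b) tuples as an ASCII PLY point cloud."""
    points = list(points)
    header_lines = [
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "property uchar red",
        "property uchar green",
        "property uchar blue",
        "end_header",
    ]
    with open(path, "w") as f:
        f.write("\n".join(header_lines) + "\n")
        for x, y, z, r, g, b in points:
            f.write(f"{x} {y} {z} {r} {g} {b}\n")

# Example: three colored points forming a tiny cloud
write_ply("demo.ply", [(0, 0, 0, 255, 0, 0),
                       (1, 0, 0, 0, 255, 0),
                       (0, 1, 0, 0, 0, 255)])
```

Each point here carries only position and color; extra per-point properties (e.g. a predicted density) can be added as further `property` lines in the header.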


Evaluations:

1)Result authenticity. Students should compare the visualized result in VR with the original object in the real world and rate its authenticity on a scale from 1 to 5.

2)Object-inside prediction. Students can also explore what is inside the object in VR. They should evaluate whether the result aligns with their expectations.


Documentations:

NeRF: NeRF paper on arXiv

Point Cloud: the point cloud is a popular 3D model representation technique; tutorials can be found on many websites. I'll use PyRender and its export to point clouds.

Unity3D: official Unity tutorial, and also the Unity Point Cloud plugin


Timeline:

1)Read the NeRF paper and get familiar with it

2)Prepare the 3 mins presentation

1)Implement input image processing to be used on NeRF

1)Implement the fully-connected neural network for NeRF

2)Do tutorial on Unity 

1)Optimization of training method, prepare a workable NeRF on computer

2)Do tutorial on Unity Point Cloud Plugin

3)Use Pyrender to export NeRF produced data into Point Cloud

1)Use Unity Point Cloud Plugin to read point cloud data on computer

2)Make a working version of the system 

1)Finish VR data visualization, prepare demos/ class activities

1)Contribute to wiki with data on the fully-connected Neural Network
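As background for the fully-connected network step in the timeline above: NeRF passes each 3D coordinate through a positional encoding of sin/cos terms at increasing frequencies before feeding the MLP. A minimal sketch of that encoding for a single scalar coordinate (the function name and default frequency count are illustrative):

```python
import math

def positional_encoding(p, num_freqs=10):
    """NeRF-style encoding of a scalar coordinate p:
    [sin(2^0*pi*p), cos(2^0*pi*p), ..., sin(2^(L-1)*pi*p), cos(2^(L-1)*pi*p)]."""
    out = []
    for i in range(num_freqs):
        freq = (2 ** i) * math.pi
        out.append(math.sin(freq * p))
        out.append(math.cos(freq * p))
    return out

# Each of x, y, z gets its own encoding; the MLP input is their concatenation.
enc = positional_encoding(0.5, num_freqs=4)  # 8 values: sin/cos at 4 frequencies
```

The encoding lets the fully-connected network represent high-frequency detail that raw coordinates alone would smooth out.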



Wiki Contributions:

1)Introduction to recreating real world data in VR

2)Comparison between photogrammetry and NeRF

3)Tutorial on Nvidia Instant NeRF

4)Comparison between techniques to export mesh to VR

5)Texture mapping in Unity


Survey Analysis:

Q1

Question: Do you prefer VR NeRF or 2D Images?

Result: 58% (7/12) prefer VR NeRF, 42% (5/12) prefer 2D images

Q2

Question: What do you value most in real-world recreation in VR?

Result: participants ranked the importance of "Immersive", "Collaborative", "Ease of use", "Object Realism", and "Render Quality".

The category ranked most important gains a score of 5, and the least important a score of 1 (second most important, a score of 4, etc.).
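The rank-to-score scheme above can be computed per participant and averaged across the group. A small sketch; the two rankings below are made-up placeholders for illustration, not the real survey responses:

```python
CATEGORIES = ["Immersive", "Collaborative", "Ease of use",
              "Object Realism", "Render Quality"]

def ranking_to_scores(ranking):
    """ranking: the 5 categories ordered from most to least important.
    Returns {category: score}, with rank 1 -> 5 points ... rank 5 -> 1 point."""
    return {cat: len(ranking) - i for i, cat in enumerate(ranking)}

def average_scores(rankings):
    """Average the per-participant scores for each category."""
    totals = {cat: 0 for cat in CATEGORIES}
    for r in rankings:
        for cat, score in ranking_to_scores(r).items():
            totals[cat] += score
    return {cat: totals[cat] / len(rankings) for cat in CATEGORIES}

# Hypothetical rankings from two participants:
demo = [
    ["Object Realism", "Immersive", "Render Quality", "Collaborative", "Ease of use"],
    ["Immersive", "Object Realism", "Ease of use", "Render Quality", "Collaborative"],
]
scores = average_scores(demo)  # e.g. Object Realism -> (5 + 4) / 2 = 4.5
```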

Q3 & Q4

Question: What do you prefer in real-world recreation in VR over 2D images?

Some answers:

1)We can better experience the depth in VR.

2)We can assign gravity/forces to 3D objects.

3)We can (possibly) interact with friends with 3D objects together.


Question: What do you prefer in 2D images over real-world recreation?

Some answers:

1)It's much easier to view and load (you don't need to sideload with SideQuest, open the Oculus Quest, etc.).

2)The image realism is much better


Further Works based on the Survey:

Many of the survey results point to increasing the realism of the VR result, so I have explored two directions (both included in the wiki):

1)Try a different recreation algorithm: Photogrammetry

2)Try adding colors to the object: Texture Mapping