CONTRIBUTION 1 Added link to an installation guide for Unreal Engine on Linux link
CONTRIBUTION 2 Started Setting up a basic VR scene in Unity3D tutorial [on pause] link
CONTRIBUTION 3 Comparative analysis of Unity networking plugins link
CONTRIBUTION 4 Immersed Class Activity/Tutorial link
CONTRIBUTION 5 Photon Pun page link
CONTRIBUTION 6 Normcore page link
CONTRIBUTION 7 Added to Immersed page (activity feedback + some info on plans and features) link
CONTRIBUTION 8 Old vs. New VR setup in Unity link
CONTRIBUTION 9 GeoGebra3D Class Activity link
CONTRIBUTION 10 GeoGebra3D Page link
CONTRIBUTION 11 VR @ Brown 2021 link
CONTRIBUTION 12 VRZone Page link
CONTRIBUTION 13 Arthur Page link
Total: 140 hours
1/25/21 - 5 Hours
Joined Slack
Set up this page
Read through the course timeline and project ideas
Looked at projects from previous years
Worked my way through some sections of the Wiki
Made small contributions to the Wiki
Set up Quest 2 and explored some features/apps
Did some background reading
1/26/21 - 5 Hours
Did readings off of the CLPS 0540 (Simulating Reality: The (Curious) History and Science of Immersive Experiences) syllabus.
Read more pages from previous years' wikis
1/27/21 - 5 Hours
Worked through more of the Quest 2 Setup Guide and got super stuck
2.2.4 - Sidequest is not recognizing headset.
I turned developer mode on and installed the driver as the instructions said, but I think something may have gone wrong with the driver installation because it doesn't show up under
Potential pieces of software to explore:
Unity
Software that lets you set up VR art exhibitions
Seems relatively easy to use, has a Photoshop-esque UX, no code
Videos on the website depict the process of creating a gallery but it's unclear what visiting one is like and whether the viewing process can be made collaborative
Google VR Development platform
CryEngine
Potential project ideas:
Designing user experience for Virtual Reality art crit in Unity.
Note to self: circle back to these articles on Cave Painting
End-to-end collaborative survey design and data visualization tool using Unity
Comparing collaborative plug-ins across Unity, Maya, other software
Not really a fully formed project, but I found a paper titled 'Scientific Sketching for Collaborative VR Visualization Design', so I definitely plan on reading it to see whether the authors identify any gaps I could potentially turn into a project.
1/28/21 - 1 Hour
Posted my issue on Slack and got it fixed (thanks Ross!)
Finished Quest 2 set-up
Went on Google Earth VR and traveled around
1/31/21 - 3 Hours
Watched Unity VR tutorials on YouTube to get a better idea of what the software is and how it works. This will help me understand how complex development is and gauge what level of complexity to aim for with my project.
Some of the most helpful ones were:
Introduction to VR in Unity - Part 1: VR Setup
How to set up a new VR project in Unity
Basic object design, manipulation, and interaction
Introduction to VR in Unity - Part 2: Input and Hand Presence
Good examples of how code factors into creating experiences in Unity
Gave me a better understanding of Unity's layout and the different aspects of building something there
How to Make a VR Multiplayer Game - Part 1
Uses a package called Photon Pun to create multi-user experiences
TODO: Add info about Photon Pun to Wiki
Goes over common display issues with multiple users
How to Make a VR Multiplayer Game - Part 2
Covers how to make an object move the same way for all players
Photon Pun apparently also allows you to share audio
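For future reference, the basic connection flow these Photon Pun tutorials walk through can be sketched roughly like this (my own sketch based on PUN's standard callbacks; the room name "VRLobby" and the player cap are placeholder values, not from the tutorials):

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Minimal sketch: connect to Photon's cloud and join a shared room.
// Uses the App ID stored in the PhotonServerSettings asset.
public class ConnectToRoom : MonoBehaviourPunCallbacks
{
    void Start()
    {
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster()
    {
        // Join the room if it already exists, otherwise create it
        PhotonNetwork.JoinOrCreateRoom(
            "VRLobby",
            new RoomOptions { MaxPlayers = 4 },
            TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        Debug.Log("Joined room, players: " + PhotonNetwork.CurrentRoom.PlayerCount);
    }
}
```

Everyone who runs this with the same App ID ends up in the same room, which is the core of what the multiplayer tutorials build on.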
2/1/21 - 4 Hours
Spent some time watching this tutorial about the Unity UNet networking plugin only to find out it was deprecated (oh no!)
Researched other Unity networking plug-ins to potentially compare to Photon Pun
So, it seems like there's another plug-in called Normcore. It's newer and not as widely used, so it will likely be frustrating to work with, but any teaching materials I make for it might be among the first, which is exciting!
Video comparing Photon and Normcore at a surface level
Project Ideas:
Comparative analysis of Photon Pun and Normcore
3 Things I will do during the project:
Download and play around with both Photon and Normcore
Compare both plug-ins on a range of factors
Create and populate pages for both
Class Activity:
Download one of the plug-ins and build something basic, like a lobby with voice share and objects to play with. Pair up and experience those rooms together.
Potential Deliverables:
Wiki page for each of the plug-ins
Comparative wiki page
Tutorials for both plug-ins
Adding multi-user viewing functionality to OpenSpace3D
3 Things I will do during this project:
Download and familiarize myself with OpenSpace3D's codebase
Research how to integrate multiple users into the viewing experience
Develop new functionality that allows multiple users to explore a model together
Class Activity:
Downloading OpenSpace3D, playing around with some of its basic functionality, brainstorming what we might like to see or be able to do in a multi-user version
Potential Deliverables:
Design doc
Development roadmap
Codebase
Allowing multiple users to view a Dataverse.xyz creation simultaneously and discuss it over voice-chat
3 Things I will do during this project:
Download and familiarize myself with Dataverse's codebase
Research how to integrate multiple users into the viewing experience
Develop new functionality that allows multiple users to explore a data visualization together and discuss it
Class Activity:
Make your own Dataverse viz
Potential Deliverables:
Design doc
Development roadmap
Multiple user functionality
Voice functionality
2/3/21 - 3 Hours
Project: comparative analysis of Unity multi-player VR plug-ins (Photon & Normcore)
Milestones:
2/9 - decide on criteria for comparison, brainstorm and plan what to build with both
2/11 - download photon, complete planned exercises with it
2/18 - make photon wiki page
2/23 - download normcore, complete planned exercises with it
2/25 - make normcore wiki page
3/2 - make comparative wiki page
3/4 - plan exercise for the class to complete in photon, normcore (or both)
Read this paper about the CAVE
2/5/21 - 3 Hours
Did more background research on Photon Pun and read documentation and sample code
2/8/21 - 5 hours
Installed Unity on my virtual device
Rewatched a Photon Pun tutorial
Found and read this report by Unity comparing the different networking plugins available
Ran into potential issues with my project and spent some time weighing whether or not to switch to something else
Prepared presentation for tomorrow
Practiced presentation
Worked on setting up the basic scene in Unity by following this tutorial
Having trouble getting the user's hands to show up, will circle back to this
The proposed project clearly identifies deliverable additions to our VR Software Wiki: 5
The proposed project involves collaboration in VR: 5
The proposed project involves large scientific data visualization along the lines of the "Data Types" wiki page (https://sites.google.com/view/brown-vr-sw-review-2018/course-archives/spring-2020/data-examples-and-collaborators) and identifies the specific data type and software that it will use: 2
The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class: 4
The proposed project includes an in-class activity: true
The proposed project has resources available with sufficient documentation: 4
2/9/21 - 3 Hours
Did more background research on Normcore and read documentation and sample code
2/10/21 - 4 Hours
Downloaded Paraview
Continued working through this tutorial to get the basic scene set up
Got stuck getting the controllers to show up, so I switched over to this tutorial for that
So weird: sometimes when I look from a specific angle, the object I picked to represent the controllers (a sphere) is there, but more often than not it's nowhere to be found, even though it's supposed to be following the controller on screen. I'm confused about where I'm going wrong because I've followed all these tutorials step by step, but I can't get the controllers to show up :/
When I print out the controller's coordinates, they seem to be moving in sync with my hand movements, which means it's being tracked correctly, but it's not being rendered
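The way I've been separating tracking problems from rendering problems is roughly a debug script like this (names and structure are my own, not from any tutorial):

```csharp
using UnityEngine;

// Attach to the object representing the controller.
// Logs its world position (tracking) and whether its renderer is
// actually drawing (rendering), to tell the two failure modes apart.
public class ControllerDebug : MonoBehaviour
{
    Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        bool visible = rend != null && rend.isVisible;
        Debug.Log(name + " pos=" + transform.position +
                  " visible=" + visible +
                  " scale=" + transform.lossyScale);
    }
}
```

If the position updates with hand movement but `visible` stays false, the problem is in rendering (scale, layer, camera culling), not in tracking.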
I'm pretty far behind where I'm supposed to be (setting up basic scene still when the goal was to have all 3 scenes set up by tonight), might have to re-think project a bit. The paths I'm considering:
Compare Normcore and Photon Pun across just the basic scene
Spend more time on scenes and do an analysis of just one of the plug-ins
A Photon Pun page might be more useful than a Normcore one since Photon is more widely used, BUT there also exist many more online resources for Photon than for Normcore, so Normcore information/tutorials would probably be more useful if someone were to use it in the future
Download pre-made scenes from the asset store (didn't know these existed till today woah) and just focus on making those multiplayer instead
One challenge with this is that making the imported scenes multiplayer might be more challenging without having built them
Logging off now, pretty much where I was when I started :/ I think I'm going to keep working through this to set up the basic scene and use that scene to test Photon and Normcore. If time allows I might use more complex scenes from the asset store.
2/17/21 - 8 Hours
Getting user's hands to show up in VR:
Found this tutorial for setting up VR in the updated version of Unity.
Downloaded the new version and tried it but ran into issues. After some time struggling with it, decided to switch back to an older version.
For some reason, my headset and Unity are no longer connecting to one another
It's getting more and more likely that I won't be able to set this up in time. If one more hour goes by without progress, I'll shift my focus to making a pre-existing Unity scene multiplayer. The reason I wanted to build my own scene and make it multiplayer is that I wanted a full understanding of the code by having written it, but I can still work on integrating Photon Pun and Normcore into pre-built scenes; it'll be more challenging but not impossible.
Got the hand controllers to show up FINALLY!!!!!!!
Switching the XR controller from action-based to device-based seems to have done the trick
Right now I have two spheres representing the users' hands. I've gotten them to follow the user's movements, but they're very large and show up above the user's head.
I changed the size of the spheres to 0.1 from 1 on all axes but they're still showing up very large for some reason 🤔
I found this answer to someone who was having the same issue as me for the same video. Not sure I fully understand the fix.
Hm, ok. The only thing I could find that would get the size of the hands to change is the [LeftHand Controller] Model that comes up as a child of the LeftHand Controller (ditto for RightHand) once you press play, but the size of this resets every time you hit play again.
Apparently, this thing that appears in square brackets is a script. Now the question is how do I get to that script to edit it.
Found the script (it was in the script folder, duh). But I couldn't find the part that controls the initial size of the controller. It seems like the script deals mostly with detecting the controllers, not so much with representing them.
Found some awesome hand prefabs that show up correctly and are an appropriate size yay
Finally got the hands to show up at a normal size (they're also interactive which is pretty cool!!)
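Before the hand prefabs saved me, the workaround I was circling around for the scale resetting every time I hit play was to rescale the runtime-spawned model from a script rather than the Inspector. A sketch of that idea (the 0.1 factor matches what I was trying; the child lookup assumes the XR controller instantiates its model as a child at runtime, which is what I observed):

```csharp
using UnityEngine;

// Attach to the LeftHand/RightHand Controller object. The controller
// spawns its visual model as a child when Play starts, so we wait for
// the child to appear and then shrink it once.
public class ShrinkControllerModel : MonoBehaviour
{
    public float scale = 0.1f;  // factor I was setting in the Inspector
    bool done;

    void Update()
    {
        if (done || transform.childCount == 0) return;
        transform.GetChild(0).localScale = Vector3.one * scale;
        done = true;
    }
}
```

Since the model is recreated on every Play, doing the resize in code survives the reset that kept undoing my Inspector changes.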
Set up a Virtualitics account but ran into an error trying to generate an Oculus Quest Code:
'User does not have any mobile seats'
Started working on a tutorial for the Wiki on how to set up a scene like the one above (interactive VR hands in Unity3D)
2/22/21 - 5 Hours
Cleaned the journal up a bit
Added some steps to the tutorial
Started running into issues with getting scenes to display properly in the headset which I've never had before??
For example, for the scene in the video above, typically I would just connect my laptop to the headset, click the play button in Unity, and the scene would show up in my headset. Now when I do that, my headset still displays my desktop and not the scene I built.
If I start SteamVR first, it takes me to the SteamVR environment and only lets me view my desktop as a smaller window within that. This view also doesn't display my scene.
Also sometimes ran into an error that read 'Unable to start Oculus XR Plugin'
I'm very confused by my sudden inability to view what I build in Unity on my headset. It's a very basic and fundamental thing and was working just fine when I first started working today. I thought maybe there was some issue with how I set up XR within Unity, so I started over with two new projects following this and this tutorial. Even with these new projects, I could not get my scenes to display in the headset; I was still getting stuck in the same loop of needing to turn on SteamVR to be able to view in VR but then being stuck in the SteamVR environment and losing full access to my desktop.
I think I've just been staring at the same few things for too long and hopefully when I pick this back up later this week it sorts itself out or I figure out where I've been going wrong.
Installed VRChat and signed up.
2/23/21 - 2 Hours
Continued debugging efforts
Peer Review for Laila's Journal by Melis Gokalp
Journal activities are explicitly and clearly related to course deliverables: 5
deliverables are described and attributed in wiki: 4
report states total amount of time: 5
total time is appropriate: 5
Self evaluation
Journal activities are explicitly and clearly related to course deliverables: 5
deliverables are described and attributed in wiki: 5
report states total amount of time: 5
total time is appropriate: 4
Note: I evaluated Michael's journal
2/24/21 - 5 hours
Followed set-up instructions for Shenandoah's activity
Kept running into the same error that reads 'Unable to start Oculus XR plugin'
Went on a variety of discussion forums, none of which proved helpful
I thought maybe the issue was that the version of XR input support I used had been deprecated, so I tried re-implementing with the newer version of XR input, but I still ran into the same error
Shifted my focus towards the wiki:
Made a comparative analysis page looking at all the different networking plug-ins I could find
Started a Photon Pun page
Started watching this video to get a better understanding of how networking actually works in Unity
3/1/21 - 4 hours
Brainstormed in-class activity
Immersed is a cool app that lets you share screen and control your laptop collaboratively in VR
You can put your keyboard in VR and control it with your hands!
You can also video chat as your VR avatar.
In-class activity: do a short pair-programming exercise in the app, followed by a video call as a class to discuss
3/3/21 - 5 hours
Set up tutorial for tomorrow's activity.
Installed Unity3D locally.
Followed instructions I've written so far in my Unity tutorial to get to the point where I can build off of what I've written
Followed Hiloni's set-up guide
3/8/21 - 3 Hours
Made presentation for project 1
Brainstormed remaining pages to complete by this week for project 1
Finished Photon Pun page
Got started on Normcore page
3/9/21 - 3 Hours
Added to Immersed page
Continued working on Normcore page
3/10/21 - 3 Hours
Finished Normcore page
Started new vs. old Unity XR page
Running into issues with running Oculus locally for some reason (something about not being able to find the path?)
3/15/21 - 5 Hours
Finished 'Old vs. New Input System' page
Troubleshot the error I get when I try to run Unity locally
Tried deleting and reinstalling (did not work)
Not finding much :/ A lot of the solutions are for Windows
Brainstorming for next project:
Want to do something that incorporates data visualization somehow
Having trouble brainstorming what kind of data
Maybe a 3D grapher? Could make it collaborative by having multiple people be able to view the same graph
It seems like VR 3D graphers already exist, but aren't collaborative
Building a 3D grapher from scratch feels like it would be difficult and time-consuming, but maybe I can find existing code, make it work in VR, and then make it collaborative?
Found this project outline that a student in a previous iteration worked on
Found this package which seems similar to what I want, but not quite
Project 2 Draft (Idea 1):
Idea
Collaborative 3D graphing software in Unity
Things I will do
??? Need to do a lot more research for this one
General plan is to find open-source 3D modeling software, translate it into VR and then make it collaborative?
If I can find a Unity package for making 3D graphs and then focus on making it collaborative, that would be ideal
Activity
Have folks try my project, or try another (probably non-collaborative) 3D graph app in headsets and discuss features/experience
Deliverables
Unity Project
Instructions on how to download/use
Maybe a tutorial on how I made the project?
Project 2 Draft (Idea 2):
Idea
Evaluate different Brown classes' collaborative VR needs and come up with solutions
Things I will do
Survey the current use of collaborative VR in classrooms, knowledge of it, and gauge interest
Home in on 1-2 classes or research labs to get deeper insight into their needs
Come up with solutions, or blueprints for solutions for how these classes/labs can incorporate collaborative VR
Activity
Try out one of the solutions and evaluate it
Deliverables
Survey results
Activity plans for classes (software, instructions, plan, etc.)
3/17/21 - 1 Hour
Posted my issue on Slack
Worked on presentation
3/18/21 - 4 Hours
Evaluated peers' proposals
Worked on survey
3/19/21 - 3 Hours
Continued trying to solve Unity issue to no avail
3/24/21 - 3 Hours
Finished survey
Started brainstorming people to send the survey to
3/28/21 - 5 Hours
Downloaded Arkio (Paul's activity) and completed in-app tutorials
Shared a link to the survey across student groups and research labs as listed in this spreadsheet
Targeting students was relatively easy because the channels for reaching them are familiar; I still want to brainstorm more/better ways of reaching faculty and researchers
3/29/21 - 3 Hours
Continued sending the survey out
Shifted focus more toward courses and their professors
Besides Today@Brown, what is a good way to reach faculty?
3/30/21 - 1 Hour
Survey promotion
3/31/21 - 3 Hours
In-class activity prep
Explored Tripp VR app
4/4/21 - 2 Hours
Read over new survey responses
Played around with GeoGebra AR app and brainstormed potential class activity
4/5/21 - 3 Hours
Prepared GeoGebra3D in-class activity
Made an update presentation
Started GeoGebra page
4/6/21 - 1 Hour
Prepared for others' in-class activities
4/9/21 - 2 Hours
Finished Michael and Shenandoah's in-class activities and did their surveys. Ran into some technical difficulties along the way which made the activities take longer.
4/12/21 - 5 Hours
Finished GeoGebra page by adding feedback from in-class activity
Started working on summary page for survey results
Some of the faculty members who filled out the form ended up on the student version due to a glitch, so I had to spend some time separating the data and manually making graphs, because the ones in the Google Form results were no longer accurate
4/13/21 - 5 Hours
Continued working on VR @ Brown page
Made data visualizations for survey results
4/14/21 - 6 Hours
Finished VR @ Brown page
Made final presentation
Cleaned up journal and accounted for missing assignments like journal reviews etc.
Revisited Spatial and explored the app's functionality and potential use for art and museum settings
4/20/21 - 9 Hours
Did some research into the use of VR for gallery and museum use. It seems like right now there are a lot of applications for AR in museums and galleries as a supplement to written text or to reveal extra layers to art pieces, but I couldn't find a ton on entirely VR-based galleries or museums.
Some cool findings:
Explored Acute Art, the AR art application mentioned in the NYT article
The app seems to mainly be for viewing AR art, not creating it
The art pieces were all quite slow to load despite good Internet connection. Would be curious to learn more about how they are implemented and to examine the source of the lag
The app has you place a pin where you want the model to be grounded, which I thought was a cool UX feature that could be useful for a variety of applications, especially ones with VR/3D models
Looked up 'museum' in the Oculus store
Found and watched video tours of actual physical museums
The Anne Frank house had its own app on the store which was super fascinating to explore. The entire house has been recreated in VR
The app has both a narrative mode, which is somewhat like a guided tour of the house, and an explore mode, which lets you walk around on your own. The explore mode did not work for me.
Did not find any applications for making your own gallery or museum. Also did not find any apps that mirror the real-world museum-going experience. That somewhat makes sense: instead of showing things in 2D or on a plaque, developers can immerse you in the content. Still, I would be interested in seeing what that could look like. It could also be cool to have a mixed experience, like walking around a museum room in VR where clicking on a piece takes you into an immersive version of it.
Found this video of a VR art gallery using services from the company Gigoia Studios which offers various VR services, including VR galleries for artists
Downloaded and tried VRZone
The app has a variety of galleries you can buy tickets to
UX is kind of poor: it's very unclear how to turn around in the app, or whether that's even possible, and walking happens in relatively small increments
Galleries are/can be multiplayer, which is cool!
You can lease galleries in the app by applying through a form on the company's website
Got a ticket and went to the Banksy exhibit. It was pretty cool: the gallery is a small VR town where you can walk around freely, with Banksy's pieces displayed as murals on the sides of buildings
Made a wiki page for VRZone
Downloaded and explored Arthur
Added a page for Arthur under the Collaborative Workspaces page
Finalized and practiced final presentation
Factory reset headset