Paul Molnar's Journal

PROJECT LINKS

Project 1 Links:

Project 1 Plan: Plan Document

Presentation for Project 1 Proposal: Proposal Presentation

Presentation Update for Project 1: Update Presentation

In-class Activity 1: In-class Activity

End Presentation for Project 1: End Presentation

Project 2 Links:

Project 2 Plan: Plan Document

Presentation for Project 2 Proposal: Proposal Presentation

Presentation Update for Project 2: Update Presentation

In-class Activity 2: In-class Activity

End Presentation for Project 2: End Presentation

Final Presentation Links:

Flash Talk: Flash Talk Slides

Final Poster: Final Poster

WIKI CONTRIBUTIONS

Pre-Project Contributions

Project 1 Contributions:

  • Tutorial page on how to install and run my first project (the Mars terrain comparison)

  • Page on generating terrains in Unity

  • Tutorial: Steam VR in Unity with Photon Pun 2: page on building a multiplayer Unity game for Steam VR

  • Updated the Photon Pun 2 page to replace outdated info

Project 2 Contributions:

  • 3D Cave Generation: created page on how to create 3D density function data and display it in Unity using marching cubes

  • Underwater 3D Cave Exploration Project: created a page that details how to load and run my second project. It is used as an example for the 3D Cave Generation page

  • Character Colors: created a page on how to set up a system to allow for players to choose their own colors while the application is running

  • Custom Avatars: created a page on how to create custom avatars by editing existing multiplayer tutorial objects linked inside of the page

HOURS SUMMARY

Total: 160 hours

HOURS JOURNAL

1/30/21 - 5 Hours

Changes that could be made to last year's wiki:

  • 10 minute changes to last year's wiki

    • In the "What is VR for" section, only the first two links have outlines while the next ones do not. Adding these outlines wouldn't take very long since they are just small paragraphs.

    • In the "VR in Education" section which is a subsection of the section in the bullet point above, the beginning blurb is named "Introduction" instead of "Outline" like many of the other ones, so this could be changed to keep constant format.

    • The software compatibility matrix in "VR Hardware" is not big enough to need a scroller on the side and could just be a picture instead to make it look nicer.

  • 1 hour changes to last year's wiki

    • The picture in the bottom right of "VR Gallery Software (VRZone)" doesn't have the appropriate permissions for me to view it. Since permissions can be a pain, I think this could take an hour to fix.

    • In the home page of "Related Technologies," only some of the links to the subsections are there. It is missing the three or four links on the right that can't be found on that page. Adding those links and writing blurbs for each part could take about an hour.

    • The "Applications of VR" homepage is completely empty. It would be nice to see some links to the subsections of it so that it keeps the same format as most of the rest of the wiki.

  • 10 hour changes to last year's wiki

    • I would've liked to see pictures of some of the demos for the Unity3D portion on the main page. This could require people to run the demos again to get screenshots which could take 10+ hours.

    • The last point can be extended to most of the VR Development subsections. I feel like one of the biggest tools to engage a reader is visuals, and many of the subsections fell short on this.

    • The overall format of the wiki varies a lot across the different sections. There are many headings with no links beneath them on the page, so you have to navigate through the sidebar on the left to find further links. Fixing this could take a long time.

  • My edit: Changed "Introduction" to "Outline" in the "VR in Education" section to keep the format consistent.


The Rest of the Homework Assignment:

I decided to look into Unity3D for most of my logged time because I am very interested in game development and have been building a 2D game in Unity for about a year now. Collaborative virtual reality games seem so cool to me, so I decided to check out Unity3D to see if maybe I could build a game in VR for my first project and then add a collaborative aspect to it for my second project. This doesn't really take into account the data visualization part of the class, though, so here is what I came up with for how I could incorporate that into the game.

  • A puzzle game that relies on careful management of resources. Some of these resources could include energy, oxygen, and time. The only way to see these resources is to head to another room where the data would be displayed on graphs. The graphs would keep a constant timeline of resources.

  • It could almost be like an escape room, where accomplishing tasks before your resources run out earns you more resources and allows you to continue playing. This would force the users to keep careful track of their resources by visiting the data visualization room.

  • I understand this is not the kind of data visualization that was shown during lecture or the kinds of problems that the course may ultimately be trying to solve, but I am really more interested in games as opposed to simulations from data. I think VR can definitely enhance games and make them incredibly more interactive, which could benefit all sorts of people as the gaming industry has done nothing but grow recently.

  • Building a 3D game would also require me to learn some kind of 3D design software such as Blender or Maya, or I could collaborate with someone else in the class who is interested in design to make my game object models. Either way it gets me more involved in VR and the class.

  • If I did build a 3D game for my projects I would be very invested in it and would be guaranteed some form of product at the end of the class. Building games is nice because it is not that hard to get a base for a game, and there are always more things you can add, so you can't really finish early.

I watched a bunch of YouTube videos on different VR games to see if any were especially cool to me and I loved some of the aesthetics of games like Beat Saber but don't see how building a game like that would fit the class very well since it is neither collaborative nor visualizes data.

I also read through most of last year's wiki to see if anything else caught my eye, and the 3D design software also seemed pretty cool. If my original project idea isn't what the class is going for, I would be up for doing an analysis of different 3D design software. I could try them all out and see which ones are easiest/most intuitive and which ones give the best results. There is still plenty of time to plan my projects though, so I don't want to rush into anything just yet.


2/2/21 - 5 Hours

  • Set up Oculus headset. I have a cord and the proper processing power, so I connected my headset via the cord to my computer instead of having to set up a virtual machine. It's much simpler than the other method and gives good results, with the only tradeoff being that I have to be connected by a cord. I can also use this method after the class without having to worry about getting virtual machine access.

  • Software to explore/evaluate for final project: Unity, Blender, Unreal Engine, Maya

  • Project Idea 1: design terrain exploration of Mars using surface maps. The surface maps can be done in pretty much any of the software since they are all building tools. The data wouldn't be hard to find, but the problem I'm trying to solve is still unknown. I have to do more research into what people are studying on Mars to see if this visualization would be legitimately helpful in solving a problem.

  • Project Idea 2: My grandpa had COVID back in December, and when he got out of the hospital he had to be on oxygen due to the damage to his lungs (he also had a lung transplant), and he believes that his heart rate is now constantly elevated. We decided to track several vital signs for a few days, such as heart rate, pulse rate, oxygen saturation, and activity status. Using this data we could see if his heart rate is elevated all of the time or just during activity. This might be too easy of a problem to solve and VR might not make it easier, but it could be cool to see the different variables tracked against each other in a virtual space while trying to solve the problem.

  • Project Idea 3: When I was thinking about COVID I realized that it is hard to tell the spread of it throughout the US by just looking at numbers and the 2D graphs with circles at the high population centers. It would be cool to have a 3D representation with peaks at the places with cases that would run as a short video over a couple of years. If you connected the peaks it would look like a terrain that changes with time, which I think would be a very unique virtual representation that you could gain a lot of insight from in VR.

  • I also explored several apps for a couple hours, most of which were classic VR games like Beat Saber and Superhot. I also tried out YouTube VR and Netflix VR for some video and TV show watching. It is a completely different experience sitting in a virtual movie theater watching a TV show rather than lying in bed with my phone.

2/6/21 - 5 Hours

  • Project Idea 1: COVID data visualization in 3D over a map of the US. 2D COVID data shown as bubbles is harder to visualize, and it can be hard to tell where the virus is spreading (though it is good for seeing high densities of COVID). How can we tell where COVID is spreading and likely to spread using 3D data visualization software like ArcGIS 360 VR, FieldView, and ParaView?

    • Things I will do: model COVID data over time in the different visualization models listed; Compare results of the visualizations in the different software for software analysis; Get other people's opinions on the visualizations through collaboration.

    • Class activity: look at visualizations to see if they are more effective than 2D maps at showing spread.

    • Deliverable: ArcGIS tab in VR data visualizations to show if the software is worth using and how effective it is.

  • Project Idea 2: Documenting the difference between the two hemispheres of Mars in a virtual space. There isn't much out there about this topic, and it would be interesting to take surface data points and compare them using VR data visualization software like the ones I mentioned in the project above (except ArcGIS, which wouldn't work because it is for Earth).

    • Things to do: model lowlands of Mars and compare with model of highlands of Mars; Compare results of the different visualization software; Get feedback from other students on the visualizations

    • Class activity: analyze Mars maps to see if they are helpful VR tools for documenting the difference between the faces of Mars.

    • Deliverable: map of the two hemispheres of Mars with images of the differences. A table documenting the differences between the software used to make the maps.

  • Project Idea 3: I still would like to build/code a terrain using Unity, but I don't know if I can make it work for the class. If I decide on project idea 1 and I have extra time, it would be fun to try to use Unity (which I have a lot of experience with) to build the data visualization instead of ArcGIS, to make it more customizable and tailored to what I am imagining. It would still follow the rest of the project plan but would add more software to compare.

  • Software Evaluation Metrics: I will probably make a table of the different software I use and score them against each other on a set of criteria. This is a relative analysis so it might not work the best, but I think it could be helpful. Some of the criteria could include ease of use, ability to visualize, and collaboration.

2/9/21 - 5 Hours

  • Project plan milestones

    • 2/15: Finalize data and find and test software to see if it will work with my idea. Deliverable: data and list of software to use for visualization

    • 2/17: Import lowland Mars data into one of the software to try and get the visualization working. Deliverable: screenshot of data inside the software.

    • 2/22: Do the same as 2/17 but with the highland data in a separate visualization. Deliverable: screenshot of the data visualized

    • 3/1: Combine the two visualizations into one singular visualization to compare the two in the same space. Deliverable: screenshot of the two datasets in the same space

    • 3/3: Try other software to see if it works better than the original software. Deliverable: screenshots of attempts at visualization in other software

    • 3/8: Add collaborative elements whether that is allowing for saving images and sending them to peers or getting someone else into the same space. Deliverable: screenshot of collab element working.

    • 3/10: Fully functioning visualization with analysis results. Deliverable: a working visualization that can be replicated on other people's machines.

  • Finished DinoVR exercise and added the 7 screenshots to the Google Doc.

  • Watched several videos on how to improve performance of SteamVR over Virtual Desktop and spent a long time debugging problems. Eventually I came up with an arrangement of the settings that allows my headset to run smoothly over the connection, and it is even comparable to the wired connection quality-wise (not quite latency-wise though).

2/14/21 - 5 Hours

  • Added a question to the Google Doc for class on the 15th.

  • Finalized project plan

    • 2/15: Completed milestone of finding and downloading data for highland and lowland Mars. Researched how to add data to Unity (the software I decided on) to visualize it as a terrain.

    • 2/17: Import lowland Mars data into one of the software to try and get the visualization working. Deliverable: screenshot of data inside Unity.

    • 2/22: Do the same as 2/17 but with the highland data in a separate visualization. Deliverable: screenshot of the data visualized inside Unity.

    • 3/1: Combine the two visualizations into one singular visualization to compare the two in the same space. Deliverable: screenshot of the two datasets in the same space.

    • 3/3: Add visual improvements to the combined data visualizations. This can include coloring the terrain by height, and adding it as a texture. This will also include chunking if the files are too big to render smoothly.

    • 3/8: Add collaborative elements whether that is allowing for saving images and sending them to peers or getting someone else into the same space. Deliverable: screenshot of collab element working.

    • 3/10: Fully functioning visualization with analysis results. Deliverable: a working visualization that can be replicated on other people's machines, with downloads similar to the DinoVR tutorial.

    • DELIVERABLES: data section on how to get Mars elevation data. New page with a tutorial on how to install and use my program.

    • ACTIVITY: class installs and uses my program during class to explore and compare highland and lowland Mars terrain.

  • Questions about my project

    • (AGREE) The proposed project clearly identifies deliverable additions to our VR Software Wiki

    • (AGREE) The proposed project involves collaboration in VR

    • (STRONGLY AGREE) The proposed project involves large scientific data visualization along the lines of the "Scientific Data" wiki page and identifies the specific data type and software that it will use

    • (AGREE) The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class

    • (AGREE) The proposed project includes an in-class activity

    • (AGREE) The proposed project has resources available with sufficient documentation

  • Spent several hours finding the correct way to download Mars elevation data onto my computer in a usable format.

  • Built my 3 minute presentation for 2/15.

2/16/21 - 5 Hours

  • Spent most of my time in this entry working on my project to turn the Mars data into a .png and then from there into a .raw file using the image editor Fiji. From there I uploaded it to Unity and applied it to a terrain as a heightmap to give me my first visualization. I cropped the image earlier to roughly a 45 degree square latitude and longitude section. Doing this completed my first milestone.
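For reference, here is a minimal Unity-side sketch of getting a heightmap onto a terrain through script. This is an assumption-heavy alternative to the .raw import route described above: it assumes the Mars image is available as a readable grayscale Texture2D, and the class and field names are placeholders.

```csharp
using UnityEngine;

// Hypothetical sketch: applying a grayscale heightmap texture to a Unity Terrain.
// Assumes heightmapTexture is a readable Texture2D (Read/Write enabled on import).
public class MarsHeightmapLoader : MonoBehaviour
{
    public Texture2D heightmapTexture;
    public Terrain terrain;

    void Start()
    {
        TerrainData data = terrain.terrainData;
        int res = data.heightmapResolution;
        float[,] heights = new float[res, res]; // indexed [y, x] for SetHeights

        for (int y = 0; y < res; y++)
        {
            for (int x = 0; x < res; x++)
            {
                // A grayscale value in [0, 1] becomes the normalized height here.
                Color sample = heightmapTexture.GetPixelBilinear((float)x / res, (float)y / res);
                heights[y, x] = sample.grayscale;
            }
        }
        data.SetHeights(0, 0, heights);
    }
}
```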

  • Self journal evaluation for 2/17.

    • I include all of the time I spend on this class in my journal, which I'm realizing might not be what I am supposed to be doing after looking at the rubric. I haven't made any deliverables in the Wiki yet because I am not very far along in my project (just one milestone). This means that later in the semester I will add lots of material to the Wiki, but I don't yet have enough info, research, or progress to make meaningful contributions.

    • Journal activities are explicitly and clearly related to course deliverables

      • Score: 3, Reason: I don't really have any wiki contributions yet besides the first day, but I am making good progress on my project, which will result in many wiki contributions.

    • Deliverables are described and attributed in wiki

      • Score: 2, Reason: I only have one wiki contribution, and while I did describe it and link it, there are not enough changes yet for there to be deliverables to describe.

    • Report states total amount of time

      • Score: 5, Reason: I clearly state the amount of time I spend doing assignments and writing in my journal.

    • Total time is appropriate

      • Score: 4, Reason: While I do have a lot of time logged in my journal, I'm not sure if the journal time is supposed to relate only to wiki contributions. If it is, then I have much less time logged and need to work on that.

2/21/22 - 5 Hours

  • Created a new Unity project for my application and installed necessary packages like Unity XR

  • Watched tutorials on how to connect Oculus Quest 2 to my computer to run it in the game

  • Watched tutorials on how to synchronize my headset with an XR rig to get a character in the application

  • Completed my setup for the project and loaded my .raw (from .png) file from the last milestone into the application. The heightmap didn't have an absolute height though, so my terrain was stretched along the y-axis, making exaggerated peaks and craters. I did research on how to solve this problem and eventually found the Unity Terrain Toolkit.

2/22/22 - 5 Hours

  • Stopped working on the terrain portion of my application to analyze the feasibility of implementing a collaborative aspect. Found a 35-minute tutorial that I watched twice to see if it was doable.

  • Implemented this tutorial using a Unity package called Photon Pun. It is free and super simple, requiring only a few 50-line scripts to enable collaboration. As long as someone has the application, they will immediately be connected to the server on entry and will be able to interact with everyone in the application (a sketch of this flow appears after this list).

  • I spent some time building some avatars and importing hand assets so you can see yourself grabbing and pointing around. I plan on allowing you to grab the different textures but it is also just fun to see some floating virtual hands on the Oculus.

  • I finished the collaboration element of my project, although until my terrain is further implemented it will be hard to tell whether changes in one person's application also show up in the other. The only thing that you can currently see of the other person is their avatar and movements.
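The tutorial scripts themselves aren't reproduced in this journal, so here is a rough sketch of the connect-on-entry flow mentioned above, using the standard Photon Pun 2 callbacks. The "NetworkPlayer" prefab name and the 5-player room cap are placeholders.

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Sketch of "connected to the server on entry": connect when the app starts,
// join (or create) a room, then spawn a networked avatar everyone can see.
public class ConnectionManager : MonoBehaviourPunCallbacks
{
    void Start()
    {
        PhotonNetwork.ConnectUsingSettings(); // uses the app id from PhotonServerSettings
    }

    public override void OnConnectedToMaster()
    {
        PhotonNetwork.JoinRandomRoom();
    }

    public override void OnJoinRandomFailed(short returnCode, string message)
    {
        // No room exists yet, so create one for other players to join.
        PhotonNetwork.CreateRoom(null, new RoomOptions { MaxPlayers = 5 });
    }

    public override void OnJoinedRoom()
    {
        // "NetworkPlayer" is a placeholder prefab name; it must live in a Resources folder.
        PhotonNetwork.Instantiate("NetworkPlayer", Vector3.zero, Quaternion.identity);
    }
}
```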

2/23/22 - 5 Hours

  • Researched websites where I could get the absolute height of Mars given different latitude and longitude coordinates and couldn't find anything, so I came up with my own way to get absolute heights.

  • I decided to use places where the absolute heights are easily retrievable, such as Olympus Mons and Hellas Crater. Each of these locations has documented heights for the feature and the terrain around it, giving me accurate info for calibrating my heightmaps.

  • I downloaded this data from the website mentioned in the last checkpoint, which took a while because you have to submit jobs to the service and wait for them to send the data back formatted to your specifications.

  • I then uploaded both the Olympus Mons and Hellas Crater heightmaps into Unity and placed them next to each other. I currently have them both as 4096 × 4096-unit squares in Unity, which is very large and hard to render/explore, so I think I will shrink them. Each Unity unit represents 200 meters on Mars, so it is unlikely that I will be able to make landscapes look the way they would if you were walking around on Mars. I should be able to do the comparison like I want to, though. Next up is resizing the terrains, applying textures, and adding a starry skybox.
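Here is a minimal sketch of the calibration idea, with placeholder elevation values standing in for the real numbers derived from the Olympus Mons and Hellas reference points.

```csharp
using UnityEngine;

// Sketch: calibrating a Unity terrain to absolute Mars elevations.
// The elevation range below is a placeholder; the real values come from
// the documented heights of landmarks like Olympus Mons and Hellas Crater.
public class TerrainCalibrator : MonoBehaviour
{
    public Terrain terrain;
    public float minElevationMeters = -8000f;  // hypothetical lowest point in the tile
    public float maxElevationMeters = 21000f;  // hypothetical highest point in the tile
    public float metersPerUnityUnit = 200f;    // the scale mentioned in this entry

    void Start()
    {
        TerrainData data = terrain.terrainData;
        float heightRange = (maxElevationMeters - minElevationMeters) / metersPerUnityUnit;
        // size.y is the world-space height that a heightmap value of 1.0 maps to,
        // so setting it to the full elevation range makes the heights absolute.
        data.size = new Vector3(data.size.x, heightRange, data.size.z);
    }
}
```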

2/27/22 - 3 Hours

  • I just caught up on things I was missing for the class such as:

  • Added my Google Earth screenshots to this link: Link to Google Earth Screenshots

  • Peer review catch up

    • I peer reviewed with Alastair and Robert (there were three of us left, so we all did it together). Overall their journals looked pretty solid. We all felt we should do a better job of putting things in our journals immediately after doing tasks so that we don't forget what we did.

  • Added my demo to the timeline for 3/15/22

2/28/22 - 5 Hours

  • Built my update presentation which required me to take screenshots of the different parts of my program

  • Explored options for adding textures to Unity terrains and was unsuccessful in finding a good way to do it although I know it is possible

  • Built my program and tried running it, but it was pretty slow. I'm thinking of getting rid of the lighting, which would increase my fps by a lot; if I get the texture working, it will already be shaded, which would solve the lighting problem

  • Tried to test my collaboration implementation by running two instances at once on my computer, but it didn't pull up two sets of avatars. I am thinking that there is a bug in my collaboration implementation that I need to find and debug

3/2/22 - 5 Hours

  • Completed Beatrice's in class activity for homework

  • Rebuilt my terrain with smaller scales to reduce in game lag

  • Worked on debugging with Oculus Link

    • There is a problem with my computer for Oculus Link because it has 2 different graphics cards in it. One of them can run Oculus and the other is built into the computer and cannot run Oculus. Since Windows and Oculus are dumb, they decide to run Oculus files on the worse graphics card

    • My work around for this was initially to manually select every single .exe file in the Oculus App and force it to run on the dedicated graphics card in the Windows Graphics settings but something changed and this method stopped working

    • My new work around is to disable the built in graphics card, restart my computer, run Oculus Link, and then reenable the built in graphics card. That way it forces everything that has to do with Oculus to run on the good graphics card

3/7/22 - 3 Hours

  • Prepped for Lucia's in class activity

  • Started trying to test my collaboration aspect of my project

    • It seemed to work, but it is very hard to tell without getting another person involved

    • I will most likely have to get someone else to help me debug so that I can see if the networking is correct

  • In the last in class activity some people noted that it would be cool to have other people show up as different colors in game so that you can tell who is who

    • I really like the idea and have started brainstorming ways to make this happen

    • I tried out one method that didn't work because of the way Photon Pun 2 is set up but will try again

3/9/22 - 5 Hours

  • Prepped for Mandy's in class activity

  • Prepped for Tongyu's in class activity

  • Tried another way to give everyone different colors in a game and it also didn't work

    • This task is shaping up to be much harder than I originally thought it was going to be because Photon Pun has a mesh structure where every person's application has to keep track of every other person's application

    • This makes it so that you cannot just access the color of another player because you simply have a placeholder for the player that knows nothing about it besides its transform during each timestep

    • I will probably stop working on this because it looks like it is going to be a time sink

3/14/22 - 10 Hours

  • Finally tried running my application on PaperSpace and realized that it won't work because it's not connected to Steam VR

    • Had to build an entire new Unity project using a Steam VR character instead of a Unity XR character. The Steam VR character still uses the XR system but connects the entire project to Steam VR on launch, which allows it to be run through Virtual Desktop

    • I also then had to set up my controls and rebuild my movement using the Steam VR Input Window, which could only be done if I connected my headset to my computer through Oculus Link (which, as I mentioned in a previous journal entry, is a pain)

    • This switch forced me to build every aspect of my project a second time including the networking which was annoying because there are a lot of parts in the project that are not code and require careful setting management

  • Once I had my project copied over I assigned my terrains a color (brownish) and built a black bridge connecting the two so that there would not be a huge gap

  • I then built it as a Steam VR project and created a tutorial for my in class activity

3/16/22 - 12 Hours

  • I fixed the networking bug that was stopping my in class activity from being collaborative

  • I built three new wiki pages which I have linked at the top of my journal under changes to the wiki

    • Tutorial for my program

    • Page on generating terrains in Unity

    • Page on building a multiplayer Unity game for Steam VR

  • I updated the Photon Pun 2 page to replace the outdated info with the correct stuff

  • I replaced the build in my in class activity page with the new bug-free version

  • I made my final presentation for class the next day, which required me to get videos, so I got my brother to get into VR with me and I did the in class activity again

3/21/22 - 5 Hours

  • Read the 7 scenarios paper to learn what the different scenarios are

  • Here are the 7 scenarios to know (the first four are for data analysis and the last three are for visualization)

    1. Understanding environments and work practices (UWP) - formal requirements for design

    2. Evaluating visual data analysis and reasoning (VDAR) - ability to support visual analysis

    3. Evaluating communication through visualization (CTV) - how well does the visualization communicate the data

    4. Evaluating collaborative data analysis (CDA) - does the tool allow for adequate collaboration

    5. Evaluating user performance (UP) - how do features measurably affect user performance

    6. Evaluating user experience (UE) - does the user enjoy what they are doing

    7. Evaluating visualization algorithms (VA) - quality of visualization algorithms

  • Drafted my plan for project number two and connected it to the seven scenarios

PROJECT PLAN

  • I plan on exploring collaboration further using Photon Pun (or something else if I'm feeling ballsy) to try to create a game-like experience of exploring a simulated cave. Players will spawn in and be given one stretch of the cave to navigate with the directions. This stretch will depend on the order in which players spawn into the application. A player will have to say they have the directions and then lead the rest of the players through the cave behind them in order to get to the final destination.

  • Data: The cave can be done using the Marching Cubes algorithm, which takes an implicit density function and turns it into a 3D mesh. In this case the data is a 3D density function that determines whether a part of the cave is solid or empty. This is an interesting take on data because anything can be simulated using marching cubes as long as it has a density for a given point (x, y, z). (A sketch of such a density function appears after this plan.)

  • Collaboration: Players will be able to see each other and interact by exploring the cave with each other

  • 7 Scenarios Strengths and Challenges:

    1. My take on the class is analyzing VR for development as opposed to analyzing current VR data visualization programs. Therefore, if the program I develop satisfies the seven scenarios, it means that the development software is adequate.

    2. Scenarios 3 through 7 will be very easy to evaluate given my program. 3 is basically just how well the VR can display the density functions. 4 is how collaborative my program is. 5 is how well my program performs: is there lag, bugs, or annoying mechanics? 6 is whether the user actually enjoys the experience inside of the program. And 7 is whether the Marching Cubes algorithm does a good job at visualization. All of these questions can be answered after using my program.

    3. The 1st and 2nd scenarios will be more focused on Unity as a whole for development. 1 is asking if a group of people can perform the analysis together well. 2 is asking if Unity as a whole can support visual analysis. Both of these questions can be answered by me after using Unity to build a visualization/game.

    4. I believe that I will have analysis results on both Unity and the program I build for all 7 scenarios and will be able to give a good review of Unity as a development tool for virtual reality -> how well it visualizes data -> how collaborative it is -> how well it performs -> etc.
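As referenced in the plan, here is a minimal sketch of what such a density function could look like. Unity's Mathf.PerlinNoise is 2D only, so this approximates 3D noise by averaging axis pairs, which is a common workaround rather than the project's actual (more complex) function.

```csharp
using UnityEngine;

// Sketch of a density function for marching cubes: a scalar for any (x, y, z).
// The cave surface is extracted wherever the density crosses a threshold (0 here):
// positive values count as solid rock, negative values as empty space.
public static class CaveDensity
{
    public static float Sample(float x, float y, float z, float scale = 0.05f)
    {
        float xy = Mathf.PerlinNoise(x * scale, y * scale);
        float yz = Mathf.PerlinNoise(y * scale, z * scale);
        float xz = Mathf.PerlinNoise(x * scale, z * scale);
        float noise = (xy + yz + xz) / 3f; // pseudo-3D noise in [0, 1]

        // Shift so roughly half the volume is solid and half is empty.
        return noise - 0.5f;
    }
}
```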

3/23/22 - 5 Hours

  • Finalized project ideas and decided on deliverables

  • Built my 2 slide presentation for tomorrow's class

  • Created a finalized proposal and linked it at the top of this journal

4/2/22 - 8 Hours

  • Started working on my new project over break

    • Learned how to copy an existing Unity project so that I wouldn't have to redo my Photon work from the last project

      • Might add something in the wiki on this as it could be helpful to others

    • When copying a Unity project while using the XR toolkit, you can get funny errors that can be resolved by simply disabling and re-enabling the toolkit

      • This took me several hours to figure out since there is not very much info online about this. It is hard to tell you are even getting errors; the headset just won't connect to the application

  • Prepared for the upcoming in class activities

4/6/22 - 7 Hours

  • Continued working on project and brainstorming ways to integrate my code from other projects into my new project as I have some helpful scripts for Marching Cubes on other projects

  • Prepared for the upcoming in class activities and filled the class document with my screenshots from old activities that I forgot to upload

  • Read the Bloom's Taxonomy pyramid diagram and thought of ways to incorporate it into my wiki contributions

4/11/22 - 5 Hours

  • Reviewed Bloom's Taxonomy pyramid diagram for in class discussion

  • Went through my journal to fix missing links and hours and to update my links so it gives the best representation of my current work

  • Class 4/12/22 Journal Review of Alastair

    • 5 - Journal entries are up to date

    • 4 - Journal activities are explicitly and clearly related to course deliverables

    • 5 - Journal entries demonstrate project progress (e.g. links, screenshots, or mentions of failure / success)

    • 5 - Deliverables are described and attributed in wiki

    • Overall Alastair's journal looks pretty good and is an accurate description of what he has done in this class. I also like the wiki contributions he has made so far, as they are descriptive and have visual components as well as links to other pages

4/13/22 - 4 Hours

  • Spruced up my wiki page called: Tutorial: Steam VR in Unity with Photon Pun 2. This page mixes aspects of Steam VR and Photon Pun 2 in order to give readers a foundation to build a multiplayer VR application. It is an intermediate tutorial with lots of words but overall is a pretty solid piece of writing to help people get started with multiplayer VR development.

  • Worked on my second project and implemented my marching cubes algorithm. This took a long time as it has features such as chunking and a complex density function that samples Perlin Noise (a rough sketch of the chunking idea appears after this list).

  • Followed the steps of the tutorial mentioned earlier to reimplement my first project's multiplayer VR structure

  • Began brainstorming ways to implement colors for each player as well as collisions, and started thinking about what kind of avatars I want to have in game
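As mentioned in the list above, here is a rough sketch of the chunking idea: each chunk samples the density function into a voxel grid that a marching-cubes pass would then triangulate. The CaveDensity helper is the hypothetical one sketched in the project plan, and the chunk size is a placeholder.

```csharp
using UnityEngine;

// Sketch: filling one chunk's voxel grid from the density function.
// A marching-cubes triangulation step would consume this grid afterwards.
public class CaveChunk : MonoBehaviour
{
    public const int ChunkSize = 16; // voxels per side; placeholder value
    public Vector3Int chunkCoord;    // this chunk's position in the chunk grid

    public float[,,] BuildDensityGrid()
    {
        // One extra sample per axis so neighboring chunks share boundary voxels.
        var grid = new float[ChunkSize + 1, ChunkSize + 1, ChunkSize + 1];
        Vector3 origin = (Vector3)chunkCoord * ChunkSize;

        for (int x = 0; x <= ChunkSize; x++)
            for (int y = 0; y <= ChunkSize; y++)
                for (int z = 0; z <= ChunkSize; z++)
                    grid[x, y, z] = CaveDensity.Sample(origin.x + x, origin.y + y, origin.z + z);

        return grid;
    }
}
```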

4/18/22 - 6 Hours

  • Continued to work on my project so that I will have a decent product to show for my progress report

    • Implemented more features and completely revamped the color scheme to look like an underwater cave

    • Added an emissive point light object on the roof of the cave to give better visibility and increase the environment’s visual appeal

    • Added fog to the environment to make it seem more cave like and spooky

    • Kept working on my multiplayer code to reduce bugs and made weird haptic feedback less noticeable

    • Implemented movement in the direction that the camera is facing to give the player 360-degree motion (see the sketch after this list)

  • Took a nice video of my progress to show off what I have accomplished so far

  • Generated my slides to give to David, which talk about what I’ve done and what I still need to work on
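A minimal sketch of the camera-direction movement mentioned above, assuming a CharacterController rig and Unity's XR input API; the component and field names are placeholders.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: move in the full 3D direction the headset camera is facing,
// giving the 360-degree "swimming" motion described in this entry.
public class SwimMovement : MonoBehaviour
{
    public CharacterController controller;
    public Transform cameraTransform; // the XR rig's head camera
    public float speed = 2f;

    void Update()
    {
        InputDevice leftHand = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        if (leftHand.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
        {
            // Use the camera's forward and right vectors without flattening them,
            // so pushing the stick forward moves you up or down when you look up or down.
            Vector3 move = cameraTransform.forward * axis.y + cameraTransform.right * axis.x;
            controller.Move(move * speed * Time.deltaTime);
        }
    }
}
```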

4/20/22 - 3 Hours

  • Prepped for Amanda’s in class activity with Immersion Analytics

  • Prepped for Mandy’s in class activity which is a manuscript analysis tool

  • Kept working on my project. I’m realizing that collisions between the player and the mesh cave environment are going to be very tricky given the way Unity collisions work. In order for a collision to be registered, one of the objects must have a rigidbody on it, and they must both have colliders. If there is a rigidbody on my player, the character controller freaks out, so I have to give a rigidbody to each object I want my player to collide with. Since the mesh chunks are procedurally generated, I have to generate a rigidbody after I build each one and then include each triangle inside the rigidbody. Rigidbody mesh colliders are capped at a small number of triangles, so I’m starting to realize that collisions with the cave will be more of a hassle than they’re worth. Most players will choose not to run into walls anyway.
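For context, the collider setup on a generated chunk would look something like the sketch below, and the convex requirement is what makes this a dead end: a mesh collider that participates in Rigidbody collisions must be convex, and Unity caps convex mesh colliders at 255 triangles, which a marching-cubes chunk easily exceeds.

```csharp
using UnityEngine;

// Sketch: giving a procedurally generated chunk a collider a Rigidbody can hit.
public static class ChunkColliderUtil
{
    public static void AttachCollider(GameObject chunk, Mesh generatedMesh)
    {
        MeshCollider col = chunk.AddComponent<MeshCollider>();
        col.sharedMesh = generatedMesh;
        // Required once a Rigidbody is involved, but convex mesh colliders are
        // limited to 255 triangles, so large cave chunks fail this step.
        col.convex = true;
    }
}
```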

4/25/22 - 2 Hours

  • Prepped for Tongyu’s in class activity which analyzes sound in Unity

  • Prepped for Sayan’s in class activity that analyzes MRIs in 3D

  • Dumped several more hours into trying to figure out a solution to my collision problem from last week. Tried several different ideas, including rebuilding my whole movement system so that I could put a rigidbody on my character instead, but none of my options resulted in both accurate movement and collisions

4/27/22 - 4 Hours

  • Prepped for Shashidar’s in class activity

  • Prepped for Aakansha’s in class activity

  • Prepped for Lucia’s in class activity

  • Decided that player-cave collisions were a lost cause and started working on the other portions of my project

    • Worked on the game aspect and color portions of my multiplayer

    • Brainstormed ways to give each player a piece of the map to the cave

    • Decided on having a trail of breadcrumbs and giving each player a choice of which color of breadcrumbs they can follow

    • If 5 people work together they can then find their way through the cave

5/02/22 - 6 Hours

  • Prepped for Beatrice’s in class activity

  • Prepped for Maya’s in class activity

  • Designed the map system I explained in my last journal entry

    • Each player can choose a color at the start of the round

    • That color will determine which bread crumbs the player can follow

    • I managed to make the colors sync across screens by making a script for my avatar that updates the PhotonView component on the avatar with the color of the map object that the player collides with (see the sketch after this list)

    • This is a solution for giving players options for their color without requiring any GUI aspects or complicated menus

      • It also means players must choose 1 of 5 colors, so each player can only see 2/10 of the breadcrumbs
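A rough sketch of the color mechanic described above. The ColorCube component, the "ColorCube" tag, and the palette are all hypothetical; sending a palette index (rather than a raw color) keeps the networked payload simple.

```csharp
using Photon.Pun;
using UnityEngine;

// Hypothetical marker component on each of the five selectable cubes.
public class ColorCube : MonoBehaviour
{
    public int paletteIndex;
}

// Sketch: touching a colored cube at spawn recolors this avatar on every client.
public class AvatarColor : MonoBehaviourPun
{
    public Renderer avatarRenderer;
    public Color[] palette; // the five selectable colors

    void OnTriggerEnter(Collider other)
    {
        if (!photonView.IsMine || !other.CompareTag("ColorCube")) return;
        int index = other.GetComponent<ColorCube>().paletteIndex;
        // Buffered so players who join later also see the chosen color.
        photonView.RPC(nameof(SetColor), RpcTarget.AllBuffered, index);
    }

    [PunRPC]
    void SetColor(int index)
    {
        avatarRenderer.material.color = palette[index];
    }
}
```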

  • Looked into name tags for my program and watched several tutorials on how to implement them

    • They all needed some system to get the name input

5/04/22 - 8 Hours

  • Prepped for Jennifer’s in class activity

  • Plotted out the path that I wanted players to follow in the game and created the breadcrumbs to lead them through it

  • Added another light at the end of the path to make it seem more like a destination

  • Also added a treasure chest at the end for the same reason; this chest was a bit annoying to build since the primitives had to be placed precisely in order to make it look decent

  • Built my avatars to look like little submarines

    • Mapped their movement to that of a player’s head so that they would sync up with what the player was doing with their body (a sketch of this head-follow idea appears after this list)

  • Started testing my project I have been building the past few weeks

  • Got my little brother to download and run the app to make sure all of the features worked, but the coloring didn’t, so I spent several hours debugging it

    • The coloring wasn’t working because I had the collision detection and reaction algorithm on the player instead of the network player, which is the object that is visible across each user’s device

    • I moved the algorithm to the network player and it worked like it was supposed to: when a player hits one of the emissive cubes at the start of the app, their avatar will change to the color of the cube. This will also activate the breadcrumbs associated with the color they chose

  • Built my game and shared it with the class via Slack as a downloadable Google Drive folder
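As referenced above, a minimal sketch of the head-follow behavior on the submarine avatar. Replicating the resulting pose to other clients is assumed to be handled by a PhotonTransformView (or similar) on the same object.

```csharp
using Photon.Pun;
using UnityEngine;

// Sketch: the owning client copies the headset camera's pose onto the networked
// avatar each frame; remote copies are positioned by Photon instead.
public class HeadFollow : MonoBehaviourPun
{
    public Transform headCamera; // the local XR rig's camera

    void LateUpdate()
    {
        if (!photonView.IsMine) return;
        transform.SetPositionAndRotation(headCamera.position, headCamera.rotation);
    }
}
```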

5/09/22 - 4 Hours

  • Created my flash talk slides for the whole semester

  • Created a presentation to give on project 2 for 5/10

    • I built this the way I was thinking of building my poster

    • It has a screenshot of my application as the background and then has blurbs around the outside detailing features, goals, and wiki contributions (which I still need to finalize and create)

5/17/22 - 5 Hours

5/18/22 - 5 Hours

  • Built the wiki pages linked under Project 2 Contributions at the top of this journal: 3D Cave Generation, Underwater 3D Cave Exploration Project, Character Colors, and Custom Avatars

  • Made my final project poster in the CIT

  • Sent David and Ross my self-reflection for the semester

  • Finalized my journal (hours/links/final posts)