Nicholas Bottone's Journal

PROJECTS

Final Poster.pdf

Implemented Wiki Contributions

Proposed Wiki Changes (1/31/22)

  • Add details about Unreal Engine 5 and more details about the engine's features to the Unreal Engine page. (10 hours)

  • Expand details about Quest 2's camera-based Hand Tracking features, and how to use hand tracking in your apps (including Unity/Unreal tutorials). (10 hours)

  • Details and tutorials for Steam/SteamVR specific technologies, such as Steam Networking (for multiplayer/multiuser servers with auto-matchmaking without needing port forwarding). (10 hours)

  • Add more Unity 3D tutorial subpages with more detailed tutorials on using VR-specific control inputs such as touch controllers. (1 hour)

  • Using multiplayer/multiuser networking to allow cross-play/cross-collaboration between VR and non-VR users. (1 hour)

  • Add details to the VR Hardware category about the new HMDs that have recently come out, including HTC's Vive Focus/Pro/Cosmos/Flow. (1 hour)

  • Add more techniques for preventing cybersickness when in motion, such as a digital helmet to reduce FOV or a wireframe of a vehicle surrounding you. (10 minutes)

  • Update software compatibility matrix for VR Hardware HMDs and add a product hardware comparison matrix. (10 minutes)

  • Add details about Facebook Connect 2021 and Meta's Metaverse Plans with Horizon to VR + Social Media page. (10 minutes)

HOURS SUMMARY

Total: 159 hours

Journal

1/30/22 - 2 Hours (2)

  • Review wiki resources, read course information and existing wiki pages from previous years (including "What is VR?", etc.)

  • Introduction on Slack and setup personal journal page

  • Start taking notes on potential improvements to the wiki pages

1/31/22 - 2 Hours (4)

  • Continue taking notes on potential improvements to the wiki pages

  • Researched Meta's Horizon Worlds and Horizon Workrooms

  • Added details about Facebook Connect 2021 and Meta's Metaverse Plans with Horizon to "VR + Social Media" page

2/2/22 - 4 Hours (8)

  • Set up the Quest 2, the Oculus software on PC, and the Paperspace VM platform

  • Learn about pieces of development software that could be later evaluated; come up with potential project ideas

  • Potential Software to Evaluate:

    • BlenderXR, Unity, Unreal Engine, Maya

  • Potential Project Ideas:

    • Representing a git version control tree in three-dimensional space, with the ability to navigate through historical versions and dependencies that the project relies on. Code collaboration in VR.

    • Visualizing and representing sensor data or models of planetary surfaces or anomalies in space with readings from NASA/SpaceX/etc.

    • Visualizing how the atmospheric conditions and sensor data changes during storms; visualizing predictive models.

    • Evaluating different software's ability to represent CAD designs for engineering projects when multiple engineers are collaborating on an assembly or part, being able to represent designs in scale.

    • Visualizing more than three-dimensional graphs in mathematics (i.e., 4D Toys, etc.).

2/7/22 - 3 Hours (11)

  • Finished setup of Virtual Desktop on both my personal desktop and the Paperspace server. Experimented with optimal Virtual Desktop settings, especially Synchronous Spacewarp (SSW) and other quality options to minimize latency.

  • Experimented with Google Earth through Steam VR over Paperspace streaming.

  • Installed DinoVR on local desktop and on Paperspace VM.

  • Continued research and brainstorming for project ideas.

    • Geographic Information Systems (GIS)

      • The ability to analyze timelapses of different things going on across the globe

      • Analyze production increases, immigration and population differences, the spread of diseases across land area, etc. over a timelapse to identify trends

      • Timelapse of both data and of satellite imagery to identify correlations of properties that change as things develop.

    • Cities Skylines -style traffic simulation. Visualizing and modeling traffic in metropolitan areas and how different traffic structures can work differently in various locations.

    • Political maps – analyzing how candidates can win each county and each state to win the electoral college.

    • Aggregating data about celestial body locations, modeling planetary surfaces or other anomalies in space.

2/9/22 - 4 Hours (15)

  • Took DinoVR screenshots and posted to the board.

  • Selected my favorite project idea, the "Cities: Skylines"-style traffic simulation, and did research and planning for this project.

    • Cities Skylines -style traffic simulation. Visualizing and modeling traffic in metropolitan areas and how different traffic structures can work differently in various locations.

    • Analyze rush hour traffic design patterns such as toll roads, HOV3 (high occupancy vehicle 3+) lanes, reversible lanes, etc.

      • Determine the effectiveness of specific types of traffic patterns when used in specific configurations.

      • Implement models of various different traffic structures

      • Include real world traffic data and/or simulated traffic data

    • Proposed In Class Activity: Customize your own intersection or road or other structure to mess with the simulation/data

    • Proposed Deliverable: Wiki pages about VR simulation software.

  • Proposed Project Milestones:

    • 2/15 -- Finalize the software that I will be using for the visualization, and create an empty project in that software

    • 2/17 -- Get the empty (or almost empty) project to run on the VR headset

    • 2/22 -- Get the first models / real 3D objects into the project and to show up on the headset

    • 3/01 -- Full 3D environment is visible / imported to project

    • 3/03 -- First logic / simulation / animation is visible inside the headset

    • 3/08 -- Multiple simulations / animations are visible inside the headset

    • 3/10 -- Multiple traffic structures are configurable / customizable inside the headset

2/16/22 - 4 Hours (19)

  • Re-evaluated flaws with past project ideas to identify what needed to be addressed for my project proposal

  • Communicated with Ross about reconstructing my project idea... re-brainstormed the traffic data project idea

  • Journal self-evaluation:

    • Journal activities are explicitly and clearly related to course deliverables: 4

    • deliverables are described and attributed in wiki: 3

    • report states total amount of time: 4

    • total time is appropriate: 4

  • Wrote new project plans and proposal. Proposal is linked at the top of this page.

2/17/22 - 3 Hours (22)

  • Continued editing project proposal.

  • Created project plan PowerPoint presentation. Presentation is linked at the top of this page.


Peer Journal Evaluation:

  • Journal activities are explicitly and clearly related to course deliverables: 4

> Detailed proposal

> Links at top

  • deliverables are described and attributed in wiki: 3

  • report states total amount of time: 4

  • total time is appropriate: 3 (~30 hours is staying on track)

By: Ross Briden

2/22/22 - 3 Hours (25)

  • Downloaded ArcGIS and ArcGIS Pro from Brown IT software catalog

  • Downloaded Unreal Engine 4 from Epic Games Launcher and downloaded City Engine project template

2/25/22 - 2 Hours (27)

  • Explored tutorials for ArcGIS Pro and attempted to follow along

    • Setback: Realized that despite downloading from Brown software catalog, I need to reach out to someone to get a licensed account for ArcGIS Pro

    • Reached out to get my ESRI account connected to the Brown University organization

  • Explored online resources on Forum8 Urban Planning VR Studio

    • Came to the conclusion that ArcGIS was definitely well-suited for my target project idea and that I should focus on ArcGIS rather than Forum8 from here on

2/27/22 - 5 Hours (32)

  • After getting my ESRI account connected to the Brown University organization, I could actually follow along on the Quick Start tutorials in ArcGIS

    • Initially focused on 2D and non-VR projects to get the gist of how to work with map data and how to work with Excel spreadsheet data

    • Followed the ArcGIS Pro tutorial for displaying a population density heat map and comparing it to city bus routes. Also followed the tutorial for visualizing the flood risk in an urban area, which was a 3D visualization on the desktop.

  • Linked Unreal Engine 4 editor to Visual Studio 2019 and installed the necessary .NET C++ Frameworks.

  • Launched the City Engine UE4 project template in the UE4 editor.

    • Needed to debug for a while to figure out why the project was not compiling. UE4 was not detecting the .NET Framework initially, so I needed to run a tool inside of VS2019.

    • The UE4 editor clearly shows the City Builder environment on my PC and just needs a City Engine file to be imported in order to run on a VR headset.

2/28/22 - 5 Hours (37)

  • Used and experimented with "ArcGIS 360 VR" in the Oculus Browser running natively on the Quest 2.

    • This tool is incredibly easy to use and simple compared to the desktop versions of ArcGIS Pro. It was essentially instant to open on the HMD in the browser (no installation necessary).

    • It appears that this tool is too basic to do any extensive research work besides visualization, and it appears to lack collaborative features. I will have to look further into whether ESRI offers any other online web-based VR apps that include collaboration, since a basic solution like this may be ideal for a class activity due to its lack of need for installation.

  • Created PowerPoint presentation with project progress update for class on 3/01.

  • Read into the documentation on the differences between ArcGIS Pro and ArcGIS City Engine.

    • ArcGIS Pro is closely associated with City Engine and shares many file types, despite the two not technically being the same product. It appears that you can export and convert files between the different apps in the event this is necessary.

  • Read into the documentation for the UE4 VR Experience template.

    • The setup of the template in the UE4 engine is essentially complete. All that needs to be done is import a City Engine file before I will be able to preview the app in VR.

    • There are a significant number of advanced features in the UE4 template that may not be necessary to mess around with given how advanced ArcGIS Pro already is.

  • Planning / re-thinking about project deliverables:

    • Attempted to work on starting ArcGIS Pro tutorial (or at least documenting my learning process on a wiki page) but ran into connection issues on Google Sites :(

    • Class Activity: The ideal solution for the activity may be to compile/export the UE4 project, so the class can use the compiled project collaboratively without needing to install ArcGIS Pro or the UE4 editor. This would speed up the setup process and improve the experience, since I can now confirm with certainty that ArcGIS should be able to satisfy all my desired functionality for my activity. The UE4 project will be necessary unless I can find some web browser-based solution that meets my ideal criteria (which, as described above, may not be possible).

3/2/22 - 2 Hours (39)

  • Took screenshots for Google Earth VR Activity (since screenshots were missing from earlier).

  • Completed Beatrice's UNH Point Cloud activity and submitted feedback form (since Paperspace was malfunctioning during class).

  • Set up Virtualitics in advance for Amanda's activity.

3/5/22 - 2 Hours (41)

  • Watch YouTube tutorials on using ESRI CityEngine

3/6/22 - 2 Hours (43)

  • Attempt to find license for / install CityEngine, which was previously thought to be included within the ArcGIS suite

3/7/22 - 2 Hours (45)

  • Call ESRI Customer Service phone line. Confirm that the educational license does not include CityEngine. Apply for a "for-profit business free trial" for CityEngine.

  • Download/install CityEngine. Import ArcGIS files from last week into CityEngine.

3/11/22 - 2 Hours (47)

  • Watch YouTube tutorials on CityEngine. The most recent official video tutorials were from 2014 and severely dated. Unofficial tutorials were not much better.

  • Switched to reading text tutorials on the ESRI website.

3/12/22 - 4 Hours (51)

  • Download and import example CityEngine projects, including Venice and a couple fictional cities.

    • Download satellite imagery from OpenStreetMap for Venice and map it to the CityEngine project.

  • Export CityEngine models to Unreal Engine Datasmith format. This took a few attempts since you need to export the models, not the project.

    • This took some experimentation since you need to export the correct layers; otherwise you might not see anything in Unreal, or extra geometry might cover the important parts. I had to carefully select all the layers I wanted to export before entering the export model dialog within CityEngine.

  • Import Datasmith files into the Unreal Engine project. After the import, I attempted to view a preview of the project in simulation mode on the desktop (outside of VR). Since this is a VR project, I could not move around very well, so I quickly aborted and switched to trying it in the VR headset.

  • When streaming over Virtual Desktop, the Unreal project had an unusable framerate, despite Virtual Desktop working fine for all other apps. I ended up needing to use an Oculus Link cable and force quit SteamVR to get a playable framerate. (I used the Oculus app instead of SteamVR, even though SteamVR should be compatible. I will look into this more before my class activity.)

  • In VR, I was able to teleport around the "meeting room" and see what was supposed to be the map on the table. I could tell it was the map, and I could interact with it (pan, zoom, rotate) but it was mostly black.

3/13/22 - 4 Hours (55)

  • Discovered that if you bent your head down and zoomed into the map in VR enough, you could see the 3D representation/map of Venice underneath a flat black layer. This must have been the result of me selecting too many layers to export from CityEngine. This was still a breakthrough moment though, since I realized I could see the full interactive 3D map inside of VR.

  • Since I did not have datasets downloaded for Venice, I decided to attempt to download OpenStreetMap data for New York City. Discovered that the OpenStreetMap dataset had many more options that could be enabled, but it was unclear what most of them were or how useful they would be. Might be worth experimenting with these options later. The ones I enabled were "highways," "walkways," and "buildings."

    • I ended up disabling the "highways" and "walkways" layers since they looked scuffed. When disabled, streets were more difficult to see due to the satellite imagery, but the other view was too messy to be acceptable.

    • 3D buildings showed up perfectly. All the 3D buildings were the same color and did not have any material / different coloring at this time.

      • It may be ideal to set different buildings to different colors to display a particular dataset related to buildings in the area.

  • The area that I downloaded from OpenStreetMap (and also the size of the CSVs that I downloaded from NYC Open Data) was too large to have acceptable performance in CityEngine or Unreal Engine. I ended up narrowing the scope to Central Park, since it has a significant green area surrounded by a lot of buildings, allowing for some variety in the visualizations I am able to do.

  • I discovered that expanding the "footprints" tab in the scene explorer lets you view the full list of models in the scene, each named after the building it is part of. This may allow for some scripting process to filter out certain buildings or edit certain buildings in specific ways (color coding, etc.)

3/14/22 - 7 Hours (62)

  • Messing around with CityEngine desktop to figure out how to write scripts and/or import CSV files.

    • Tutorials mention that you can import CSV files, but the instructions are incredibly unclear. Significant experimentation was necessary for this step.

    • A scripting menu allows you to import Python files, which seems promising, but again I could find essentially no documentation on any SDK or way of scripting anything I am interested in with Python.

  • A lot of time was spent trying to find and/or understand tutorials on importing ArcGIS Pro datasets/projects into ArcGIS CityEngine. Though the website clearly shows that they are part of the same software suite and package and should share functionality, it appears that it is potentially impossible to bring project data from ArcGIS Pro into CityEngine. This is unfortunate since ArcGIS Pro has a significantly better interface for interacting with datasets (CSV files, etc.) but only CityEngine can be used in VR.

  • Created wiki page for ArcGIS CityEngine.

    • Planned to also make a page about ArcGIS Pro, though currently that content lives on the same page. Might expand it into two pages in the future.

  • Cooked the first compiled Unreal Engine executable for the NYC Central Park map. The cook failed with the error "EXCEPTION_ACCESS_VIOLATION" and I did not have enough time to debug it. The next step after getting the package to actually cook is to set up the multiplayer server functionality.

3/15/22 - 6 Hours (68)

3/20/22 - 4 Hours (72)

  • Briefly researched Photon Engine's Realtime for Unreal Engine. This ended up not looking feasible since I am not familiar enough with Unreal Engine's networking or C++ to be confident making such a sizable code contribution to an unfamiliar codebase.

  • Installed ngrok to experiment with tunneling and attempt to make a multiplayer connection outside of the local network.

  • Found where the sockets are being set up in the ArcGIS Unreal Engine code. This took a long time since I don't have much Unreal Engine or C++ experience. I sought to change the range being scanned to a specific hostname (the ngrok address).

  • Debug EXCEPTION_ACCESS_VIOLATION which re-appeared when attempting to compile to binary form again.

3/21/22 - 4 Hours (76)

  • Read the Seven Scenarios Paper.

    • My project 1 relates to the seven scenarios most through the Communication through Visualization (CTV) scenario and through the Collaborative Data Analysis (CDA) scenario. My project 1 deliverables focus on taking in data and being able to present/communicate it in different ways on the OpenStreetMap views, and the deliverables also focus heavily on group analysis and group sensemaking.

  • Project 2 Brainstorming/Ideas:

    • A lot of my project ideas stem from the fact that I don't want to work with Unreal Engine or ArcGIS again, and I would rather move into WebXR-based projects. I was inspired by Jakobi's work with A-Frame and socket.io to build VR apps that can run natively in the web browser, so I might seek to use his resources from his first project.

      • I did some reading into aframe, aframe-react, and took a look at Jakobi's project 1 GitHub repositories to learn more about the frameworks.

    • Write draft project proposal for "Using WebXR to Visualize Map Data" project 2. (The proposal document can be found linked at the top of this document).

      • This project will visually examine temporal map data, such as population demographic changes over time from the U.S. Census or the correlation between city infrastructure (such as building zoning or transportation volume) and environmental indicators (such as average temperatures or ozone levels). For the latter analysis, this allows us to gain insight into phenomena such as the Urban Heat Island Effect, as well as other correlations that may affect the environment as measured in that area.

      • Users will be able to stand and walk around on top of a room-scale map in a shared environment with other collaborators. Users should be able to see each other’s avatars in the shared environment and navigate across the map.

      • This project would include some novel code creation based on existing WebXR and server frameworks. The project should be generic enough to be reused with many data sets that follow the same basic format (map format with bar graphs, etc.). I anticipate this project will visually look like either bar graphs vertically rising out of states/territories on the map, or heatmaps on the map surface.

      • Software:

      • Wiki Deliverables

        • Contribute further to the A-Frame and Socket.io wiki pages

        • Work with Jakobi to continue building the open source wsmulti project

      • In-Class Activity

        • Participants will logon to the web address that the WebXR client is hosted at. Participants will join the shared environment, identify their partner within the room, and reproduce a couple screenshots taken from various positions. At the conclusion of the activity, a feedback form will collect the user’s perceived usefulness and suggestions for improvement.

      • Seven Scenarios

        • Communication Through Visualization (CTV) - This project takes in temporal map data and can communicate and present it to the user in VR. We can analyze if people learn better/faster using this tool compared to other tools, and whether useful information can be extracted from the visualizations.

        • Collaborative Data Analysis (CDA) - This project supports collaborative analysis and work in a shared environment. We can analyze how intuitively users are able to interact with and identify each other, as well as whether the social exchange in VR makes the learning better/faster.

3/23/22 - 2 Hours (78)

  • Finish planning Project 2 deliverables and milestones

  • Create Project 2 proposal presentation

3/24/22 - 1 Hour (79)

  • Research using Danfo.js for pre-processing data in JavaScript

4/1/22 - 8 Hours (87)

  • Install development tools on my home PC (I was at home this week due to spring break)

  • From Unreal Editor, I found out that I could link my Unreal project to Visual Studio 2022 to compile the C++ code in VS since the build was failing when compiling from Unreal Editor.

    • I found it was necessary to "Build > Build Solution" in VS2022 before "File > Cook Content for Windows" on Unreal Editor. Then I was able to package the project to an external executable.

  • Created a new empty project with the base CityEngine template to simplify the learning process of packaging (in case something with my Datasmith files was buggy/wrong, since at this point I was encountering lots of packaging errors and build failures).

  • Research how to package projects to executable from Unreal Editor.

    • I learned about the configuration settings under "Project Packaging Settings". I had to experiment with using "Pak files" and different build configurations, including "Debug," "Development," "Test," and "Shipping."

    • Each time I experimented with modifying the packaging settings, I needed to recompile/recook/rebuild the project to see if the package was successful. This process took several minutes for each attempt.

    • In the end, the final packaging settings I found to work were: Use Pak File, Shipping mode, Full Rebuild, For Distribution, Include Prerequisites Installer.
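
For reference, the same options can be pinned in the project's Config/DefaultGame.ini instead of toggling them in the editor UI. The section and key names below are my best recollection of UE4's ProjectPackagingSettings and should be double-checked against the editor:

```ini
[/Script/UnrealEd.ProjectPackagingSettings]
BuildConfiguration=PPBC_Shipping
FullRebuild=True
ForDistribution=True
UsePakFile=True
IncludePrerequisites=True
```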

  • After successfully packaging the project, a folder called "WindowsNoEditor" was generated with the packaged game for Windows 64-bit, allowing me to run the project on any machine (even those without Unreal Editor or development software installed).

  • The packaged folder consisted of many files and totaled over 1GB, so I researched how to use 7-Zip to create an SFX archive (self-extracting archive executable). 7-Zip allows you to create an EXE "installer" that automatically extracts itself similar to a real installer, using the more compact 7z format (instead of zip) without requiring 7-Zip installed on user machines.

  • Install the WindowsNoEditor package with the SFX archive on the Paperspace machines (which don't have Unreal Editor) to confirm that the packaging process worked!

  • Attempt to get the multiplayer server to launch

    • Searched all the source code files in Visual Studio related to multiplayer or networking. Discovered that there are no files specific to the CityEngine project template for multiplayer or networking; it is all implemented by Unreal Engine's "Multi-User Editing" plugin.

    • Researched the Multi-User Editing plugin and found configuration settings related to "UDP Messaging" under the Unreal Editor's "Project Settings".

    • Experimented with hard-coding the Unicast Endpoint and Multicast Endpoint to the IP address of the server.

    • Experimented with configuring Static Endpoints for the server and the individual client according to the (admittedly confusing) documentation on the Unreal site.

    • Experimented with manually editing and adding lines to the "DefaultEngine.ini" file in the Visual Studio project. Attempted to hard-code the server IP address in this ini file as well.
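
For reference, a sketch of the kind of DefaultEngine.ini overrides I was experimenting with for the UDP Messaging plugin (the endpoint addresses are placeholders, not my actual server's values):

```ini
[/Script/UdpMessaging.UdpMessagingSettings]
EnableTransport=True
UnicastEndpoint=0.0.0.0:0
MulticastEndpoint=230.0.0.1:6666
+StaticEndpoints=192.168.1.50:6666
```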

    • In the end, the app worked perfectly in single player, but I was never able to get a client to join a server, or even to acknowledge that I was attempting to make a connection (no messages or logs were ever displayed on the client). When the host started the server, the room was reset to default configuration and Windows Firewall displayed a message prompting to allow access (which was promising).

      • Unfortunately the client interface provided by the Unreal Multi-User plugin is designed to be automatic and "magic" by discovering the server on the network using multicast/unicast. There is no menu when connecting from the client side, and there is no option to manually enter an IP address, and there are no log messages or errors displayed. This made the process nearly impossible to debug.

  • I eventually found a 10-month-old Reddit post on r/UnrealEngine that pointed out it was impossible to use the Multi-User Editing plugin on separate LANs (what I was attempting to do). The Redditor suggested using VPNs to simulate a local connection. I decided to attempt using a VPN tomorrow.

4/2/22 - 9 Hours (96)

  • Research setting up an OpenVPN server. The VPN server should allow everyone to install a VPN client on their Paperspace machine, allowing all the machines to appear as if they are on the same LAN and hopefully letting them discover each other.

  • Set up an OpenVPN server on my local home network with pfSense and UDP4.
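
pfSense configures OpenVPN through its web GUI, but the equivalent standalone server config it generates looks roughly like this sketch (the subnet and file names are illustrative):

```conf
port 1194
proto udp4
dev tun
topology subnet
server 10.8.0.0 255.255.255.0
ca ca.crt
cert server.crt
key server.key
dh dh.pem
keepalive 10 120
persist-key
persist-tun
```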

  • Create packaged installers that automatically install the OpenVPN client and configure the client to connect to my home VPN server, so these VPN clients can be distributed to the class for the in-class activity.

  • Set up SSL certificates and user accounts for the OpenVPN clients since those are necessary if you wish to allow connections from WAN.

  • Install and configure the OpenVPN clients on my Paperspace virtual machines. Test that I am still able to connect to the machines via Virtual Desktop in VR even while connected to the VPN.

    • Getting Virtual Desktop to work while the Paperspace machine was connected to a VPN was challenging -- I got the "Computer is unreachable" error at first. The FAQ says Virtual Desktop may not work while on a VPN. I played around with configuring a DMZ host on the router and an Open NAT in the networking settings, then I was eventually able to get Virtual Desktop to work again.

  • With both Paperspace machines connected to my VPN, the machines should hypothetically be on the same LAN and subnet (my home network LAN), and hopefully they are able to discover each other and the client can connect to the server.

    • The result was the same as yesterday -- the server was able to start and prompted the Firewall message, but the clients gave absolutely no feedback when I pressed the join button. I tried messing around with the OpenVPN settings, trying full auto discovery as well as hard-coded IP addresses pointing to the server machine. None of these attempts got the clients to show up on the server.

  • I eventually had to give up on getting the multi-user functionality working, and instead worked on documenting my processes on the wiki and what did/didn't work for me along my attempts.

  • Created a wiki page documenting the process I used to package my Unreal Engine project to Windows 64-bit.

  • Created a wiki page documenting the process I used to create a self-extracting archive executable with 7-Zip.

  • Created a wiki page documenting some of my attempts (and ultimate failure) to use Unreal Engine's Multi-User plugin.

  • Began making a wiki page for setting up an OpenVPN server, but ended up not finishing it after realizing it did not have much relevance to most VR projects (and many good tutorials on the internet already exist).

  • Set up a GitHub organization for the "A-Frame Alliance" for all those that plan to collaborate on A-Frame tools for project 2.

4/3/22 - 3 Hours (99)

4/12/22 - 3 Hours (102)

4/13/22 - 3 Hours (105)

  • Copy over Jakobi's wsmulti-client and wsmulti-server into the VRWiz project.

  • Convert the VRWiz backend to TypeScript and start building the frontend in Vite. Set up the HTTPS service for the backend.

4/14/22 - 4 Hours (109)

  • Convert the frontend VRWiz project to TypeScript and debug getting the page to open with Vite.

  • Set up an NPM script to automatically configure HTTPS for both the frontend and backend development servers (as well as the backend production server).
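
The journal doesn't record which tool generated the certificates; assuming something like mkcert, the NPM script could look like this hypothetical package.json fragment:

```json
{
  "scripts": {
    "setup-https": "mkcert -install && mkcert -key-file key.pem -cert-file cert.pem localhost"
  }
}
```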

  • Continue setting up a shared GitHub repo for the classmates using A-Frame.

4/17/22 - 2 Hours (111)

  • Set up Cloudflare Pages to automatically deploy the frontend in production (main branch) as well as preview builds on PRs

4/18/22 - 7 Hours (118)

  • Set up a GitHub Actions workflow to automatically build the frontend and backend

  • Begin setting up a GitHub Actions workflow to automatically deploy the backend over SFTP

  • Research setting up a self-hosted Node server using PM2

  • Attempt to start converting the VRWiz repo to use aframe-react so VRViz can be integrated (and because I like React)

    • Create a Vite React app and research aframe-react and alternatives

    • Create a basic scene in @belivvr/aframe-react

4/19/22 - 2 Hours (120)

  • Create PowerPoint for Project 2 check-in with progress report

  • Fix the VR button not working in the React app

    • By default, all the React app contents go inside a root div, but A-Frame's CSS does not like this, so we can tell React to put the contents inside the document body instead of the root div.

4/20/22 - 4 Hours (124)

  • Set up an Ubuntu server VM with an SFTP server and a Node.js server to host the backend

  • Finish the GitHub Actions workflow to target this new server VM and automatically deploy the backend after every push to the main branch

  • Debug some issues related to NVM (Node Version Manager) and SSL certificates (HTTPS is required for VR web apps)
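
A minimal sketch of what such a build-and-deploy workflow can look like (the repository layout, secret name, host, and paths are all assumptions, not the actual project configuration):

```yaml
name: Deploy backend
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - run: npm ci && npm run build
      # Hypothetical SFTP upload using an SSH key stored in repo secrets
      - run: |
          echo "${{ secrets.DEPLOY_KEY }}" > deploy_key
          chmod 600 deploy_key
          sftp -i deploy_key -o StrictHostKeyChecking=no deploy@backend.example.com <<< "put -r dist"
```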

4/27/22 - 3 Hours (127)

· Add “Hello World” express route to backend to more easily detect when the backend is online

· Add, debug, and fix environment variables used to tie the frontend to the backend

o (Vite requires that client-visible environment variables be prefixed with “VITE_”)
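For example, a backend URL variable in the project's `.env` file would look like this (the variable name and URL are hypothetical):

```
# .env – only variables prefixed with VITE_ are exposed to client code
VITE_BACKEND_URL=https://vrwiz.example.com
```

In the frontend, Vite exposes it as `import.meta.env.VITE_BACKEND_URL`; unprefixed variables stay server-side only.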

5/2/22 - 4 Hours (131)

· Experiment with improving the visuals of the environment by playing with skyboxes and ground images

· Update the types and interfaces to include attributes like usernames and colors for each player

· Update the backend server to receive and distribute these new user objects with the usernames and colors
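The server-side bookkeeping described above can be sketched as a small pure function that merges a partial update into a map of connected users before broadcasting. The field names and the socket-ID key are assumptions, not the project's real schema:

```javascript
// Merge a partial user update (username, color, ...) into the users map,
// returning a new map so the previous state is never mutated in place.
function applyUserUpdate(users, socketId, update) {
  return {
    ...users,
    [socketId]: { ...(users[socketId] || {}), ...update },
  };
}
```

On each incoming socket message, the server would apply the update and emit the resulting users map to every connected client.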

5/4/22 - 10 Hours (141)

· Set hand cones on the frontend to display the color on each user object.

· Write a color hashing algorithm: hash the player’s name and turn that hash into a specific color. The function is deterministic, so the same username will always produce the same color.

o I experimented with using the socket ID to determine the color instead, but I decided that I wanted colors to persist across sessions.

o I experimented with allowing arbitrary colors, but decided to pick from a curated array of around a dozen colors instead, which keeps every assigned color from being ugly.
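The deterministic name-to-color mapping can be sketched as a small pure function. The palette values and hash constants below are illustrative assumptions, not the project's actual ones:

```javascript
// Hypothetical curated palette of "not ugly" colors.
const PALETTE = [
  "#e6194b", "#3cb44b", "#ffe119", "#4363d8", "#f58231", "#911eb4",
  "#46f0f0", "#f032e6", "#bcf60c", "#008080", "#9a6324", "#800000",
];

function colorForName(name) {
  // FNV-1a-style string hash: the same input always yields the same hash.
  let hash = 2166136261;
  for (let i = 0; i < name.length; i++) {
    hash ^= name.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  // Map the hash onto the fixed palette, so colors persist across sessions.
  return PALETTE[Math.abs(hash) % PALETTE.length];
}
```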

· Add an avatar head that synchronizes with the headset position and rotation.

o I gave each head a basic face that was modeled with five spheres: a sphere for the head, a white sphere for each eye, and a black sphere for each pupil. I played around with the scaling and positioning of these spheres until the avatar looked satisfactory.

o I had to add this head position to the user object to be synchronized with all the other clients through the backend, so the other clients could see the position of everyone else’s heads.

· Added aframe-extras: an A-Frame extension library that let me add joystick-based locomotion.

o I needed to surround the avatar component in the DOM with a div that I called rig. When the user used the joysticks to move around, I would change the position of this “rig” div, which would in turn move the controllers and head around together.

o I needed to make a function to get the absolute position of an avatar to combine the position of the rig with the relative position of the head within the rig. This absolute position would be necessary for the other users to display accurate positions on the client.

§ I spent a lot of time working on debugging this conversion from relative position to absolute position – this was difficult since the layout of the rig with the head was different on each device.
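Ignoring rotation, the rig-relative to absolute conversion reduces to adding the rig's world position to the head's offset within the rig. This is a simplified sketch of the idea (function and field names are assumptions); it does not handle a rotated or scaled rig:

```javascript
// Combine the "rig" container's position with the head's position relative
// to the rig to get the head's absolute position in the scene.
function absoluteHeadPosition(rigPos, headRelPos) {
  return {
    x: rigPos.x + headRelPos.x,
    y: rigPos.y + headRelPos.y,
    z: rigPos.z + headRelPos.z,
  };
}
```

In real A-Frame code, three.js can do the full transform (rotation and scale included) via `el.object3D.getWorldPosition(target)`.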

· Add a nametag to display the username above the avatar.

· Add a plane to the floor with a map chart of population growth. Experiment with the ideal position and size of this map chart.

· Add a raycaster to both controllers to allow users to point a beam / laser pointer from their hands.

· Add hyperlinks and link traversal to other web VR projects! These show up as portals in the environment.

o The link traversal controls are finicky and a little confusing, so I may end up disabling them if they get too distracting.

· Add extensible bar graph framework!

o I created a basic library that allows someone using the VRWiz project to import data as a JSON file and easily display it using bar graphs without having to write any code for aframe components. The user of the library can place an automatically scaled bar at any coordinate within the scene.

o I could eventually package this library as a separate NPM package if I desired, though for now it remains tied to the VRWiz repository.

o This can eventually be extended to put bars on spots on a map at the location of states or counties.
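The auto-scaling idea behind the bar-graph framework can be sketched as a pure function that maps raw data values onto bar heights that fit the scene. The function name, field names, and 2-meter ceiling are assumptions, not the library's real API:

```javascript
// Scale each data point so the tallest bar is exactly maxHeight meters tall.
function scaleBars(data, maxHeight = 2) {
  const max = Math.max(...data.map((d) => d.value));
  return data.map((d) => ({
    label: d.label,
    height: (d.value / max) * maxHeight,
  }));
}
```

Each scaled bar could then be rendered as an A-Frame box entity at its chosen coordinate, with `height` driving the box's y-scale.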

5/5/22 - 2 Hours (143)

· Attempt to fix/improve the absolute positioning of cones/controllers relative to the rig body

· Create Google Form for in-class activity

· Create wiki page with instructions, controls documentation, and screenshots for in-class activity

5/14/22 - 2 Hours (149)

· Catch up on recent journal entries

· Created wiki page with JavaScript WebSocket API tutorial <Custom JavaScript Multiplayer with Web Sockets>

5/17/22 - 4 Hours (153)

· Finished wiki page with WebSocket tutorial
· Created wiki page with locomotion tutorial <Locomotion with aframe-extras>
· Improved and expanded wiki page with SideQuest tutorial <Sideloading with SideQuest> -- added details on enabling developer mode for the first time and added screenshots to every step

5/18/22 - 6 Hours (159)

· Created wiki page with APKPure explanation and tutorial <Sideloading with APKPure>
· Created reflection page with opinionated analysis of my project 1 <Nick's Project 1 Reflection>
· Created reflection page with opinionated analysis of my project 2 <Nick's Project 2 Reflection>
· Polished the VRWiz demo based on feedback following my in-class activity
· Finished designing and printed final poster!