Alastair Beeson's Journal

PROPOSALS

Project 1:

Project 1 Proposal: Proposal

Presentation for Project 1 Proposal: Presentation

Progress Presentation for Project 1: Presentation

End Presentation for Project 1: Presentation

In-class Activity: Activity

In-class Activity Data: Data


Project 2:

Project 2 Proposal: Proposal

Presentation for Project 2 Proposal: Presentation

Progress Presentation for Project 2: Presentation

End Presentation for Project 2: Presentation

In-class Activity: Activity


Demo Day:

Poster: Poster

Public Demo: Demo

HOURS SUMMARY

Total: 151 Hours

HOURS JOURNAL

Week 1

1/31/2022 - 6 Hours

  • Set up my journal page

  • Joined Slack

  • Reviewed Course Homepage

  • Identified Changes:

    • 10 Minutes:

      • Move the wiki logo and header: I am not entirely sure if this is a compatibility issue, but on my computer, the wiki logo and header take up a significant amount of space and push all of the links down the page. This makes it so that you have to scroll down to reach the course's homepage. I prefer the way the wiki displays in a smaller window where there is a header/hamburger menu combination.

      • I feel like this is a simple change, but it also highlights the challenge of having a wiki that functions primarily as an informational resource while also serving as a course page. I find it a bit frustrating to access course information. The fix could be as simple as moving the tab for Course Homepage, but I feel like that could hurt the wiki. Perhaps there is a more complex, time-intensive solution worth exploring.

      • Several of the buttons on the homepage don’t work. For instance, VR visualization tutorials and Intro to WebVR. These should be fixed so that users can access those resources. (Fixed this one)

      • 1 Hour:

        • There should be more pages for potential applications of VR. They don’t all need to be heavily fleshed out, even just a paragraph or two. I feel like there are more fields than just those that are present now and several of the entries seem rather arbitrary. This could easily turn into a 10 hour task if someone decides to really add to a page (see the performing arts entry).

        • In several of the sections, there is a drop down entry called “Tutorials” buried amongst other content. I am wondering if these could all be moved to a separate tutorial section. I feel like as it is organized now, wiki readers would struggle to find these among the slew of other entries. Perhaps making a header for VR tutorials would help with visibility and accessibility.

        • I would be interested in making a landing page to aggregate arts related VR technology. I’m sure there is more beyond VR Art Gallery Software.

      • 10 Hours:

        • This could easily bloat in terms of scope, but potentially rethinking the homepage of the wiki and the way information is presented. Many of the homepage sections fail to surface content that already exists elsewhere on the site. While I like the “Where do I start?” section, it could be moved up for better visibility and also reworked to surface the information/links most pertinent to someone accessing the wiki. Since this is perhaps the most important page on the website, it should be an area of focus and iteration.

        • This is delving more into the realm of UI/UX Research, but potentially employing Eye Tracking and User Testing. While the wiki is definitely a resource for researchers and enthusiasts, we should also consider how users not in the class would interact with the site. I think this could be an interesting window into how the wiki works as a resource.

        • Potentially adding a better “onboarding” / VR education experience. What I mean is adding multimedia content like videos/images/graphics to help explain and reinforce concepts. If this is not really suited for a wiki, perhaps making a page with links to great informational videos/infographics. I know that people absorb information in different ways, and a holistic approach beyond merely text could better educate visitors.

  • Read up a bit on different VR headsets available and their features

  • Explored the debate about VR exclusives, challenges in VR software development and how VR projects are funded.

  • Thought: Is capitalism/competition already affecting the course and accessibility of VR?

  • Definitely want to explore VR ethics a bit more

  • Revisited the Hololens and its potential applications

  • Read about the “Integrated Visual Augmentation System”

Week 2

2/2/2022 - 2 Hours

  • Completed Quest 2 Setup Tutorial

  • Researched Potential Software

  • Unity

  • Unreal Engine

  • Paraview

  • Google Earth VR

  • Project Ideas:

  • Using vehicle and geographical data to visualize and analyze urban infrastructure. Could try Providence.

  • Visualizing NBA scoring from shot chart. Allows for analysis of different players, time periods, play style.

  • Using Twitter data to perform sentiment analysis in VR, seeing how people's feelings change over time


2/6/2022 - 3 Hours



2/7/2022 - 5 Hours

  • Searched the web for existing data sets

  • Spent some time on Kaggle searching and trying some exploratory data analysis

  • Set up Dino VR

  • Project Ideas Expanded:

  • Vehicle Data:

  • Things that could be done during the project:

  • Visualize a map of Providence in VR

  • Visualize vehicle data over the map

  • Evaluate areas for traffic congestion, environmental impact from emissions

  • Class Activity:

  • Everyone loads up the map of Providence and tries to find an area of the city where high vehicle use could have traffic or emissions implications

  • Deliverables:

  • A wiki page discussing how to use vehicle data

  • Evaluation of how the software performs with geographical data in VR

  • NBA Data:

  • Things that could be done during the project:

  • Visualize the shots of different players

  • Visualize how a player's shooting evolved over time

  • Immersive visualization of basketball court

  • Explore how traditionally 2D shot data could be translated to 3D and VR

  • Is there more data that could help this?

  • Class Activity:

  • Everyone loads up a dataset of a player and identifies a trend, e.g. a player started shooting more 3-pointers, took more shots, or favored one side. Then they postulate or research why that would be and explain.

  • Deliverables:

  • Tutorial on using NBA api with VR

  • Wiki page on what data types/data would be needed to transfer from 2D to 3D

  • Tweet Data:

  • Things that could be done during the project:

  • Learn how to use the Twitter API

  • Create a dataset using the Twitter API

  • Use sentiment analysis Python libraries to evaluate this data

  • What kind of metrics should I have?

  • Class Activity:

  • Explore a specific data set and find tweets with different sentiments, dates, subjectivity

  • Deliverables:

  • Tutorial on using Twitter API

  • Wiki page on sentiment analysis and how to perform it



Week 3

2/10/2022 - 3 Hours

  • Potential Project Plan:

  • Project: To investigate using Python data libraries, particularly NumPy, pandas, and TextBlob, with VR software. The data for this project would be tweet data. The goal is to visualize sentiment analysis in VR. For the particular VR software, I am aiming to use ParaView, since there are existing tutorials, but I may also explore VTK instead if I run into challenges.

  • 2/15: Create and evaluate final project plan

  • 2/17: Aggregate all existing documentation necessary for my project into one document, potentially to be added to the wiki

  • 2/22: Get all software installed and try to load NumPy. Also find a Twitter data set.

  • 3/01: Get pandas/TextBlob running and try to load a data set

  • 3/03: Decide how data is displayed and create visualization

  • 3/08: Explore adding collaborative elements

  • 3/10: Fully functioning data visualization of tweet data with sentiment analysis in VR

  • Researched the Python libraries I wanted to use for this project



2/14/2022 - 3 Hours

  • Researched NREL

  • Added questions for Kristi

  • Project Evaluation:

  • Project and milestones under evaluation: unchanged from the plan in the 2/10 entry above.

  • The proposed project clearly identifies deliverable additions to our VR Software Wiki: Agree. I feel this project could produce a detailed tutorial about how to use Python libraries in VR and a sample project.

  • The proposed project involves collaboration in VR: Neither Agree nor Disagree. I feel this is the biggest challenge and will need to explore further on how to add collaborative elements.

  • The proposed project involves large scientific data visualization along the lines of the "Scientific Data" wiki page and identifies the specific data type and software that it will use: Agree. This project's data qualifies as "Network Data". The data type is simply tweets, users, and associated information. I have also identified the specific software I will use: ParaView/VTK, pvpython, and virtualenv.

  • The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class: Agree. I feel this is realistic and achievable in the timeframe. My current schedule allows room for further expansion if I am ahead of schedule.

  • The proposed project includes an in-class activity: Agree. If I am able to load and visualize data in VR with Python, it should be straightforward enough for members of the class to load, visualize, and explore data of their own choosing.

  • The proposed project has resources available with sufficient documentation: Agree. There is decent documentation about using Python and its libraries with Paraview and VTK. This information is somewhat disparate so part of the project will be aggregating this information together.

  • Sent along project description PowerPoint



Week 4

2/16/2022 - 5 Hours

  • Read documentation and tutorials on pvpython, virtualenv, and TweetVR

  • Was able to load a simple data set in ParaView

  • Got TweetVR working in my headset

  • Studied the repo to see ways I could implement the API directly in my project

  • Milestone 1: Created a write up to be added to the wiki

  • Journal Evaluation:

  • Journal activities are explicitly and clearly related to course deliverables: 2

  • Deliverables are described and attributed in wiki: 2

  • Report states total amount of time: 5

  • Total time is appropriate: 3

  • I feel as if my journal could improve in every way. While I do meet the 10 hours and tend to jot down some notes about what I have read, I need to be more meticulous about adding to the wiki and really tracking my work. I also have a habit of doing all of my hours the night before class.


2/17/2022

Review of Paul’s Journal

  • Journal activities are explicitly and clearly related to course deliverables: 4 He has pretty consistent and detailed entries so far. I think they line up with course deliverables.

  • Deliverables are described and attributed in wiki: 3. Not many wiki contributions yet, but he hasn’t finished his project. He does link to his proposals though.

  • Report states total amount of time: 4. Really good and consistent accounting of hours.

  • Total time is appropriate: 4. So far he is keeping up with the 10 hours a week.

2/21/2022 - 2 Hours

  • Abandoning my previous plan

  • pvpython is frustrating to use

  • Not very excited about the software I have chosen

  • After seeing the WebVR demos in class, I want to do A-Frame for ease of use

  • Revised my project plan: use Python to preprocess sentiment data, then visualize it in A-Frame



Week 5

2/22/2022 - 6 hours

  • Started doing A-Frame tutorials like keyboard controls, making primitives

  • Read a lot of content on this page: https://aframe.io/docs/1.3.0/introduction/

  • Checked out some of these demos/lessons: https://aframe.io/aframe-school/#/

  • Set up a basic scene in Glitch.com hosting

  • Tried to set up collaboration using Deepstream.io tutorial

  • Discovered it has been deprecated

  • Began to research A-Frame components as well

  • Played around with a couple including:

  • https://github.com/morandd/aframe-heatmap3d

  • https://github.com/3DStreet/3dstreet
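
Side note for future me: A-Frame scenes are normally declared in HTML, but entities can also be created from JavaScript, which is how I have been experimenting in Glitch. A minimal sketch, assuming the page already loads the A-Frame script and contains an empty <a-scene>:

```javascript
// Minimal sketch: adding primitives to an existing <a-scene> from JavaScript.
// Assumes the page already loads A-Frame (e.g. the 1.3.0 release script)
// and contains an empty <a-scene> element.
const scene = document.querySelector('a-scene');

// A flat ground plane to stand on.
const ground = document.createElement('a-plane');
ground.setAttribute('rotation', '-90 0 0');
ground.setAttribute('width', '20');
ground.setAttribute('height', '20');
ground.setAttribute('color', '#7BC8A4');
scene.appendChild(ground);

// A box primitive floating at roughly eye height.
const box = document.createElement('a-box');
box.setAttribute('position', '0 1.6 -3');
box.setAttribute('color', '#4CC3D9');
scene.appendChild(box);
```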


2/23/2022 - 5 hours

  • Registered for the Twitter API and read through the documentation

  • Created a Jupyter notebook to process data for my project

  • Installed Textblob on my computer

  • Created a rudimentary scraping script in Python

  • Performed my first data scrape on NBA tweets

  • Determined what columns to drop vs keep

  • Got Textblob running after some trial and error

  • Some of the scores were a little off



Week 6

2/28/2022 - 1 hour

  • Created Update presentation

  • Prepped for Beatrice’s activity

3/2/2022 - 5 hours

  • Spent quite a bit of time trying to reverse engineer this project: https://github.com/hopetambala/aframe-NCAAB

  • Was attempting to learn how to combine the A-Frame and data processing

  • I specifically got fixated on this because they ran their Python scripts as part of the project rather than separately

  • Ultimately gave up and stuck to using Jupyter notebook

  • Tried to teach myself D3 to make interactive data objects

  • Prepped for Amanda and Jakobi’s activities


Week 7

3/07/2022 - 5 hours


3/08/2022 - 6 hours

  • Prepped for Tongyu and Mandy's activities

  • Finalized my script for scraping data

  • Tried to get tweets further back than the last week

  • Scraped and fully engineered a data set for sentiment analysis

  • Performed feature engineering to get a variable for time that could be easily graphed

  • Got VADER downloaded and working

  • Ran sentiment analysis with TextBlob and VADER on a data set about CNN

  • Performed some small statistical tests to compare them

  • Figured out how to download the datasets as a CSV from Jupyter

  • Converted CSV to JSON
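
For reference, the converted JSON ends up as a flat array of records. The field names below match the three columns I eventually kept; the values are made up for illustration:

```javascript
// Illustrative only: the record shape after exporting the Jupyter dataframe
// to CSV and converting to JSON. Values are invented, not from a real scrape.
const sampleRows = [
  { "time": 0.10, "sentiment": -0.45, "subjectivity": 0.80 },
  { "time": 0.35, "sentiment":  0.12, "subjectivity": 0.25 },
  { "time": 0.90, "sentiment":  0.67, "subjectivity": 0.55 }
];
```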


3/10/2022 - 10 hours

  • Tried to get data to work with Bryik scatter plot because it actually let you see the data

  • Gave up and decided to use Zcanter’s

  • Was able to get the component implemented in my application

  • Tried to make data points spherical with Three.js material

  • Created a whole new colormap from RGB values (this took over an hour; see the sketch at the end of this entry)

  • Lots of trial and error to get my data points to load correctly

  • The plot liked to arbitrarily choose the wrong fields on the data set

  • Had to re-scrape, process, and analyze my dataset

  • Dropped all fields except three: sentiment, subjectivity, and time

  • Had to create really crude scaling of the time variable

  • Finally got my dataset to load correctly

  • Got my colormap working

  • Spent a lot of time trying to make the data points interactive so that you could see the actual scores

  • Designed an A-Frame environment for the graph

  • Tried to implement the other version of Zcanter’s scatter plot which had better functionality

  • Ultimately his plot was made for geographical data more so than sentiment

  • After trying to even get the demo to work in Glitch and breaking my app multiple times, I gave up
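
To capture the general approach before I forget it, here is a simplified sketch of the idea (my own reconstruction, not Zcanter's component itself; the data path and axis scaling are hypothetical): fetch the converted JSON, spawn one sphere per record, and color it with the hand-built RGB colormap.

```javascript
// Simplified sketch (not Zcanter's component): load a JSON array of
// {sentiment, subjectivity, time} records and render each as a sphere.
const scene = document.querySelector('a-scene');

// Linear interpolation between two RGB triples, returned as a CSS rgb() string.
function lerpColor(a, b, t) {
  const mix = a.map((c, i) => Math.round(c + (b[i] - c) * t));
  return `rgb(${mix[0]}, ${mix[1]}, ${mix[2]})`;
}

// Map sentiment in [-1, 1] onto a red-to-blue colormap.
function sentimentColor(s) {
  return lerpColor([200, 30, 30], [30, 30, 200], (s + 1) / 2);
}

fetch('data/tweets.json')            // hypothetical path to the converted data
  .then((res) => res.json())
  .then((rows) => {
    rows.forEach((row) => {
      const point = document.createElement('a-sphere');
      point.setAttribute('radius', '0.05');
      // Crude axis mapping: time on x, subjectivity on y, sentiment on z.
      point.setAttribute('position',
        `${row.time * 4} ${row.subjectivity * 2 + 1} ${row.sentiment * 2 - 4}`);
      point.setAttribute('color', sentimentColor(row.sentiment));
      scene.appendChild(point);
    });
  });
```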



Week 8

3/14/2022 - 3 hours

  • Prepped for Maia, Shashidhar and Paul's activities

  • Created my presentation for my project 1

  • Tried once again to get Zcanter’s 2.0 scatter plot working for my demo

  • Specifically wanted to get spheres instead of boxes

  • Accidentally taught myself a lot about three.js meshes and geometry



Week 9

3/21/2022 - 2 hours

  • Read seven scenarios paper

  • Began to map out my project 2

  • Expanding on project 1 to make a dynamic visualization application?

  • This would fall into:

  • CTV: To assess the communicative value of a visualization or visual representation with regard to goals such as teaching/learning, idea presentation, or casual use.

  • CDA: To understand to what extent an information visualization tool supports collaborative data analysis by groups of people.

  • This would be an app for presenting data collaboratively and potentially providing instruction or insight into the data. Like a Jupyter notebook/PowerPoint for data in VR


3/23/2022 - 2 hours

  • Created Project 2 presentation

  • Created the write up below:


3/23/2022


Project 2 Plan


Topic: Collaborative and Accessible WebVR Visualization Application


Goal: To allow anyone to make a VR visualization with minimal technical knowledge or data background; to offer several different visualization options suiting different kinds of data and use cases; and to enable users to display and explain their data in an engaging way through interactive elements. In short, a Jupyter notebook/PowerPoint for VR data.


Deliverables:


Further contributions to the page about Socket.io

Further explanation and evaluation of A-Frame components

Documentation about implementing and interacting with A-Frame objects and components

Avatar design and control in A-Frame

UI/UX insights into what non-technical users find useful about VR, how VR interfaces can be designed to be accessible, etc.



Milestones:


4/05: Have base project hosted on Glitch.com

4/07: Have evaluated and created a list of best components

4/12: Have implemented at least 5 components i.e. you can load a data set

4/14: Add interactivity, like interacting with data points or highlighting them, and a basic GUI

4/19: Add rudimentary Socket.io implementation and basic avatars

4/21: Finalize first iteration of the app GUI

4/26: Have more fleshed out avatars and begin user testing (I want to test ease of use like loading a data set and choosing a visualization, how it handles multiple people, what is the collaborative experience like)

4/28: Make refinements to interface and experience based on user testing

5/03:

5/05:


Activity:


  1. Everyone gets to load, visualize, and explore a data set of their choice, several preprocessed ones will be provided on a demo page.

  2. I load a data set and everyone joins with me as the host. I conduct a basic tour and explanation of the data after which everyone is free to explore and create their own insights or mark interesting data points for discussion. I will also have several predetermined data points or types of data points I want people to find and take a picture of.
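
A rough sketch of the client-side Socket.io layer this activity would need. The event names ('move', 'peer-moved'), the avatar scheme, and the server relay are my assumptions, not an existing implementation:

```javascript
// Hypothetical client-side sync layer. Assumes the page loads
// /socket.io/socket.io.js from a Socket.io server (server code not shown).
const socket = io();
const rig = document.querySelector('[camera]');
const avatars = {};

// Broadcast my camera position a few times per second.
setInterval(() => {
  const pos = rig.getAttribute('position');
  socket.emit('move', { x: pos.x, y: pos.y, z: pos.z });
}, 100);

// Render a simple box avatar for each other participant. The server is
// expected to rebroadcast 'move' as 'peer-moved' tagged with a client id.
socket.on('peer-moved', ({ id, x, y, z }) => {
  if (!avatars[id]) {
    avatars[id] = document.createElement('a-box');
    avatars[id].setAttribute('color', '#FFC65D');
    avatars[id].setAttribute('scale', '0.3 0.3 0.3');
    document.querySelector('a-scene').appendChild(avatars[id]);
  }
  avatars[id].setAttribute('position', `${x} ${y} ${z}`);
});
```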



Project Analysis

  • The proposed project clearly identifies deliverable additions to our VR Software Wiki

    • Not Sure (While I think there are definitely deliverables already this will be an area to consider as I refine my project plan)

  • The proposed project involves collaboration in VR

    • Strongly agree

  • The proposed project involves large scientific data visualization and identifies the specific data type and software that it will use

    • Agree

  • The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class

    • Agree

  • The proposed project includes an in-class activity

    • Agree

  • The proposed project has resources available with sufficient documentation

    • Agree



Week 10


Break Week


3/30/2022 - 2 hours


  • Discovered the XR Accessibility User Requirements from the W3C

  • Familiarized myself with the categories

  • How can I incorporate into my application?



Week 11

4/6/2022 - 6 hours

  • Finalized and linked project 2 plan

  • See 3/23 for write up of plan

  • Tried to implement a speech command component, an A-Frame GUI component, and Oculus controllers on my Project 1 to test them for Project 2

  • The speech component wouldn’t work: https://github.com/lmalave/aframe-speech-command-component

  • Even the existing demos no longer function and using earlier versions of A-Frame does nothing

  • Outdated, which is a shame, as this is a major area of accessibility need

  • I got A-Frame GUI to work but couldn’t get the sliders to rescale the graph the way dat.GUI does (see the sketch at the end of this entry)

  • A-Frame GUI: https://github.com/rdub80/aframe-gui

  • Was able to get a rudimentary joystick control

  • Polished project demo for in-class activity: surrounding environment, axis colors, thickness legends, color legend

  • Read articles on Accessibility challenges in VR: https://blogs.scientificamerican.com/voices/virtual-reality-has-an-accessibility-problem/

  • Another one: https://www.unimelb.edu.au/accessibility/guides/vr-old
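
For the record, the slider behavior I was after boils down to something like this hypothetical component (a sketch of the intent, not the aframe-gui API):

```javascript
// Hypothetical sketch of what I wanted the GUI sliders to do: a component
// that rescales its entity whenever something emits a 'rescale' event
// carrying a scale factor.
AFRAME.registerComponent('rescalable', {
  init: function () {
    this.el.addEventListener('rescale', (evt) => {
      const f = evt.detail.factor;
      this.el.setAttribute('scale', `${f} ${f} ${f}`);
    });
  }
});

// Usage: put rescalable on the graph entity, then from any UI handler:
// document.querySelector('#graph').emit('rescale', { factor: 1.5 });
```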

4/11/2022 - 3 hours

  • Re-scraped, cleaned, analyzed, and converted all my data sets for the in-class activity

  • Slightly refined the scripts, both for efficiency and for a future tutorial on the course site

  • Created alternative data sets: Lakers, Supreme Court

  • Created new applications for alternate data sets

  • Refreshed knowledge of Bloom’s taxonomy



Week 12

4/14/2022

  • Paul and I discussed each other’s wiki pages: Photon PUN Tutorial and A-Frame Components

  • My journal definitely needs some work, lots of gaps and missing parts

  • I rate mine on the site a 2. Mostly notes in my google drive at the moment. Need to add more wiki contributions and better links.


Completeness Review of Jakobi

Please answer yes or no to each of the following questions:

  • Approximately 100+ hours logged, with hours listed at top of journal: No, hours aren’t accurate due to gaps.

  • Wiki contributions for first project are clearly listed at top of journal. Yes. However, one or two seem to be missing.

  • In-class activity is linked to top of journal. Yes

  • Project 1 proposal + presentation is linked to top of journal and is accessible. Yes

  • Project 1 final presentation is linked to top of journal and is accessible. Yes

  • Project 2 plan is linked to top of journal and is accessible. No

  • Project 2 presentation is linked to top of journal and is accessible. No


Quality Review

Please evaluate your partner’s overall journal using the following criteria:

  • 5 == exceeds expectations

  • 4 == meets expectations

  • 3 == mostly solid with some gaps

  • 2 == shows some progress

  • 1 == some attempt

  • 0 == not found

In addition, please explain why you chose the number you selected.

  • Journal entries are up to date: 3. Some gaps in the journal.

  • Journal activities are explicitly and clearly related to course deliverables: 4. They seem related to his project 2 and VR.

  • Journal entries demonstrate project progress (e.g. links, screenshots, or mentions of failure/success): 3. He is progressing well. Not the most consistent updates.

  • Deliverables are described and attributed in wiki: 3. Some missing deliverables/wiki contributions from project 1.


4/16/2022 - 5 hours

  • Experimented with VR Viz on my existing data sets

  • Read another article/white paper: https://www.microsoft.com/en-us/research/uploads/prod/2019/08/ismar_mra_workshop_microsoft_final_draft.pdf

  • Experimented with Haptic Feedback component: https://www.npmjs.com/package/aframe-haptics-component

  • Also tried out the sound component: https://aframe.io/docs/1.3.0/primitives/a-sound.html

  • Haptic feedback works really well and with recent versions of A-Frame. Could be a unique way to add tactile functionality.

  • Got the Oculus controller to vibrate when I moved the graph from my project 1

  • Sound component is built into A-Frame so it works really well

  • Was able to get ambient sound playing in the environment

  • Also created a sound emitting box with audio drop off

  • No luck getting mono sound

  • I want to make a more interactive sound based example in my activity or demo

  • Brainstormed ideas: How can these components enhance the experience of using a program particularly data in VR?

  • Moving or manipulating the data has haptic feedback to reinforce that you are moving the graph, sounds play when you select data, you can scale the GUI based on your tastes and vision, you can verbally control the graph
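
Two quick sketches of the sound and haptics pieces above, so I can reuse them. The asset paths and ids are placeholders, and haptics support varies by browser and headset:

```javascript
// 1) Built-in sound component: a box emitting looping audio that falls off
// with distance (A-Frame sound is positional by default). Path is a placeholder.
const speaker = document.createElement('a-box');
speaker.setAttribute('position', '2 1 -3');
speaker.setAttribute('sound', 'src: url(assets/ambient.mp3); autoplay: true; loop: true');
document.querySelector('a-scene').appendChild(speaker);

// 2) Roughly the browser API the haptics component wraps: the Gamepad
// extension's hapticActuators. Fires a short pulse on any controller that
// exposes an actuator.
function pulseControllers(intensity = 1.0, durationMs = 200) {
  for (const pad of navigator.getGamepads()) {
    if (pad && pad.hapticActuators && pad.hapticActuators.length > 0) {
      pad.hapticActuators[0].pulse(intensity, durationMs);
    }
  }
}
```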



Week 13

4/18/2022 - 3 hours

  • Created my progress report presentation

  • Full pivot to accessibility focused project

  • More interested in testing and evaluating A-Frame and its components

  • Want to create a write up about areas of strength and weakness for A-Frame

  • Add data visualization or collaboration a bit later

4/25/2022 - 3 hours

  • Prepped for Amanda and Mandy's Activities

  • Tested the outlines component: https://github.com/takahirox/aframe-outline

  • Interested in it as a way of adding contrast to objects for the visually impaired

  • Was able to get it to work on the example model in Glitch with an older version of A-Frame (0.8)

  • Also was able to get a box to have an outline with different thicknesses and colors

  • However, when I tried to get it to work with newer components in the same project, it wouldn’t play nice

  • Tested A-Frame Environment component: https://github.com/supermedium/aframe-environment-component

  • Pretty easy to set up

  • Added rainforest audio to an environment scene

  • Also just tried out the different options

  • Maybe using default for safe harbor features?
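
Setup sketch for my notes (assumes aframe-environment-component is loaded; the audio file is a placeholder):

```javascript
// Environment preset plus looping ambient audio on the same entity.
const env = document.createElement('a-entity');
env.setAttribute('environment', 'preset: forest');
env.setAttribute('sound', 'src: url(assets/rainforest.mp3); autoplay: true; loop: true');
document.querySelector('a-scene').appendChild(env);
```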



Week 14

4/25/2022 - 5 hours

  • Prepped for Sayan, Tongyu and Jakobi's activities

  • Read github post about adding eye tracking to A-Frame: https://github.com/Utopiah/aframe-eyetracking-blink-controls-component

  • Unfortunately I did not have an eye tracking device and it seems like more of a proof of concept than anything I could easily add to a project let alone recommend as a feature of A-Frame.

  • Read the following article: https://www.forbes.com/sites/solrogers/2019/02/05/seven-reasons-why-eye-tracking-will-fundamentally-change-vr/?sh=2bb766ce3459

  • GitHub post about the eye tracking problem in WebVR: https://github.com/aframevr/aframe/issues/4217

  • Decided to pursue gaze controls instead

  • Watched tutorials on how to implement gaze controls

  • Was able to change the color of a block by moving my head

  • Tried out teleport implementations

  • I tried Fernando’s implementation: https://github.com/fernandojsg/aframe-teleport-controls

  • These worked but were reliant on specific camera and controller settings

  • Might add to the in-class activity (sketch below)
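
Sketch of the gaze setup I got working (assumes the scene declares its own camera entity; the recolor component is my own, not a library one):

```javascript
// My own small component: recolor an entity when the gaze cursor "clicks" it.
AFRAME.registerComponent('gaze-recolor', {
  init: function () {
    this.el.addEventListener('click', () => {
      const hex = '#' + Math.floor(Math.random() * 0xffffff)
        .toString(16).padStart(6, '0');
      this.el.setAttribute('color', hex);
    });
  }
});

// Fuse-based cursor on the camera: staring at an entity for 1 s fires 'click'.
const camera = document.querySelector('[camera]');
const cursor = document.createElement('a-cursor');
cursor.setAttribute('fuse', 'true');
cursor.setAttribute('fuse-timeout', '1000');
camera.appendChild(cursor);

// Usage: <a-box gaze-recolor color="#4CC3D9" position="0 1.6 -3"></a-box>
```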


4/27/2022 - 3 hours

  • Prepped for Shashidhar, Aakansha and Lucia's activities

  • Tested the A-Frame Aria Reader Component: https://github.com/rvdleun/aframe-aria

  • Installed Chrome Screen Reader: https://chrome.google.com/webstore/detail/screen-reader/kgejglhpjiefppelpmljglcjbhoiplfn/related?hl=en

  • The component does work but struggles inside VR. It only really works when loading an A-Frame app where it will say out loud the aria-label message but will not say it again.

  • The component is mostly abandoned

  • It is also somewhat annoying to use, as the screen reader will instead read other messages

  • Overall, a functional component if it were desperately needed, but not demo-worthy


4/30/2022 - 6 hours

  • Work session primarily focused on implementing the superhands component: https://github.com/wmurphyrd/aframe-super-hands-component

  • I also tried to implement a physics engine

  • I wanted to create a demo with barriers forming an accessibility obstacle course. To make immovable barriers I needed physics, since you can simply pass through blocks in A-Frame

  • Also tried to use the A-Frame walls component: https://github.com/omgitsraven/aframe-room-component

  • While being able to make rooms and walls was cool, I couldn’t easily add a physics engine to them which somewhat defeats the purpose of using the component

  • Even struggled to get blocks to fall

  • Superhands refused to load the hand models

  • Really wanted gesture-based block grabbing, so that you could reach your arms out to grab blocks, but ultimately gave up

  • Almost voided my security deposit by reaching out too far
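
The setup I was going for, for future reference. This is a sketch assuming aframe-physics-system is loaded alongside A-Frame; I never got it working reliably:

```javascript
// Sketch of the intended physics setup (aframe-physics-system assumed loaded;
// script tags not shown).
const scene = document.querySelector('a-scene');
scene.setAttribute('physics', 'gravity: -9.8');

// An immovable barrier: a static body you should no longer pass through.
const wall = document.createElement('a-box');
wall.setAttribute('position', '0 1 -4');
wall.setAttribute('width', '6');
wall.setAttribute('height', '2');
wall.setAttribute('depth', '0.2');
wall.setAttribute('static-body', '');
scene.appendChild(wall);

// A block that should fall and collide as a dynamic body.
const block = document.createElement('a-box');
block.setAttribute('position', '0 3 -3');
block.setAttribute('dynamic-body', '');
scene.appendChild(block);
```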



Week 15

5/2/2022: 3 Hours

  • Prepped for Beatrice, Robert and Maia's Activities

  • Tested the keyboard component: https://github.com/supermedium/aframe-super-keyboard

  • Got it to change the color of objects

  • Then tried to get the event handler to also change other characteristics like environment, lighting, and more.

  • This one is a little tricky to use, so it may not make the in-class demonstration, but I have some cool ideas for demo day.


5/4/2022: 7 Hours

  • Prepped for Paul, Jennifer and Nick’s Activities

  • Put together a class demo with several accessibility components

  • Implemented superhands, though with very rudimentary interactions; it still has pinch functionality

  • Tried to use Annyang directly to add speech functionality to the in-class demo, since the speech command component is broken

  • Annyang didn't work well directly with A-Frame

  • Lots of trial and error to see what worked together; many components, like outlines and speech commands, relied on different and often outdated versions of A-Frame

  • Also added a few more features last minute to help make the experience more manageable, like Oculus movement controls. Ended up creating a pretty crude implementation that sometimes broke gaze controls

  • Teleport wouldn’t work with this strange controller and camera implementation, so I had to leave it out

  • Tried once again to add the A-Frame room component to make an obstacle course or accessibility escape room but couldn't get collision/physics implemented properly which made it pointless
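
The Annyang wiring I attempted looked roughly like this (assumes annyang's script is loaded on the page; the #orb id is illustrative):

```javascript
// Rough shape of the attempted speech commands: each phrase maps to a
// scene mutation.
if (window.annyang) {
  annyang.addCommands({
    'change color': () => {
      document.querySelector('#orb').setAttribute('color', '#FF6347');
    },
    'lights off': () => {
      document.querySelector('a-scene').setAttribute('background', 'color: #000');
    }
  });
  annyang.start();   // prompts for microphone permission
}
```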


5/5/2022: 2 Hours

  • Analyzed findings from my demo

  • Need for better movement

  • Unfortunately, the features my classmates asked for, like speech commands (which I had previously tried to fix), were mostly impossible to implement

  • Need to add guide or tutorial

  • Need to add additional functionality

  • Keyboard?



Week 16

5/10/2022: 5 Hours

  • Created my final project presentation

  • Sent slides in

  • Implemented keyboard commands into my in class activity

  • Now you can change the color of the orb

  • You can also change the brightness

  • You can also randomize the environment
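
Rough shape of those keyboard commands (the ids and presets below are illustrative stand-ins for the ones in my demo):

```javascript
// Sketch of the keyboard commands: recolor the orb, brighten the light,
// and randomize the environment preset.
const presets = ['forest', 'egypt', 'starry', 'japan'];

window.addEventListener('keydown', (evt) => {
  if (evt.key === 'c') {          // recolor the orb
    const hex = '#' + Math.floor(Math.random() * 0xffffff)
      .toString(16).padStart(6, '0');
    document.querySelector('#orb').setAttribute('color', hex);
  } else if (evt.key === 'b') {   // brighten the main light
    document.querySelector('#mainLight').setAttribute('light', 'intensity: 1.5');
  } else if (evt.key === 'e') {   // randomize the environment preset
    const pick = presets[Math.floor(Math.random() * presets.length)];
    document.querySelector('#env').setAttribute('environment', 'preset: ' + pick);
  }
});
```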




Week 17

5/16/2022: 7 Hours

  • Started Flash talk slide

  • Started Poster

  • Wrote Glitch.com wiki contribution

  • Wrote Sentiment Analysis wiki contribution

  • Wrote Twitter Data, API, Python and Tutorial contribution



Post Week 17

5/21/2022: 5 Hours

  • Kept working on Poster

  • Finished Poster

  • Wrote VADER and NLP evaluation wiki contribution

  • Wrote Jupyter Notebooks wiki contribution



5/22/2022: 4 Hours


  • Wrote A-Frame Usability Evaluation wiki contribution

  • Took a while, as I had to consult my notes and testing to formulate opinions



5/23/2022: 3 Hours

  • Tweaked final demo to add instructions

  • Tried to add new events like a light switch



5/24/2022: 3 Hours

  • Wrote A-Frame Animation Tutorial wiki contribution



5/25/2022: 2 Hours

  • Cleaned up Journal.