Papa-Yaw Afari's Journal
before | after |
------ | ----- |
1 | 4 | articulate AR/VR visualization software tool goals, requirements, and capabilities;
3 | 5 | construct meaningful evaluation strategies for software libraries, frameworks, and applications; strategies include surveys, interviews, comparative use, case studies, and web research;
2 | 5 | execute tool evaluation strategies;
1 | 3 | build visualization software packages;
3 | 5 | comparatively analyze software tools based on evaluation;
2 | 5 | be familiar with a number of AR/VR software tools and hardware;
3 | 5 | think critically about software;
3 | 5 | communicate ideas more clearly;
1 | 5 | build a vocabulary of terms, cues, and best practices for AR/VR experiences
PROPOSALS
Project 1 Proposal <https://docs.google.com/document/d/1ii7yYtzvd3pFGqmkNkMwE3YO2OraZHHHV8CKx7us_vA/edit?usp=sharing>
Presentation for Project 1 Proposal <https://docs.google.com/presentation/d/1onLfw7ipaaS2pnSKs89I3V5URwgmoSj9qbCvGwBA130/edit?usp=sharing>
End Presentation for Project 1 <https://docs.google.com/presentation/d/1QEWjhr_E_xFA4Kdcyd3QCdLFNmonvkPl9efdiixT9QU/edit?usp=sharing>
Project 2 Proposal <https://docs.google.com/document/d/1DwMRlbUB_Zkc8CTEeQep1Ohlq4HNiCgu58IxqIIZsKQ/edit?usp=sharing>
Presentation for Project 2 Proposal <ADD LINK>
Poster <ADD LINK>
In-class Activity <ADD LINK>
Public Demo <ADD LINK>
ALL OTHER WIKI CONTRIBUTIONS
CONTRIBUTION 1 [This tutorial provides a step-by-step guide on integrating Google APIs with Python, covering authentication, API setup, and real-world applications like Google Drive and Google Sheets. It includes code examples, troubleshooting tips, and expansion possibilities for automating tasks with other Google services like Calendar, Gmail, and Maps. I utilized it for my gait project and thought it would be helpful.] <Google API Python Tutorial>
CONTRIBUTION 2 [This tutorial provides a step-by-step guide for installing and setting up MediaPipe on Windows for pose estimation and gait analysis. It covers installation, common errors, and troubleshooting solutions, ensuring compatibility with virtual environments and OpenCV. Additionally, it includes a test script for real-time pose tracking, making it easy to verify the setup.] <MediaPipe Installation Tutorial for Windows>
CONTRIBUTION 3 [A synthesized report of the feedback I got from my in-class activity which piloted a VR Gait study. These are essentially my major takeaways]
<Synthesized VR Gait Analysis Activity Report>
CONTRIBUTION 4 [A video of all the stages of my Python project, plus a demo of MediaPipe] <VR Gait Analysis Demo in Python(MediaPipe)>
.....
CONTRIBUTION N [short description] <ADD LINK>
HOURS SUMMARY
Total: 94 hours
HOURS JOURNAL
1/27/25 - 4 Hours (total)
1.5 Hours
Created individual journal
Joined Slack Channel and Introduced myself
Worked on this assignment, due at noon
Read course material on Project, Ideas page, and Scientific Data
1.5 Hours
Looked through the Wiki to think of potential answers to the questions below:
Three changes should each require ~10 minutes to complete.
I would ensure all the links in the VR Software Development section are accurate and not outdated. I was also thinking of implementing a more robust way of finding this software: perhaps short 5-to-10-second video run-throughs embedded above each website link, showing how the website looks.
Adding something more current, such as a recent-news or newly-added section, would help show which links people are actively using.
Add more unique Frequently Asked Questions to an FAQ section.
Three changes should each require ~1 hour to complete.
I would add more recent case studies to the "VR in Education" subsection. Since articles and research papers about VR's practicality in education are constantly being written, it would make sense to keep the section updated with that information.
The VR Modelling Software section looks like it could use one or two more visual cues per entry, so I would add relevant images, diagrams, and videos to illustrate the capabilities of the VR and CAD tools.
Adding more table-based visual representations to show the pros and cons of a given topic in the wiki. What first comes to my mind is comparing VR hardware and software, but I feel like having a pros and cons table for some of the newest headset models would be extremely helpful.
The final three changes should each require ~10 hours to complete.
I vividly remember a book from growing up that translated Shakespeare's famous works into modern language anyone could easily read. We could apply the same concept to the wiki: a dictionary or large database of readily defined terms, or a page that explains what VR/AR are at both a high level and in depth while staying accessible to anyone without coding experience.
Having certain papers that are in a foreign language translated, or even translating essential parts of the main wiki into other commonly spoken languages, e.g., Spanish or Chinese. This is a wilder idea, but still something I was considering.
Doing VR Hardware reviews would require researching and testing the latest headsets, controllers, and tracking systems and comparing their features and effectiveness. I feel this information could be in the form of a student blog reviewing all the products they have tested. It can be light-hearted, similar to how music critics critique music. So, as opposed to the current format, I would embed a first-person perspective review of the products as well.
1 Hour
Project Question Formation and Outside Articles/Research
I started thinking about potential things I wanted to base my project on.
I found a paper on VR's role in rehabilitation with prosthetics that I thought was particularly interesting and useful to what I am interested in. It makes me think more about what steps are being taken to utilize VR for medical purposes.
https://opcenters.com/the-impact-of-virtual-reality-in-prosthetic-rehabilitation/
1/29 - 30/2025 - 7 Hours Total
4 Hours
Charged and updated headset
Had some trouble connecting to wifi
Set up Headset and utilized the Meta App
Set up the VM
Was unable to set up Virtual Desktop because the referral link did not work
Asked the Slack for help at 2am
Unfortunately ran into many complications, which delayed my progress.
2 Hours
Project Idealization
Looked through projects for inspiration
Stumbled upon one project relating to medical research and MRI's
I have a particular interest in Prosthetics and Sports and how VR and AR can be a useful tool in that sphere.
Looked up any existing examples of VR research with a Prosthetics focus
https://jneuroengrehab.biomedcentral.com/articles/10.1186/s12984-020-00743-w
1 Hour
Project Ideas (So Far):
VR-Based Prosthetic Training
A virtual reality simulation that helps amputees practice using prosthetic limbs in different environments before real-world application.
Augmented Reality: Low, Immersive VR: High, Scientific Visualization: High
Sports Performance Optimization with VR
Using VR simulations to analyze an athlete’s form and suggest biomechanical adjustments, with potential applications for prosthetic-assisted athletes.
Augmented Reality: Medium, Immersive VR: High, Scientific Visualization: High
Neural Feedback for Bionic Limb Control in VR
Exploring how brain-computer interfaces (BCIs) could allow users to control prosthetic limbs in a VR environment before real-world implementation.
Augmented Reality: Low, Immersive VR: High, Scientific Visualization: High
2/3/25 - 3 hours
Completed Bezi Lab from last time
Completed the Virtual Desktop setup from last time that was not working
Completed Google Earth VR activity
Google Earth VR Assignment
My Place of Residence
BOX Olney House
My Dorm
The Main Green

Journey from my Dorm to Main Green
2/4/25 - 5 Hours
Set up DinoVR
Read the DinoVR research paper
Reviewed the Wiki for more project inspiration
Comprehensive Project Brainstorm/Planning
How VR Mode Affects Gait: A Biomechanical Analysis of Passthrough vs. Immersive Mode
Project Overview
I’m exploring how virtual reality affects a person’s ability to walk in a straight line, specifically comparing passthrough mode (where users can see their real environment through the headset) versus fully immersive VR mode (where users are entirely in a virtual space). This project focuses on the biomechanical importance of gait and how VR influences balance, step length, and posture.
Walking is a fundamental part of human movement, and any disruptions in gait mechanics can have implications for sports performance, rehabilitation, and prosthetic adaptation. By studying how people adjust their gait under different VR conditions, I hope to provide insights that could improve VR training for athletes, prosthetic users, and physical therapy patients.
Project Objectives
Three Key Actions:
Designing and Conducting a Gait Experiment in VR
Participants will walk a set distance in both passthrough mode and immersive VR mode while wearing a VR headset.
Their walking path, step length, and any deviations from a straight line will be recorded and compared.
Measuring Biomechanical Changes in Gait
Using motion tracking software (e.g., OpenPose for 2D tracking or Vicon/Xsens for advanced biomechanical motion capture) to measure stride length, balance stability, and postural compensations.
Analyzing changes in gait stability in immersive mode versus passthrough mode.
Evaluating User Comfort and Cognitive Load
Collecting participant feedback on how stable and secure they felt while walking in both VR conditions.
Investigating whether users feel safer in passthrough mode and if this affects their gait mechanics.
Software and Tools
This project will leverage a combination of VR software, motion tracking, and biomechanical analysis tools:
VR Platforms: Meta Quest 3
Motion Tracking & Biomechanics Software:
OpenPose – A free, open-source 2D pose estimation software to analyze gait mechanics.
Vicon or Xsens – If available, these can provide 3D motion capture for precise biomechanical tracking.
Unity / Unreal Engine VR Development – If needed, to create simple VR environments for immersive mode testing.
Data Analysis: Python (NumPy, Pandas, Matplotlib) or R for statistical analysis of gait changes.
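As a rough sketch of that analysis step, the walking-path metrics above (step length, straight-line deviation) might be computed like this. This is a standard-library-only illustration: the assumption that the intended walking direction is along the y-axis, and all of the sample coordinates, are made up rather than measured data.

```python
from statistics import mean

def step_lengths(foot_positions):
    """Distances between consecutive foot placements (x, y),
    in the same units as the input coordinates."""
    return [
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(foot_positions, foot_positions[1:])
    ]

def straight_line_deviation(foot_positions):
    """Mean absolute lateral (x) drift from the starting x position.

    Assumes the intended walking direction is along the y-axis,
    so deviation from a straight line shows up entirely in x."""
    x0 = foot_positions[0][0]
    return mean(abs(x - x0) for x, _ in foot_positions)

# Hypothetical foot placements from a pilot walk (units: meters).
walk = [(0.00, 0.0), (0.05, 0.6), (-0.02, 1.2), (0.10, 1.8)]
print(step_lengths(walk))
print(straight_line_deviation(walk))
```

The same two functions would apply unchanged to (x, y) foot positions exported from whichever motion tracking tool ends up being used.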
Class Activity: “VR Walk the Line” Experiment
Walk a straight path in both passthrough mode and immersive VR mode.
Record observations on balance, step length, and deviations.
Compare experiences and discuss real-world applications in sports, rehabilitation, and VR safety.
Potential Deliverables
Gait Stability and Biomechanical Report
A study comparing gait changes in passthrough vs. immersive VR mode using motion tracking data.
Includes step length, balance, and straight-line deviation visualized through graphs and motion analysis reports.
Potential Tie ins to other fields:
VR Safety and Sports Training Insights
Recommendations for VR developers and coaches on how immersive mode impacts movement stability.
Could be useful for sports training simulations or designing VR rehab programs for athletes recovering from injury.
Prosthetics and Rehabilitation Applications
Insights on how prosthetic users might adapt to VR environments, particularly in learning new movement patterns.
Examining if VR training environments could help prosthetic users adjust to new limb mechanics.
Connections to Sports, Biomechanics, and Prosthetics
Sports Performance: Understanding how VR immersion affects balance and gait can help coaches and trainers design better VR-based training drills for athletes, particularly in sports that require precision footwork (e.g., soccer, basketball).
Biomechanics & Rehabilitation: Gait changes in VR could inform physical therapists and clinicians on how VR-based rehab programs should be structured, particularly for injury recovery and fall prevention.
Prosthetics & Mobility Training: People using prosthetic limbs rely on visual and spatial cues to adjust their gait. If VR immersion disrupts natural gait patterns, this could impact how prosthetic users train in virtual environments.
Some Goals I set:
Run a pilot test on myself – Walk in passthrough vs. immersive mode and record observations.
Determine best tracking method – Decide if I’ll use OpenPose for basic tracking or seek access to more advanced 3D motion capture software.
Research existing VR gait studies – Find previous literature on gait analysis in VR to build a stronger foundation for my project.
Why This Matters
This project is about more than just how people walk in VR—it’s about understanding the fundamental relationship between visual perception and movement. Whether it’s for athletes training in VR, patients recovering from injuries, or prosthetic users adapting to new mobility, understanding how VR affects gait can help design better training tools and improve VR safety.
2/6/2025 - 4 hours
Finished the in-class assignment
Read some articles relating to my project, but need to do more research
Some of the papers:
Needed to get some info and learn more about gait analysis and whether it is even practical or feasible to do.
Made a project proposal slides
Need to add stuff about adding deliverables to the Wiki.
https://docs.google.com/presentation/d/1onLfw7ipaaS2pnSKs89I3V5URwgmoSj9qbCvGwBA130/edit?usp=sharing
Just finished commenting on classmates' pre-project proposals.
2/10/2025 - 2 Hours
Worked on project proposal slides
Worked on project proposal document
Emailed Melvin and Jakobi the proposal slides.
Read and Revised my project based on feedback received from Professor Laidlaw
2/16 - 20/2025 - 4 hours
I have been reading the research papers and articles I linked on 2/6/2025, and officially tried to run OpenPose on an image to see if it creates a skeleton of my gait and posture.
Spent an hour debugging the sample code used to test OpenPose, trying to get it to work, to no avail.
Started to think a bit more critically about the project's direction as a whole. It might be best to test people's posture before even testing their gait, or something along those lines. Instead of walking and recording a video, I can use 3 still images from the video I recorded of someone walking to compare their gait.
So I am moving toward actually implementing the project I drafted; I will officially start on Sunday and run a pilot test on myself to gauge the practicality of the whole assignment.
Working on the Ben + ParaView Setup/Tutorial
2/23/2025 - 5 Hours
Continued Setup for OpenPose
Tested the online OpenPose editor on still images
I constructed a more comprehensive and simpler plan for my project experiment
Adapted my project question to time constraints.
Created sample presentation for class that I will edit and refine.
2/24 - 3/1/2025 - 24 Hours
Continued the OpenPose setup
Ran into NVIDIA CUDA Download Issues
Ran Into Caffe Setup issues
Ran into Visual Studio 2022 build errors
Ran into OpenCV compatibility issues
Resolved most of the issues except the Caffe ones, which ended up being the nail in the coffin for my wanting to use OpenPose
I was frustrated because I had done a lot of work with administrative settings on my Windows machine to grant permissions that would make setup seamless and straightforward.
However, no matter what I tried, I could not build OpenPose correctly.
This did not stop me, though; I decided to pivot to MediaPipe, which is slightly less efficient at pose detection.
3/1 - 3/3 - 10 Hours
Set up MediaPipe
Downloaded necessary packages and libraries
Downloaded an older version of Python to work with MediaPipe
Set up a virtual environment with the relevant packages.
Set up webcam pose tracking
Set up uploaded-video pose tracking
Started to consider variables such as step length, stride length, etc.
This data is located in a Google Sheet that is set up to link to the Google Sheets API.
Started running pilot tests on videos of Messi walking, trying to track his gait and the distance he walks.
Potentially consider the participant's height when scaling, to understand the distance and scale each person is walking. By knowing someone's height we can estimate their torso/leg length and the average stride of someone at that height, and then apply that logic to scale the values in the sheet.
Maybe create a section or a feature where you enter the person's height before calculating everything.
For potential deliverables, make a tutorial on how to set up the Google Sheets API, so that it is clear how to set it up for use in this project. I also do not believe that has been done before.
For 3/4: Run a pilot test in VR, since the original videos I recorded were with no headset on.
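The height-based scaling idea above could look roughly like this. It is only a sketch: the 0.53 leg-length-to-height ratio is a common anthropometric approximation rather than a value measured in this project, and all of the pixel numbers are made up for illustration.

```python
def pixels_to_meters(pixel_distance, leg_pixels, height_m, leg_ratio=0.53):
    """Convert a distance measured in video pixels to meters.

    leg_pixels: the participant's hip-to-ankle length in the frame,
        e.g. taken from pose-estimation landmarks.
    height_m: the participant's real height in meters.
    leg_ratio: assumed leg-length / height ratio (a rough
        anthropometric estimate, not a measured value).
    """
    meters_per_pixel = (height_m * leg_ratio) / leg_pixels
    return pixel_distance * meters_per_pixel

# Hypothetical numbers: a 1.80 m participant whose leg spans 300 px
# in the frame, whose foot moved 450 px between two placements.
stride_m = pixels_to_meters(450, leg_pixels=300, height_m=1.80)
print(round(stride_m, 3))  # → 1.431
```

Entering the participant's height once, as proposed above, is enough to fix `meters_per_pixel` for the whole video, after which every pixel value pushed to the sheet can be scaled the same way.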
3/9 - 3/11 - 7 Hours
Ran Pilot Test and Analysis in Python
Recorded a demo video and added it to In-Class activity document
Created In-Class Activity document
Continued fine-tuning the scaling for the analysis portion of the project.
Thoroughly thought through how to make the in-class activity as engaging and seamless as possible.
Downloaded the apps for Eunjin's activity tomorrow.
Added the section with someone's height being a factor
Added side by side comparison of videos.
3/12 - 3/13 - 6 Hours
Finished Final Presentation
Created my wiki deliverables
MediaPipe Tutorial
Google API Tutorial
Gait Feedback Report
Demoing Video
Analyzed the results of the project
I am thinking of adding my research papers I read that inspired my project to a wiki contribution tomorrow
Revised Final Presentation
3/18/2025 - 1.5 Hours
[Preliminary] Project 2 Proposal completed
Read Seven Scenarios Paper
3/20/2025 - 1 Hour
Finished Evaluation + Edited Proposal Slides/Document
Thinking of switching Project 2 subject matter due to its difficulty and lack of appreciation/proper implementation for Project 1
Drafting In-Class Activity for Potential 2nd Project
Drafting potential new project ideas for if I need to switch my project.
Evaluated my project plan with rubric:
Clearly Identifies Deliverable Additions to Our VR Software Wiki
Score: 5
How/Why? Identifies multiple wiki contributions, including a comparison between MediaPipe and OpenPose, a deliverable on the virtual environment developed, and documented research papers. The plan explicitly states the intention to add these insights to the wiki.
Involves Passthrough or “Augmented” in VR
Score: 5
How/Why? The project focuses on comparing passthrough mode versus immersive mode in VR and specifically includes passthrough tests with added obstacles, ensuring AR components are integral to the study.
Involves Large Scientific Data Visualization and Identifies Specific Data Types & Software
Score: 4
How/Why? The project involves statistical and machine learning analysis of gait data, a 3D visualization of gait for expert review, and a formal comparison between MediaPipe and OpenPose. However, it would benefit from explicitly confirming the visualization software to be used for 3D rendering and analysis.
Has a Realistic Schedule with Explicit and Measurable Milestones
Score: 5
How/Why? The milestones are detailed, with clear deadlines for each task, including experiment setup, debugging, and deliverables. The schedule is structured weekly, ensuring steady progress toward project completion.
Explicitly Evaluates VR Software, Preferably in Comparison to Related Software
Score: 4
How/Why? Plans to compare MediaPipe with OpenPose and evaluate different motion tracking algorithms, but could be strengthened by including comparisons to other motion capture methods or VR platforms beyond MediaPipe/OpenPose.
Includes an In-Class Activity (Formative or Evaluative)
Score: 1
How/Why? An in-class activity is not yet explicitly planned beyond a milestone to refine and present the collected data. Still a work in progress.
Has Resources Available with Sufficient Documentation
Score: 3
How/Why? The project references research papers and tutorials but mentions uncertainty regarding the final virtual environment. Clarifying final software/tools for data visualization and gait analysis would strengthen this area.