Human perception is calibrated to everyday object sizes
Our brains evolved to judge distances and sizes of objects roughly between centimeters and tens of meters (people, animals, trees, rooms).
Objects far outside this range—such as cells, molecules, cities, planets, or galaxies—lack familiar perceptual references.
Lack of embodied interaction at extreme scales
Humans cannot physically interact with microscopic or astronomical objects.
Without the ability to move around, touch, or compare objects physically, scale becomes abstract and difficult to intuit.
Two-dimensional representations distort scale understanding
Diagrams, textbooks, and screens compress three-dimensional scale into flat images, removing depth cues.
This forces viewers to rely on symbols and numbers rather than perception.
Exponential and logarithmic differences are unintuitive
Humans naturally think in linear increments, but scale differences in science often span orders of magnitude (10x, 100x, millions of times).
For example, the gap between a cell (~10 micrometers) and a human (~1 meter) spans five orders of magnitude and is difficult to visualize intuitively.
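To make this concrete, a short Python sketch computes how many powers of ten separate these scales (the sizes are rough, illustrative values, not measurements from the project):

```python
import math

# Approximate characteristic sizes in meters (illustrative round numbers)
cell_size = 10e-6    # a typical cell, ~10 micrometers
human_size = 1.0     # a human, ~1 meter at this rough granularity
galaxy_size = 1e21   # a large galaxy, on the order of 100,000 light-years

def orders_of_magnitude(small, large):
    """Number of powers of ten separating two sizes."""
    return math.log10(large / small)

print(round(orders_of_magnitude(cell_size, human_size)))   # cell -> human: 5
print(round(orders_of_magnitude(human_size, galaxy_size))) # human -> galaxy: 21
```

Linear intuition has no foothold here: a factor of 100,000 cannot be "stepped through" the way a factor of 2 or 3 can.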
Embodied perspective enables intuitive scale comparison
VR places users inside the environment, allowing them to move around objects and experience relative size directly.
True 3D spatial perception
Head tracking and stereoscopic rendering restore depth cues such as motion parallax and binocular disparity.
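Binocular disparity is primarily a near-field cue, which is one reason flat media struggle to convey scale. The sketch below (assuming an average interpupillary distance of 63 mm, a commonly cited figure) shows how quickly the vergence angle between the two eyes collapses with distance:

```python
import math

IPD = 0.063  # assumed average interpupillary distance, in meters

def vergence_angle_deg(distance_m):
    """Angle between the two eyes' lines of sight to a point straight ahead."""
    return math.degrees(2 * math.atan((IPD / 2) / distance_m))

for d in [0.5, 2.0, 10.0, 100.0]:
    print(f"{d:6.1f} m -> {vergence_angle_deg(d):.4f} degrees")
```

At arm's length the angle is several degrees; by 100 meters it is a few hundredths of a degree, so stereoscopic rendering mostly helps for near-scale judgments while motion parallax carries more weight farther out.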
Dynamic scaling of the user’s viewpoint
VR can shrink or enlarge the user's perspective, allowing them to experience environments at microscopic or cosmic scales.
Interactive navigation across orders of magnitude
Users can zoom smoothly between scales (e.g., molecule → cell → organ → organism → ecosystem), which helps build mental connections between levels.
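One common way to implement this kind of smooth multi-scale zoom is to interpolate the scale factor in log space rather than linearly, so equal increments of the zoom control traverse equal numbers of orders of magnitude. A minimal sketch under that assumption (the endpoint scales are illustrative):

```python
import math

def log_zoom(start_scale, end_scale, t):
    """Logarithmic interpolation of a zoom factor.

    t in [0, 1] moves the viewer through the same number of
    orders of magnitude per unit step, so the zoom feels uniform.
    """
    log_s = math.log10(start_scale)
    log_e = math.log10(end_scale)
    return 10 ** (log_s + (log_e - log_s) * t)

# Zoom from molecular scale (~1e-9 m) up to human scale (~1 m):
for t in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(f"t={t:.2f}: {log_zoom(1e-9, 1.0, t):.2e} m")
```

A linear interpolation of the same endpoints would spend almost the entire zoom near the large end; log-space interpolation is what makes molecule → cell → organ → organism feel like evenly spaced stops.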
Creem-Regehr, S. H. (2015). Perceiving Absolute Scale in Virtual Environments.
Reviews research on how humans perceive size and distance in real and virtual environments, highlighting challenges in accurately judging absolute scale.
Creem-Regehr, S. H., et al. (2022). Perceiving Distance in Virtual Reality.
Examines how people estimate distance in immersive environments and discusses perceptual biases that affect spatial understanding.
Rzepka, A. M., et al. (2022). Familiar Size Affects Perception Differently in Virtual Reality and Physical Reality.
Demonstrates how familiar reference objects influence human size perception and how this differs between real and virtual environments.
Radianti, J., Majchrzak, T. A., Fromm, J., & Wohlgenannt, I. (2020). A Systematic Review of Immersive Virtual Reality Applications for Higher Education.
Surveys VR applications in education and shows how immersive environments improve spatial understanding and engagement.
Zhang, J., et al. (2022). Augmented Perception Through Spatial Scale Manipulation in Virtual Reality.
Investigates how manipulating user scale in VR can alter spatial perception and improve understanding of relative object sizes.
Foglino, C., et al. (2025). The Effect of Viewing, Reaching, and Grasping on Size Perception in Virtual Reality.
Shows that embodied interaction with virtual objects can significantly improve users’ perception of size and scale.
Orders of Magnitude VR is an interactive virtual reality experience that allows users to smoothly zoom through many orders of magnitude, from the largest structures in the observable universe down to extremely small physical scales. The project was inspired by the famous Powers of Ten educational film and recreates its concept in a fully immersive 3D environment. Users can transition continuously between cosmic structures, stars, biological systems, and molecular structures, enabling an intuitive understanding of how different scales relate to one another. By placing the viewer inside the visualization, VR makes these enormous differences in scale easier to grasp than traditional diagrams.
UniverseVR is a virtual reality visualization that allows users to explore simulated cosmological data representing millions of galaxies. Built using data from the Millennium Simulation Project, the experience lets users travel through large-scale cosmic structures such as galaxy clusters and superclusters. The immersive VR environment allows viewers to perceive the spatial distribution of galaxies and the immense distances between them in a way that is difficult to convey with static images or charts. This type of simulation demonstrates how VR can make extremely large spatial scales more intuitive and visually understandable.
Because the related projects above are behind paywalls, while published studies clearly show the effectiveness of VR in aiding the perception of scale, I decided to design an in-class activity that compares my project to a well-known 2D scale-perception website, The Scale of the Universe. Below are the steps and resources needed to install and test my project. Prerequisites: a Meta Quest 3 and a Meta account.
Please download/have access to the following:
Steps to utilize the VR simulation:
1. Make sure developer mode is turned on: in your phone's Meta Horizon app, go to Devices -> your device -> Headset Settings -> Developer Mode -> turn on.
2. While the headset is connected to your computer via USB-C, it will ask to allow USB debugging; check "Always allow," then tap OK.
3. Download the Meta Quest Developer Hub on your computer and log in to the Meta account linked to your Meta Quest 3.
4. Drag and drop the APK file onto the Meta Quest Developer Hub app, under "Connected Device: Meta Quest 3."
5. ScaleVR should now be accessible on the headset under Applications.
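As an alternative to the drag-and-drop step, the APK can also be sideloaded from the command line with adb (part of Google's Android platform tools), assuming developer mode and USB debugging are already enabled as above; the APK file name here is a placeholder for the actual build:

```shell
# List connected devices; the Quest 3 should appear once USB debugging is allowed
adb devices

# Install the app (or reinstall/update an existing install with -r)
adb install -r ScaleVR.apk
```

Either route ends the same way: the app appears on the headset under Applications (unknown sources).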
A post-experience survey was conducted with 9 participants to evaluate the effectiveness of the VR visualization compared to The Scale of the Universe website. Participants interacted with both systems, then rated immersion and usefulness for understanding scale and provided qualitative feedback. Overall, the results indicate that the VR simulation substantially increased perceived immersion and slightly improved participants' understanding of scale, while the 2D website provided clearer comparisons across objects and scales.
Across immersion ratings, the 2D website received mostly low-to-moderate scores, while the VR visualization consistently received higher scores. Specifically, the 2D website ratings were distributed as follows: 2 (2 responses), 3 (3 responses), 4 (1 response), 5 (1 response), 7 (1 response), and 8 (1 response), for a mean immersion rating of approximately 4.1 / 10. In comparison, the VR visualization received ratings of 6 (1 response), 7 (3 responses), 8 (2 responses), and 9 (3 responses), for a mean of approximately 7.8 / 10. This is an increase of roughly 3.7 points on a 10-point scale, indicating that the VR environment markedly enhanced participants' sense of immersion. Several respondents attributed this improvement to 3D depth perception, the ability to look around objects, and the feeling of experiencing scale in a more "real-life" manner.
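The immersion means reported above can be reproduced directly from the rating counts; a quick sketch in Python:

```python
# Immersion rating distributions from the survey, as {rating: count}
web_2d = {2: 2, 3: 3, 4: 1, 5: 1, 7: 1, 8: 1}  # 9 responses
vr     = {6: 1, 7: 3, 8: 2, 9: 3}              # 9 responses

def mean_rating(dist):
    """Mean of a rating distribution given as {rating: count}."""
    total = sum(rating * count for rating, count in dist.items())
    n = sum(dist.values())
    return total / n

print(f"2D website mean: {mean_rating(web_2d):.1f} / 10")               # ~4.1
print(f"VR mean:         {mean_rating(vr):.1f} / 10")                   # ~7.8
print(f"Difference:      {mean_rating(vr) - mean_rating(web_2d):.1f}")  # ~3.7
```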
Participants also rated how helpful each visualization was for understanding relative scale. The 2D website received ratings between 4 and 9, with most responses clustering around 5 to 8, producing an approximate mean of 6.4 / 10. By contrast, the VR simulation received only ratings of 7 or 8, producing an approximate mean of 7.6 / 10. These results suggest that VR improved perceived understanding of scale by about 1.2 points on average, indicating that immersive spatial visualization provided a modest but measurable benefit in helping participants conceptualize extreme physical scales.