This project will visualize earthquake data, representing key attributes such as magnitude, scale, geographic impact (radius), depth, and economic consequences for a selected set of earthquakes in a particular region (rather than a broad cross-country or cross-region comparison). The project will also incorporate haptic feedback so that earthquake magnitude can be felt, not just seen, within the visualizations themselves.
To achieve this, I plan to use Meta’s Haptics Studio and Haptics SDK, which integrate with Unity, to develop custom haptic effects. These effects could be generated based on seismic waveforms, earthquake audio data, or magnitude charts, allowing users to physically sense the intensity of different earthquakes.
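One design question behind this is how to map magnitude onto vibration intensity: magnitude is logarithmic (each whole unit corresponds to roughly a tenfold increase in ground-motion amplitude), so weaker quakes can easily become imperceptible. The sketch below shows two candidate mappings in C#; the magnitude range and floor values are placeholders, not final design choices.

```csharp
using UnityEngine;

// Hypothetical helper: maps earthquake magnitude to a 0-1 vibration intensity.
// Two options are sketched: a simple linear normalization across the displayed
// quakes, and a "relative amplitude" mapping that reflects the logarithmic
// nature of magnitude (each whole unit is ~10x the ground-motion amplitude).
public static class MagnitudeToHaptics
{
    // Linear: magnitude 4.0 -> 0, magnitude 8.0 -> 1 (range chosen arbitrarily here).
    public static float LinearIntensity(float magnitude, float minMag = 4f, float maxMag = 8f)
    {
        return Mathf.InverseLerp(minMag, maxMag, magnitude);
    }

    // Relative amplitude: 10^(m - maxMag), so a quake one unit below the strongest
    // vibrates at ~10% of its intensity. Clamped so small quakes stay perceptible.
    public static float RelativeIntensity(float magnitude, float maxMag = 8f, float floor = 0.05f)
    {
        float relative = Mathf.Pow(10f, magnitude - maxMag);
        return Mathf.Clamp(relative, floor, 1f);
    }
}
```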
I also aim to compare different versions of the visualization to better understand their effectiveness in various scenarios. Rather than simply contrasting a 2D and an AR version (as in my first project), this project will explore:
A focused visualization, representing only scale and magnitude to emphasize core earthquake properties.
A comprehensive visualization, combining multiple variables (magnitude, scale, depth, economic impact, and geographic distance) to assess how layering information affects user interpretation and engagement.
The project will also compare haptic vs. non-haptic visualizations to assess how much haptic feedback contributes to understanding and interacting with the earthquake data.
Students will view and interact with an AR earthquake visualization in which they experience earthquake data for a particular region through haptic and visual feedback. The base visualization will display that region in a map-like format, with overlays showing data for 3-5 distinct earthquakes.
Students will compare multiple versions (2-4) of this base visualization, each representing the same earthquakes differently:
Haptic vs. non-haptic experiences — Does touch feedback improve understanding? (i.e., comparing a visualization with haptic features vs. one without)
Focused vs. comprehensive visualizations — Which is more effective for data interpretation? (i.e., comparing a visualization with more variables vs. fewer)
Primary: USGS (United States Geological Survey) real-time and historical earthquake data — https://www.usgs.gov/programs/earthquake-hazards (range of data for worldwide earthquakes with interactive maps)
More potential datasets to explore:
CORGIS Dataset Project Earthquake Data — https://corgis-edu.github.io/corgis/csv/earthquakes/ (CSV data for earthquake magnitude, location, depth, and significance, but only through 2016)
Kaggle Earthquake Datasets
Earthquakes in Indonesia (Earthquake Repository, by BMKG): https://www.kaggle.com/datasets/kekavigi/earthquakes-in-indonesia
National Earthquake Information Center Data, from 1965 - 2016: https://www.kaggle.com/datasets/usgs/earthquake-database
Haptics in Unity (for Meta Quest)
A-Frame Haptics Component:
Based on my software comparison table for integrating haptics into my project, I decided that Unity with Meta Haptics Studio would give me the most control and flexibility for building my earthquake visualization. Since our class already uses Meta Quest headsets, I didn't need to worry about cross-platform support, and Meta Haptics Studio integrates directly with Unity (via the Haptics SDK) and with Quest devices. I had already used Unity in my first project, so I felt I was past the learning curve and comfortable using it again.
Additionally, Meta Haptics Studio made it especially easy to test my haptic effects quickly, since I could audition each one in real time on the headset without having to rebuild my Unity scene. It also let me fine-tune a wide range of settings, such as breakpoints, intensity, and frequency, giving me more detailed control than the other platforms I tested. One of the most helpful features was the ability to turn sound files into haptic feedback, which was perfect for converting earthquake audio into vibration. Combined with strong documentation and tutorials, this setup let me prototype and improve my project efficiently.
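As a concrete illustration of this workflow, here is a minimal sketch of playing a Haptics Studio clip from Unity. It assumes the Haptics SDK's HapticClip / HapticClipPlayer API and a hypothetical quakeClip asset exported from Haptics Studio; exact class names and properties should be verified against the current SDK documentation.

```csharp
using Oculus.Haptics;
using UnityEngine;

// Minimal sketch (assumes the Meta Haptics SDK for Unity): plays a haptic clip
// that was designed in Haptics Studio from earthquake audio, scaled by magnitude.
public class EarthquakeHaptics : MonoBehaviour
{
    // .haptic file exported from Haptics Studio and imported as a HapticClip asset.
    [SerializeField] private HapticClip quakeClip;

    private HapticClipPlayer player;

    private void Start()
    {
        player = new HapticClipPlayer(quakeClip);
    }

    // Called when the user selects an earthquake marker; intensity is 0-1
    // (e.g., from a magnitude-to-intensity mapping like the one sketched earlier).
    public void PlayQuake(float intensity)
    {
        player.amplitude = intensity;      // modulate clip strength per earthquake
        player.Play(Controller.Right);     // play on the right controller
    }

    private void OnDestroy()
    {
        player?.Dispose();
    }
}
```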
Retrieving Earthquake Data:
USGS Website
Tool by Seismological Facility for the Advancement of Geoscience (SAGE)
Provides seismographic data for individual earthquakes
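Beyond the website downloads, the USGS catalog can also be queried programmatically through its FDSN event web service, which returns GeoJSON. Below is a minimal C# sketch of such a query; the coordinates, radius, and magnitude cutoff in the usage comment are placeholder values.

```csharp
using System.Net.Http;
using System.Threading.Tasks;

// Minimal sketch: query the USGS FDSN event service for earthquakes near a point.
// Returns raw GeoJSON; parsing into earthquake objects is left to the caller.
public static class UsgsQuery
{
    private static readonly HttpClient client = new HttpClient();

    public static async Task<string> FetchEarthquakesAsync(
        double lat, double lon, double radiusKm, double minMagnitude,
        string start, string end)
    {
        string url =
            "https://earthquake.usgs.gov/fdsnws/event/1/query?format=geojson" +
            $"&starttime={start}&endtime={end}" +
            $"&latitude={lat}&longitude={lon}&maxradiuskm={radiusKm}" +
            $"&minmagnitude={minMagnitude}";

        return await client.GetStringAsync(url);
    }
}

// Example: magnitude 5+ earthquakes within 500 km of Jakarta during 2022.
// string json = await UsgsQuery.FetchEarthquakesAsync(-6.2, 106.8, 500, 5.0, "2022-01-01", "2022-12-31");
```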
Unity + Mapbox Integration:
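One way this integration could work in practice, assuming the Mapbox Maps SDK for Unity and its AbstractMap.GeoToWorldPosition conversion (to be verified against the SDK version in use), is to convert each epicenter's latitude/longitude into a world position on the map and scale the marker by magnitude; the sketch below is illustrative rather than the final implementation.

```csharp
using Mapbox.Unity.Map;   // assumes the Mapbox Maps SDK for Unity is installed
using Mapbox.Utils;
using UnityEngine;

// Illustrative sketch: place a marker on a Mapbox map at an earthquake's epicenter.
public class EpicenterMarker : MonoBehaviour
{
    [SerializeField] private AbstractMap map;          // the Mapbox map in the scene
    [SerializeField] private GameObject markerPrefab;  // e.g., a sphere or ring

    public void SpawnMarker(double latitude, double longitude, float magnitude)
    {
        // Convert geographic coordinates to a Unity world position on the map.
        Vector3 worldPos = map.GeoToWorldPosition(new Vector2d(latitude, longitude), true);

        GameObject marker = Instantiate(markerPrefab, worldPos, Quaternion.identity, map.transform);

        // Scale the marker with magnitude so stronger quakes read as larger.
        float scale = Mathf.Lerp(0.2f, 1.5f, Mathf.InverseLerp(4f, 8f, magnitude));
        marker.transform.localScale = Vector3.one * scale;
    }
}
```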
Haptics Integration Software:
A-Frame Haptics Component
Unity XR Interaction Toolkit Haptics
Meta Haptics Studio
Haptics Studio with Unity Integration Tutorials
Link to final class activity page
Based on my class activity results, I found the following:
2D vs AR Experience:
Students reported significantly lower immersion and engagement in the 2D version compared to AR.
Understanding, confidence, and data recall were all rated substantially poorer in 2D.
Users noted a tradeoff between cognitive and physical load in 2D, but this tradeoff wasn’t present in AR.
User Choice in AR (Part 2 vs Part 1):
AR Part 2, which offered more user controls, was slightly preferred over AR Part 1 in terms of confidence, understanding, engagement, and ease of navigation.
7 out of 9 participants preferred having control over what visual elements to display.
While about half found the extra options manageable, others felt mildly or highly overwhelmed—suggesting user preference plays a role.
Nearly everyone agreed that increased control improved comprehension of the data.
Visualization Features and Controls:
Features controlled by buttons (e.g., scaling, text toggling) were seen as more intuitive than those toggled differently (e.g., error bars, map overlays).
Error bars were considered highly informative but also confusing and visually overwhelming for some.
Non-button-mapped features, like haptic feedback and map view changes, were described as the most immersive and interactive.
Map movement using triggers was reported as unintuitive by many participants.
Haptic Feedback:
Most users found the haptics made the experience more immersive and memorable.
Several participants actively sought out high-magnitude earthquakes just to feel stronger haptics, using them for quick comparison.
There were mixed views on haptic sensitivity—some clearly felt differences between magnitudes, while others found them too subtle without visual cues.
Overall, haptics encouraged deeper interaction and exploration of the map.