Written by Justin Park
Geospatial data refers to information that is tied to a specific geographic location on Earth. When brought into a Virtual Reality environment, this data allows users to experience real-world phenomena like deforestation, climate change, or urban growth as immersive, navigable 3D spaces. This page covers the key data formats used in geospatial VR projects and how tools like Global Forest Watch can fit into such a pipeline.
A GeoTIFF is a standard raster image format (based on TIFF) that embeds geographic metadata directly into the file. This metadata includes:
Coordinate Reference System (CRS): defines how pixel coordinates map to real-world lat/lon positions (e.g., WGS84, EPSG:4326)
Spatial extent: the bounding box (min/max latitude and longitude) that the image covers
Pixel resolution: how many meters on the ground each pixel represents (e.g., 30m/pixel for Hansen data)
Because GeoTIFFs encode both visual data and geographic positioning, they are the standard format for satellite-derived datasets used in scientific research.
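The georeferencing described above boils down to an affine transform that maps pixel indices to CRS coordinates. The sketch below illustrates the simple north-up case in plain Python (in practice these values are read from the file with a library such as GDAL or rasterio; the 1/4000-degree pixel size is the Hansen tiles' approximate 30 m resolution at the equator):

```python
# Illustrative sketch: how a north-up GeoTIFF's transform maps pixel
# indices to geographic coordinates. Longitude grows with column,
# latitude shrinks with row; rotation/skew terms are omitted.

def pixel_to_lonlat(row, col, origin_lon, origin_lat, pixel_size_deg):
    """Return the lon/lat of the top-left corner of pixel (row, col)."""
    lon = origin_lon + col * pixel_size_deg
    lat = origin_lat - row * pixel_size_deg
    return lon, lat

# Example: a tile whose top-left corner is at (50W, 0N), stored at
# 1/4000 degree per pixel (~30 m at the equator).
print(pixel_to_lonlat(0, 0, -50.0, 0.0, 1.0 / 4000))       # tile corner
print(pixel_to_lonlat(4000, 4000, -50.0, 0.0, 1.0 / 4000)) # ~(-49.0, -1.0)
```

With real files, a library like rasterio exposes the same mapping via the dataset's transform, so this arithmetic never needs to be hand-rolled.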
The Hansen Global Forest Change dataset (produced by researchers at the University of Maryland, published annually) is one of the most widely used satellite datasets for tracking deforestation globally. It is derived from Landsat satellite imagery and provides 30-meter resolution data worldwide.
Key layers in the dataset:
treecover2000: percent tree canopy cover in the year 2000 (the baseline)
loss: binary flag marking pixels where forest cover was lost during the study period
gain: binary flag marking forest gain between 2000 and 2012
lossyear: the year each loss pixel was cleared, encoded as 1 onward for 2001 onward (0 = no loss)
datamask: distinguishes no-data, mapped land, and permanent water pixels
For VR visualization purposes, the lossyear layer is most useful — it allows you to reconstruct the cumulative state of forest cover at any point in time by filtering pixel values up to a given year.
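The filtering just described can be sketched in plain Python. This is a hedged illustration: the 30% canopy threshold is a common convention rather than a fixed rule, and real tiles are large raster arrays handled with NumPy or GDAL, not Python lists.

```python
# Sketch: reconstruct which pixels are still forested at a given year
# from Hansen-style layers. lossyear encodes 0 = no loss detected,
# v = loss in year 2000 + v (so 5 means 2005). treecover2000 is the
# baseline canopy percentage; the 30% threshold is an assumption.

def forest_at_year(treecover2000, lossyear, year, canopy_threshold=30):
    """Return a boolean mask: True where a pixel counts as forest in `year`."""
    mask = []
    for cover, loss in zip(treecover2000, lossyear):
        was_forest = cover >= canopy_threshold
        lost_by_year = loss != 0 and (2000 + loss) <= year
        mask.append(was_forest and not lost_by_year)
    return mask

cover = [80, 10, 95, 60]     # % canopy in 2000
loss  = [0,  0,  5,  12]     # 0 = no loss, 5 = lost 2005, 12 = lost 2012
print(forest_at_year(cover, loss, 2010))   # [True, False, False, True]
```

Sweeping `year` from 2001 to the latest release and re-rendering the mask is what produces the time-lapse effect in a VR walkthrough.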
Data source: Available via Global Forest Watch or directly from the GLAD Lab at UMD. Tiles are organized in a 10°×10° latitude/longitude grid and named by each tile's top-left (north-west) corner (e.g., 00N_050W.tif covers 10°S–0° and 50°–40°W).
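Given that tiles are named by their top-left corner, the tile covering a point of interest can be computed directly. A minimal sketch (boundary behavior for points exactly on a tile edge follows from the ceil/floor choices and may need adjusting for a specific download tool):

```python
import math

# Sketch of the GLAD/Hansen tile-naming convention: 10-degree granules
# named by their top-left corner, e.g. 10N_020E covers 0-10N, 20-30E.

def hansen_tile_name(lat, lon):
    top = math.ceil(lat / 10) * 10      # northern edge of the tile
    left = math.floor(lon / 10) * 10    # western edge of the tile
    lat_part = f"{abs(top):02d}{'N' if top >= 0 else 'S'}"
    lon_part = f"{abs(left):03d}{'E' if left >= 0 else 'W'}"
    return f"{lat_part}_{lon_part}"

print(hansen_tile_name(-5.0, -45.0))   # 00N_050W
print(hansen_tile_name(5.0, 25.0))     # 10N_020E
```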
Global Forest Watch (globalforestwatch.org) is a web platform built by the World Resources Institute that provides interactive access to forest monitoring data, including the Hansen dataset. Its key features include:
Interactive map with toggleable forest loss/gain layers
Area-of-interest analysis: draw a polygon and get year-by-year tree cover loss statistics (in hectares)
CSV exports of statistical summaries
Access to deforestation alerts (near-real-time monitoring via GLAD, RADD, and other alert systems)
Biodiversity, land use, and climate overlay layers
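The hectare figures in GFW's area-of-interest statistics can be cross-checked against raw pixel counts: at 30 m resolution each pixel covers 30 × 30 = 900 m², i.e. 0.09 ha. A sketch of that aggregation (ignoring the latitude-dependent pixel-area distortion that production statistics correct for):

```python
# Worked example: converting Hansen loss-pixel counts to hectares,
# the unit GFW reports. One 30 m pixel = 900 m^2 = 0.09 ha.

PIXEL_AREA_HA = 30 * 30 / 10_000   # 0.09 ha per pixel

def loss_hectares_by_year(lossyear_pixels):
    """Aggregate a flat list of lossyear values into {year: hectares}."""
    totals = {}
    for v in lossyear_pixels:
        if v == 0:                 # 0 = no loss detected
            continue
        year = 2000 + v
        totals[year] = totals.get(year, 0.0) + PIXEL_AREA_HA
    return totals

pixels = [0, 5, 5, 12, 0, 5]       # three 2005 loss pixels, one 2012
print(loss_hectares_by_year(pixels))
```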
While GFW's primary interface is a 2D web map, its data and API have several potential applications in VR contexts:
1. Region Identification: GFW's interactive map is useful for identifying regions with visually dramatic deforestation patterns (e.g., Pará's agricultural frontier). These regions can then be targeted for GeoTIFF download and VR import outside of GFW.
2. Statistical Overlays: GFW's CSV exports (year-by-year hectare loss) can drive data dashboards inside VR, for example floating bar graphs that update as users move through different time periods in the experience.
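Loading such an export into a dashboard-ready structure is a few lines with the standard library. The column names below ("year", "tree_cover_loss_ha") are illustrative; a real export's headers should be checked before parsing:

```python
import csv
import io

# Hypothetical sketch: turn a GFW-style statistics CSV into a
# year -> hectares series that a VR dashboard could animate.

sample_csv = """year,tree_cover_loss_ha
2018,1250.4
2019,1890.7
2020,2310.2
"""

def load_loss_series(text):
    reader = csv.DictReader(io.StringIO(text))
    return {int(row["year"]): float(row["tree_cover_loss_ha"]) for row in reader}

series = load_loss_series(sample_csv)
print(series[2019])    # 1890.7
```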
3. Live Deforestation Alerts: The GFW API provides programmatic access to near-real-time deforestation alerts. A VR application could use this API to highlight active deforestation zones with visual markers, creating a "live" environmental monitoring experience.
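Once alert records have been fetched (the API's endpoints and response shape are not shown here; consult the GFW API documentation), the VR side reduces to filtering by recency and emitting marker positions. The record fields below are illustrative assumptions:

```python
from datetime import date

# Hypothetical sketch: filter fetched alert records by recency and
# convert each to a (lat, lon) marker position for the VR scene.
# The "lat"/"lon"/"date" field names are assumptions, not the real
# GFW API response schema.

def recent_alert_markers(alerts, since):
    """Keep alerts dated on/after `since`; return (lat, lon) positions."""
    return [(a["lat"], a["lon"]) for a in alerts
            if date.fromisoformat(a["date"]) >= since]

alerts = [
    {"lat": -3.2, "lon": -52.1, "date": "2024-05-01"},
    {"lat": -3.5, "lon": -52.4, "date": "2023-11-20"},
]
print(recent_alert_markers(alerts, date(2024, 1, 1)))   # [(-3.2, -52.1)]
```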
4. Protected Area Boundaries: GFW hosts shapefiles for Indigenous territories, protected areas, and concessions. These boundaries could be rendered as visible zones in VR to contextualize where deforestation is occurring relative to protected land.
5. Tree Canopy Cover for 3D Placement: The treecover2000 baseline layer could inform procedural 3D tree placement: areas with high initial canopy cover get dense 3D forest models, while loss-year data progressively removes trees as the user advances through time.
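A minimal sketch of that placement logic, combining both layers per terrain cell (the max_trees_per_cell scaling is a tuning parameter of this example, not part of the dataset):

```python
# Sketch of canopy-driven procedural placement: scale the number of
# 3D tree models per terrain cell by the treecover2000 percentage,
# then zero out cells whose lossyear falls at or before the currently
# displayed year.

def trees_per_cell(treecover2000, lossyear, display_year, max_trees_per_cell=20):
    counts = []
    for cover, loss in zip(treecover2000, lossyear):
        if loss != 0 and (2000 + loss) <= display_year:
            counts.append(0)    # forest already lost by this year
        else:
            counts.append(round(cover / 100 * max_trees_per_cell))
    return counts

cover = [90, 40, 0, 75]
loss  = [0,  8,  0, 3]          # 8 = lost 2008, 3 = lost 2003
print(trees_per_cell(cover, loss, 2005))   # [18, 8, 0, 0]
```

In an engine like Unity, the returned counts would drive instanced tree prefabs per cell, and recomputing them as the display year changes produces the time-lapse removal effect.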
GFW's download portal exports statistical CSV summaries, not raster files that engines such as Unity can consume for VR. Actual GeoTIFF tiles must be sourced directly from GLAD/UMD or Google Earth Engine.
The platform is designed for 2D web analysis; integration with game engines requires additional data processing (see the companion wiki page on importing GeoTIFFs into Unity).
Beyond deforestation, geospatial data is increasingly used across VR applications:
Climate visualization: sea level rise projections rendered as interactive flood simulations
Urban planning: LiDAR point clouds of cities imported into VR for architectural review
Disaster response: real-time satellite imagery overlaid on 3D terrain for emergency coordination
Archaeology: drone photogrammetry and DEM data used to reconstruct historical landscapes in VR
The core pipeline — satellite or sensor data → geospatial raster format → 3D engine import — is consistent across these domains.