Austin Phan's In-Class Activity 1
Comparing 2D and VR LiDAR Visualizations
Welcome to my in-class activity!
As a quick reminder: I'm visualizing LiDAR-scanned data, both in 2D and in VR. We'll be using this NEON dataset (here). No need to do anything with it yet!
Class Preparation (5 mins)
Please sideload my app onto your Quest device ahead of time using SideQuest. At this point you should have completed activities that require this, but if you still need help, the guide is linked here.
That's it! It shouldn't take too long. Please let me know if you have any issues.
Part 1: 2D Visualization (15 Mins)
This dataset, scanned by the National Ecological Observatory Network (NEON), has some 2D visualizations online.
You can begin by accessing the dataset here. Your landing page should look like this:
Scroll down to the "Visualizations" section of the page (it's near the bottom). You should be able to see something called the AOP Data Viewer.
There are a few sliders and selectors; we'll be focusing on the Site and Year settings for the visualizer.
I've focused on one site for this visualization, so we'll look at that. Next to the map icon, there should be a dropdown menu with an identifier for each site.
Task: From the dropdown, select DELA, which looks at a scan for the Dead Lake area in Alabama.
Task: Select 2015 on the time slider, and press the "open in new window" button for the best fullscreen experience.
Task: Do the same for 2021 on the time slider, again opening it in a new window.
You should now have two tabs open, one with the 2015 dataset and one with 2021.
You should notice something, though: each dataset appears as a single color! That's not very informative, and it happens because the default color range doesn't capture the actual spread of values in the data.
If you click on the graph, you'll see a CHM (Canopy Height Model) value (see image on right), which is what determines the color of each datapoint.
Click around and explore what values show up, and what ranges might work best to correctly color the dataset.
You can adjust the color range and color scheme for the visualization on the bottom bar. Try it now.
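The viewer's exact color mapping isn't documented here, but linearly normalizing a value into a colormap between a minimum and maximum is the standard approach, and it explains why the range settings matter. Here's a minimal Python sketch (the function name and sample heights are made up for illustration) showing that a range far wider than the data squeezes every point into one narrow color band, while a fitted range spreads them across the whole colormap:

```python
def chm_to_color_fraction(chm_m, vmin, vmax):
    """Map a CHM value (canopy height, meters) to a 0-1 position on a colormap."""
    frac = (chm_m - vmin) / (vmax - vmin)
    return max(0.0, min(1.0, frac))  # clip values outside [vmin, vmax]

heights = [0.0, 8.0, 15.0, 22.0, 30.0]  # hypothetical canopy heights in meters

# Range far wider than the data: everything lands near 0 (a single color).
print([round(chm_to_color_fraction(h, 0, 500), 3) for h in heights])
# -> [0.0, 0.016, 0.03, 0.044, 0.06]

# Range fitted to the data: the same points span the full colormap.
print([round(chm_to_color_fraction(h, 0, 30), 3) for h in heights])
# -> [0.0, 0.267, 0.5, 0.733, 1.0]
```

This is why clicking around to see what CHM values actually occur is a useful first step: it tells you what vmin/vmax to set.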
Task: Experiment with the color settings to more adequately display the data. Do it for the two years you have open.
Task: Now make sure the ranges for the datasets are the same. Can you spot any differences between the scans that could show ecological change? Take a screenshot of the two areas, and if possible, circle the areas you've noticed are different.
Note: Scans for these datasets aren't all the same dimensions. You'll need to find an area covered by both scans.
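The comparison you're doing by eye is essentially a cell-by-cell height difference over the overlapping area. The viewer doesn't export rasters in this activity, but here's a toy Python sketch (grid values invented for illustration) of that idea, assuming two small CHM grids already clipped to the same extent:

```python
def chm_difference(chm_a, chm_b):
    """Cell-by-cell height change (meters) between two co-registered CHM grids."""
    return [[b - a for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(chm_a, chm_b)]

chm_2015 = [[10.0, 12.0], [0.0, 25.0]]  # hypothetical 2x2 CHM, meters
chm_2021 = [[14.0, 12.5], [0.0, 18.0]]

print(chm_difference(chm_2015, chm_2021))  # [[4.0, 0.5], [0.0, -7.0]]
```

Positive cells would mark canopy growth and negative cells loss, which is exactly the kind of ecological change you're circling in your screenshots.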
Part 2: VR Visualization (10 Mins)
Hopefully you've loaded the .apk file mentioned in the preparation section onto your Quest. If you haven't, please load it now.
Power on your Quest, open the app drawer, press the filter item in the top right, and select "Unknown Sources". The application you want to launch is called BlenderRender. Go ahead and launch it now. If it asks you for file access, select "Allow".
You should be loaded into a world with 6 different 3D-converted LiDAR scans and 6 corresponding 2D texture maps on the wall. You can move around by teleporting and grab the 3D models; when released, the models stay in place. The 2D maps are not manipulable.
Collisions are not enabled. If you accidentally exit the world or push a model through a wall, press the top button to open the menu, then press Restart to return all items to their original orientation.
Performance can be a bit jittery due to the number of datasets loaded in (many polygons). To improve performance, move the datasets around so that you're only ever looking at a few at once, rather than all 6.
Both controllers have grab functionality.
You can teleport using the right-hand joystick.
The top buttons (X) on both controllers open a menu, which allows you to reset the level (Restart) and exit the game (Reset Orientation does not do anything). Point at menu options with the controller the menu is not attached to.
Task: Explore the world and its datasets. Take a screen recording of yourself moving a dataset around (and possibly comparing two datasets). If you can find a difference between the sets, take a screenshot of that too! Collisions are turned off, so you can also place datasets inside one another if that helps comparison.
Unreal Engine also has dynamic lighting, so you can bring items closer to light sources to brighten them, or farther away to darken them. You can experiment with that too (not required, but cool!). Light sources exist in the ceiling, pointing toward the images on the wall, and in the back corners of the room.