Tal Frieden Journal

Time Log:

1/30:

Created my journal! Played around with Google Sites to familiarize myself with the interface. Created VR in Pop Culture page, added a short blurb about deep fakes/potential for deep fakes and VR. Added to the b3.js page. (1.5hr)

1/31-2/3:

  • Set up Slack, got more acquainted with the Google Site (Course Activities, Syllabus, other folks' journals) (1hr)

  • Fell into a bit of a rabbit hole with the Miami Murals Project, an AR app tied to a mural in Wynwood that shows the future of climate change in Miami; I wrote about it in VR in Pop Culture. I also emailed the developers, artists, and organization that created it. (2.5hr)

  • Brainstormed, wrote up two project titles, descriptions (1.5 hr)

  • Watched some videos (1, 2, 3) about network visualization in VR, then read the comment sections for instructions. Ended up at this GitHub repo, which has code that converts Gephi graph visualizations into interactive VR visualizations. Downloaded Unity and played around with it a little bit. Also read up on this GitHub repo, another network visualization project using the Twitter API. (1hr)

  • Looked at Twitter visualizations, wrote up project description for Twitter visualization modification (1hr)

  • Downloaded and installed Unity, played around with it (1hr)

Project Titles and Descriptions:

  • Sea the Change (AR for climate change visualization; name a work in progress)

    • AR app for mobile that uses location data to visualize the outcome of climate catastrophe in the user's location. For example, in a locale near sea level, it would show stormy waves; in an area likely to become an arid desert, it would show a sandstorm. In addition, the app could prompt the user and those around them to consider some questions. Examples:

      • How do you plan on surviving the flood? Can you swim?

      • Will you stay in the desert? If you plan on leaving, will you drive?

      • Do you know how to farm? Can you sustain yourself and your loved ones without today's infrastructure?

    • After these questions, the app would use location data to display nearby environmental justice organizations and their contact information. It would show the user how to become involved. If there is critical legislation up for debate in the locale, the app would display ways of supporting it (call a senator, mayor, etc.).

    • Three things I would do to make this happen:

      • Research data sources on predicted sea-level rise and other effects of climate change (coordinates that will be flooded, etc.), figure out if it's possible to map this onto user location data.

      • Research other AR projects that alter the environment that the user is in rather than simply add disparate objects to the "real" environment.

      • Come up with a program that searches for organizations/legislation data based on location and serves it to the user in a reader-friendly way.

    • Class activity:

      • The class could walk around College Hill/RI, look up their coordinates, and see where sea level will be. We can talk as a class about how we plan on dealing with climate catastrophe.
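The location-to-scene mapping described above could be sketched roughly as follows. Everything here is illustrative: the sea-level-rise figure, the rainfall threshold, and the scene names are invented placeholders, and a real app would pull projections from published climate datasets rather than hard-coded constants.

```python
# Hypothetical core lookup for "Sea the Change": choose an AR scene
# from the conditions at the user's coordinates. All numbers below are
# made up for illustration, not real climate projections.

PROJECTED_SEA_LEVEL_RISE_M = 1.5  # assumed projection for the planning horizon

def choose_scene(elevation_m, annual_rainfall_mm):
    """Pick which AR visualization to show at the user's location."""
    if elevation_m <= PROJECTED_SEA_LEVEL_RISE_M:
        return "stormy_waves"   # low-lying area: show the flooding scene
    if annual_rainfall_mm < 250:
        return "sandstorm"      # arid projection: show the desertification scene
    return "baseline"           # no dramatic change modeled here

# A spot 0.8 m above sea level gets the flood scene:
print(choose_scene(0.8, 1200))   # -> stormy_waves
print(choose_scene(40.0, 100))   # -> sandstorm
```

The same lookup result could then drive which set of prompt questions and which local organizations the app surfaces.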

  • PowerNexus

    • VR app for HMD that would let users enter "conflict maps": visualizations of political conflict that represent the actors at play and the networks of power surrounding the conflict. I'm envisioning a complicated network of ties represented in 3D; the user could interact with any tie to expand an explanation of the connection between the actor and the conflict. For example, users could visualize the political upheaval in Venezuela and explore the networks of power at play by seeing how the US, OPEC, and others connect to the conflict. The user would be positioned at the center of the nexus/web of connections, could explore each connection by interacting with it in some way (clicking? touching?), and would then see and hear an explanation of its role in the conflict. This could hypothetically be a new way to experience journalism and academic work.

    • Three things to do:

      • Learn more about news/storytelling tools that have been used in VR, and if they're doing network visualization at all.

      • Pick a situation I want to explore, do background research, and then decide how to record and process the data so that it can later be visualized. Related technologies: d3.js, b3.js, three.js

      • Potentially, I could create a program that scrapes news sites like the Times/BBC/Al-Jazeera/Washington Post for stories and connections between actors, continuously recording those connections. The data could then be used to let users choose an actor and visualize that actor's connections to others.

    • Something the class could do:

      • Think of a news story, think of connections / key actors they would want to portray in the news story, and try to draw out a "concept map" of the issue they want to address. Then, try to come up with a way to record the data of those relationships and visualize it in 3D.
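The conflict-map idea above needs some data model before it can be visualized in 3D. Here is one minimal way to represent it, sketched in Python; the actor names and annotation strings are placeholders standing in for the background research, not researched claims.

```python
# Minimal data-model sketch for PowerNexus: actors are nodes, annotated
# ties are edges. The entries below are illustrative placeholders; a real
# map would come from the background-research step described above.

conflict_map = {
    ("US", "Venezuela"): "sanctions and diplomatic pressure",
    ("OPEC", "Venezuela"): "oil-production quotas and membership",
}

def connections(actor):
    """Return every tie touching `actor`, paired with its explanation --
    the text a user would see/hear on selecting that edge in VR."""
    return [
        (a if a != actor else b, note)
        for (a, b), note in conflict_map.items()
        if actor in (a, b)
    ]

# With the user "standing at" Venezuela, these are the edges radiating out:
print(connections("Venezuela"))
```

A 3D layer (three.js or Unity) would then just draw one edge per tuple and bind the annotation string to the interaction event.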

  • Expanding Twitter Visualizations

    • This project could expand existing Twitter network visualizations to track trends chronologically. For example, one could pick a meme or a hashtag and track its network over time.

    • Three things to do:

      • Look at existing Twitter visualizations, pick one to build off of, and figure out which parts of each are most useful

      • Figure out how to add chronological components to Twitter visualizations (how to modify visualization/data in order to make this happen, ideally at toggle-able timescales)

      • Create a program that queries the Twitter API for any trend and tracks it over time.

    • Class activity:

      • Try out existing Twitter visualizations, talk about their merits and demerits
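The chronological piece described above boils down to grouping tweets into time windows so the network can be replayed step by step. A minimal sketch, with invented timestamps; changing which fields get zeroed in `replace()` (e.g. also zeroing `hour`) would give the coarser, toggle-able timescales mentioned above.

```python
# Bucket tweets mentioning a hashtag/meme by the hour they were posted,
# so each bucket can become one frame of a chronological visualization.
from collections import defaultdict
from datetime import datetime

def bucket_by_hour(tweets):
    """Group (timestamp, text) pairs by the hour they were posted."""
    buckets = defaultdict(list)
    for ts, text in tweets:
        key = ts.replace(minute=0, second=0, microsecond=0)  # floor to the hour
        buckets[key].append(text)
    return dict(buckets)

# Illustrative data only:
tweets = [
    (datetime(2019, 2, 10, 9, 15), "tweet A"),
    (datetime(2019, 2, 10, 9, 45), "tweet B"),
    (datetime(2019, 2, 10, 11, 5), "tweet C"),
]
grouped = bucket_by_hour(tweets)
# A and B share the 9:00 bucket; C sits alone in the 11:00 bucket.
```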


Project milestone for 2/14:

  • complete comparison of different visualizations and write it up

  • try a Unity tutorial

  • pick a style of visualization

Project milestone for 2/21:

  • recreate one of the visualizations, play around with it (change terms in the server/tweet-fetching algorithm, play around with the design)

2/10:

  • Researched VR visualizations and compiled them into a list (2 hrs)

  • Organized comparison of VR Twitter visualizations (2hrs)

  • Went to VR lab, played with Vive (1hr)

  • Finished Unity tutorial, played around some (2hr)

Self Evaluation:

  • I think it's clear from my journal that my vision has shifted significantly since the beginning of the class, when I was a total newcomer.

    • It took a while for me to realize what could be feasible and what would be too difficult to accomplish (e.g., an AR app visualizing climate change was probably too ambitious for a first-ever VR project, but a modification of a Twitter visualization is more realistic).

  • It took a lot of watching videos/Googling to understand the potential of VR projects. Once I found some code on Github, it became much easier to understand what was possible for me.

  • Seeing examples of Twitter visualizations and the code made it much easier to understand how it would be possible/what tech I would need to learn.

  • Conducting the comparison is a helpful tool for thinking about how I want to move forward, but some questions would require user testing to answer, such as what these visualizations actually help with and what the user gets out of each of them.

  • The milestones seem attainable, as long as I keep up with them. I am keeping up pretty well now, so I'm not super concerned with achieving them (at least the upcoming three milestones).

  • Room for Growth:

    • I could be contributing more to the Wiki, but I'm not totally sure where the info I'm finding would fit. Maybe I need to make a separate page for this stuff? Or just keep it on my page?

    • Not all of my work is easy to document/quantify, such as playing with the Vive or Unity, but I think a lot of this has to do with how new the tech is to me. Once I get more acquainted it will be easier for me to contribute/quantify my work.

2/20:

  • Finished Unity tutorial (1hr)

  • Re-created this Twitter viz. Had some trouble with the visualization. (1hr)

    • I liked that it was built mostly with React and three.js, which made it easy for me to troubleshoot and understand.


2/24

  • Ran demo of this visualization (1.5 hr)

    • Wasn't able to get it to work on the Oculus, but was able to populate the data in real time via Python

    • Need help getting it to work on the Oculus.

  • Played around with the Unity tutorial on headsets more (1hr)


2/25

  • Finished presentation ppt (1hr)

2/27

  • Added to visualization comparison (1hr)



4/1

  • edited Tweet Towers stats/metrics based on user input (2hrs)

  • added floor plane (1hr)

4/2

  • played around with cursor/reticle (2.5hrs)

4/3-4/10

  • tried implementing reticle (4hr)

  • looked into other reticle examples (1hr)

  • looked into Twitter metrics to use for qualifying incoming tweets (1hr)
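Qualifying incoming tweets by metrics could look something like the sketch below. The field names mirror counts the Twitter API exposes (retweet, favorite, follower counts), but the weights and threshold here are invented for illustration; they are not the values the project actually settled on.

```python
# Hypothetical filter: keep an incoming tweet only if a weighted sum of
# its engagement metrics clears a threshold. Weights/threshold are
# placeholders, not the project's final choices.

WEIGHTS = {"retweet_count": 2.0, "favorite_count": 1.0, "follower_count": 0.001}
THRESHOLD = 10.0

def qualifies(tweet):
    """Score a tweet dict and keep it only if the weighted sum clears the bar."""
    score = sum(WEIGHTS[k] * tweet.get(k, 0) for k in WEIGHTS)
    return score >= THRESHOLD

# 4 retweets, 3 favorites, 500 followers -> score 11.5, which qualifies:
print(qualifies({"retweet_count": 4, "favorite_count": 3, "follower_count": 500}))
```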

4/10-4/17

  • discussed and evaluated potential Tweet metrics, implemented metrics (2hr)

  • researched raycaster examples (2hrs)

  • tried implementing raycaster (2hrs)
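The raycaster work above was in three.js, but the underlying test is simple enough to sketch in plain Python: a gaze ray from the camera either hits a node's (assumed spherical) hitbox or it doesn't. three.js's `Raycaster` performs an equivalent intersection test against real geometry and sorts hits by distance; this is just the math for one sphere.

```python
# Ray/sphere intersection: does a gaze ray from `origin` along the
# normalized `direction` pass within `radius` of `center`?

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if the ray intersects the sphere. `direction` must be unit length."""
    # vector from ray origin to sphere center
    oc = [c - o for o, c in zip(origin, center)]
    # length of oc projected onto the ray direction
    t = sum(a * b for a, b in zip(oc, direction))
    if t < 0:
        return False  # sphere is behind the viewer
    # squared distance from sphere center to the ray
    d2 = sum(a * a for a in oc) - t * t
    return d2 <= radius * radius

# Looking straight down -z at a node 5 units away, slightly off-center:
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0.2, -5), 0.5))  # -> True
```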

4/18-4/25

  • created and styled homepage (2 hrs)

  • created proof of concept (1.5 hrs)

  • user testing: proof of concept (4 hrs)

    • showed proof of concept to 5 users from previous study; engaged in conversation about concerns and pain points

4/28-5/4

  • adjustments to Tweet metrics based on proof of concept feedback (1.5hr)

  • tried to implement raycaster and reticle (2hr)

  • set up hosting on heroku (1hr)

  • tested controls --> realized that device orientation wasn't working --> tried to implement orientation controls (2hrs)

    • this was weird because the original repo said that device orientation controls worked; I couldn't get them to work on my device, which meant it still required touch to navigate

    • I had a lot of difficulty figuring out how to fix this broken feature; it's unclear what the next steps are

5/4-5/7

  • Created presentation and poster (1.5hrs)

  • wrote journal entries, plus the social media data section in Data Types and the VR + Social Media section (2hr)


VR comparison