Objective:
Create an immersive AR/VR experience in which the music is represented as a 3D staircase: each step corresponds to a note, and a ball moves along the steps to indicate the note currently being played. The design should let users see and compare repeated segments of the melody for both the left and right hands, inspired by visual tools such as the Music Animation Machine.
Define the Visual Style:
Gather reference materials (e.g., Music Animation Machine projects) to decide on colors, textures, and overall layout.
Define the visual differentiation between AR and VR environments to support the comparison component.
Determine the representation of musical elements (pitch, duration, velocity) and ensure that the visualization remains consistent between AR and VR.
Wiki deliverable
Establish Requirements:
Decide how the musical data (notes, timing, dynamics) will be encoded into the 3D model.
Define how repeated segments and dual-hand comparisons will be visually differentiated (e.g., using contrasting colors or side-by-side layouts).
Create a New Unity Project:
Set up the project with the appropriate XR settings for Quest 3.
Import Required SDKs:
Add the Oculus Integration package and XR Interaction Toolkit.
Include MIDI libraries (e.g., DryWetMIDI for parsing .mid files; MidiJack if live MIDI device input is needed) and data-handling tools (e.g., JSON support).
Configure Build Settings:
Ensure the project is set up for Android (for Quest 3) and optimized for performance.
Adjust settings for smooth transitions between AR and VR modes.
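As a concrete starting point, the sketch below toggles between VR and AR presentation. It assumes the Oculus Integration package is installed and the scene contains an OVRPassthroughLayer component; the class and field names (VisualizationModeSwitcher, passthrough, xrCamera) are placeholders of our own, and the exact passthrough setup should be verified against the current Meta XR documentation.

    using UnityEngine;

    // Hypothetical helper that switches the scene between VR (skybox) and
    // AR (passthrough). OVRPassthroughLayer is the Oculus Integration
    // component that composites the headset camera feed behind the scene.
    public class VisualizationModeSwitcher : MonoBehaviour
    {
        public OVRPassthroughLayer passthrough; // assign in the Inspector
        public Camera xrCamera;

        public void SetArMode(bool ar)
        {
            passthrough.enabled = ar;
            // In AR the camera must clear to transparent black so the
            // passthrough feed shows through; in VR we restore the skybox.
            xrCamera.clearFlags = ar ? CameraClearFlags.SolidColor : CameraClearFlags.Skybox;
            if (ar) xrCamera.backgroundColor = Color.clear;
        }
    }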
MIDI Data Parsing:
Import a MIDI file of the Bach fugue or another musical piece.
Use a MIDI parsing library to extract note information such as pitch, duration, and timing; since MidiJack is geared toward live MIDI device input, a file-oriented parser such as DryWetMIDI is the better fit here (see the sketch below).
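A minimal sketch of the parsing step, assuming the DryWetMIDI library (Melanchall.DryWetMidi) for file reading; the ParsedNote container and MidiLoader class are our own names, and the track-to-hand assignment is a convention for two-track fugue files. The tempo map converts MIDI ticks into seconds, which the animation code needs later.

    using System.Collections.Generic;
    using Melanchall.DryWetMidi.Core;
    using Melanchall.DryWetMidi.Interaction;

    // Minimal container for the fields the visualization needs.
    public struct ParsedNote
    {
        public int Pitch;        // MIDI note number, 0-127
        public float StartSec;   // onset time in seconds
        public float LengthSec;  // duration in seconds
        public int Velocity;     // 0-127
        public int Track;        // 0 = right hand, 1 = left hand (by convention)
    }

    public static class MidiLoader
    {
        public static List<ParsedNote> Load(string path)
        {
            var midiFile = MidiFile.Read(path);
            var tempoMap = midiFile.GetTempoMap();
            var result = new List<ParsedNote>();
            int track = 0;
            foreach (var trackChunk in midiFile.GetTrackChunks())
            {
                foreach (var note in trackChunk.GetNotes())
                {
                    // Convert MIDI ticks to wall-clock time via the tempo map.
                    var start = note.TimeAs<MetricTimeSpan>(tempoMap);
                    var length = note.LengthAs<MetricTimeSpan>(tempoMap);
                    result.Add(new ParsedNote
                    {
                        Pitch = note.NoteNumber,
                        StartSec = start.TotalMicroseconds / 1_000_000f,
                        LengthSec = length.TotalMicroseconds / 1_000_000f,
                        Velocity = note.Velocity,
                        Track = track
                    });
                }
                track++;
            }
            return result;
        }
    }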
Mapping Musical Elements to 3D Geometry:
Define how each note’s pitch translates into the vertical position or size of a staircase step.
Determine how note duration might affect the length or width of each step.
Create a data structure (like a JSON or scriptable object) that holds the parsed note data, segregated by left-hand and right-hand parts.
Integrate data analysis techniques to detect repeated sequences and visualize them distinctly.
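One possible concrete mapping, reusing the ParsedNote container from the parsing sketch above; every scale constant here is a placeholder to tune during prototyping.

    using UnityEngine;

    // One possible note-to-geometry mapping: pitch -> height, onset -> depth,
    // duration -> step length along the time axis.
    public static class StepMapper
    {
        const float HeightPerSemitone = 0.05f; // meters of rise per semitone
        const float MetersPerSecond   = 1.0f;  // staircase advance per second of music
        const int   ReferencePitch    = 60;    // middle C sits at y = 0

        public static Vector3 StepPosition(ParsedNote n)
        {
            float y = (n.Pitch - ReferencePitch) * HeightPerSemitone;
            float z = n.StartSec * MetersPerSecond;
            float x = n.Track == 0 ? 0.5f : -0.5f; // one lane per hand
            return new Vector3(x, y, z);
        }

        public static Vector3 StepScale(ParsedNote n)
        {
            // Longer notes become longer steps along the time axis; velocity
            // could additionally drive width or brightness.
            return new Vector3(0.4f, 0.02f, n.LengthSec * MetersPerSecond);
        }
    }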
Staircase Construction:
Develop scripts to dynamically generate a staircase where each “step” represents a note.
Optionally, use procedural generation techniques to adapt the visualization based on the piece’s complexity.
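A minimal generation sketch building on the mapping above; stepPrefab is assumed to be a unit cube with a default material, assigned in the Inspector.

    using System.Collections.Generic;
    using UnityEngine;

    // Instantiates one step per note using StepMapper, coloring the two
    // hands differently so the lanes are easy to tell apart.
    public class StaircaseBuilder : MonoBehaviour
    {
        public GameObject stepPrefab;
        public Material leftHandMat;
        public Material rightHandMat;

        public List<GameObject> Build(List<ParsedNote> notes)
        {
            var steps = new List<GameObject>(notes.Count);
            foreach (var n in notes)
            {
                var step = Instantiate(stepPrefab, transform);
                step.transform.localPosition = StepMapper.StepPosition(n);
                step.transform.localScale = StepMapper.StepScale(n);
                step.GetComponent<Renderer>().material =
                    n.Track == 0 ? rightHandMat : leftHandMat;
                steps.Add(step);
            }
            return steps;
        }
    }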
Ball Animation:
Create a ball (using a simple sphere mesh) that animates along the staircase.
Write scripts to update the ball’s position based on the current note in the MIDI playback.
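A sketch of the ball update, assuming the note list is sorted by onset time; playbackTime is supplied by the synchronization component sketched under "Synchronizing Audio Playback" below.

    using System.Collections.Generic;
    using UnityEngine;

    // Moves a sphere toward the step of the note that is currently sounding.
    public class BallFollower : MonoBehaviour
    {
        public Transform ball;          // a simple sphere mesh
        public float smoothing = 10f;   // higher = snappier movement
        List<ParsedNote> notes;
        int current;

        public void SetNotes(List<ParsedNote> sortedByOnset)
        {
            notes = sortedByOnset;
            current = 0;
        }

        public void UpdatePlayback(float playbackTime)
        {
            if (notes == null || notes.Count == 0) return;
            // Advance to the latest note whose onset has passed.
            while (current + 1 < notes.Count && notes[current + 1].StartSec <= playbackTime)
                current++;
            Vector3 target = StepMapper.StepPosition(notes[current]) + Vector3.up * 0.1f;
            ball.position = Vector3.Lerp(ball.position, target, smoothing * Time.deltaTime);
        }
    }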
Dual-Hand Visualization:
Create two separate visual tracks (lanes) to display the left-hand and right-hand melodies.
Use distinct color schemes or spatial separation to help users compare the two parts.
Highlight repeated sequences using visual cues (e.g., glowing edges or color changes).
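A naive sketch of repeated-segment detection: hash fixed-length windows of pitch intervals and mark any window whose pattern occurs more than once. Comparing intervals rather than absolute pitches makes the match transposition-invariant, which suits fugue subjects; the window length is a placeholder, and the function should be run separately on each hand's note list so the two voices don't interleave. The step renderers at the returned indices can then be given a glowing (emissive) material.

    using System.Collections.Generic;
    using System.Linq;

    public static class RepeatFinder
    {
        // Returns start indices of length-`window` note runs whose interval
        // pattern appears at least twice in the piece.
        public static HashSet<int> FindRepeats(List<ParsedNote> notes, int window = 8)
        {
            var seen = new Dictionary<string, List<int>>();
            for (int i = 0; i + window < notes.Count; i++)
            {
                // Key on the sequence of pitch intervals, e.g. "2,2,-4,...".
                var key = string.Join(",",
                    Enumerable.Range(i, window)
                              .Select(j => notes[j + 1].Pitch - notes[j].Pitch));
                if (!seen.TryGetValue(key, out var starts))
                    seen[key] = starts = new List<int>();
                starts.Add(i);
            }
            var repeated = new HashSet<int>();
            foreach (var starts in seen.Values.Where(s => s.Count > 1))
                foreach (var i in starts)
                    repeated.Add(i);
            return repeated;
        }
    }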
Synchronizing Audio Playback:
Integrate the Unity Audio system to play the musical piece.
Develop a synchronization mechanism that uses MIDI note onset times to trigger ball movements and visual transitions.
Implement synchronization adjustments to accommodate both AR and VR modes.
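A sketch of the simplest synchronization scheme: drive the visualization clock from AudioSource.time rather than Time.time, so pausing or scrubbing the audio keeps the visuals in step automatically. This assumes the audio was rendered from the same MIDI file; any fixed offset between the two can be absorbed in a calibration constant.

    using UnityEngine;

    // Feeds the audio playback position to the ball every frame.
    public class PlaybackSync : MonoBehaviour
    {
        public AudioSource audioSource; // plays an audio rendering of the piece
        public BallFollower follower;
        public float offsetSec = 0f;    // manual audio/visual calibration

        void Update()
        {
            if (audioSource.isPlaying)
                follower.UpdatePlayback(audioSource.time + offsetSec);
        }
    }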
User Interaction:
Implement controls that allow users to pause, rewind, or change the playback speed.
Use the Quest 3 controllers/hand tracking to let users explore the visualization by zooming in/out or rotating the view.
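A sketch of basic transport controls using the Oculus Integration input API (OVRInput); the button assignments are arbitrary placeholders. Note that changing AudioSource.pitch alters speed and pitch together, a known limitation of this quick approach.

    using UnityEngine;

    // Pause/resume, rewind, and playback-speed control on the Quest 3
    // controllers. OVRInput.Button.One/Two map to the A/B-style buttons.
    public class PlaybackControls : MonoBehaviour
    {
        public AudioSource audioSource;

        void Update()
        {
            if (OVRInput.GetDown(OVRInput.Button.One)) // pause/resume
            {
                if (audioSource.isPlaying) audioSource.Pause();
                else audioSource.UnPause();
            }
            if (OVRInput.GetDown(OVRInput.Button.Two)) // rewind 5 seconds
                audioSource.time = Mathf.Max(0f, audioSource.time - 5f);

            // Tilt a thumbstick left/right to slow down or speed up playback.
            float stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick).x;
            if (Mathf.Abs(stick) > 0.5f)
                audioSource.pitch = Mathf.Clamp(
                    audioSource.pitch + stick * Time.deltaTime, 0.5f, 2f);
        }
    }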
UI Elements:
Design intuitive UI panels in 3D space for user interaction, ensuring they don’t obstruct the primary visualization.
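A sketch of one way to keep a panel readable without occluding the staircase: pin a world-space Canvas at a lateral offset from the user's head. This assumes the Canvas render mode is set to World Space and the script sits on its root.

    using UnityEngine;

    // Keeps a world-space UI panel floating beside the user's view so it
    // never sits between the eyes and the staircase.
    public class FloatingPanel : MonoBehaviour
    {
        public Transform head; // the XR camera transform
        public Vector3 offset = new Vector3(0.6f, -0.2f, 1.2f); // right, down, forward

        void LateUpdate()
        {
            transform.position = head.TransformPoint(offset);
            // Face the user (looking away from the head avoids mirrored UI).
            transform.rotation = Quaternion.LookRotation(transform.position - head.position);
        }
    }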
Prototype Testing:
Run the visualization in the Unity editor and on a Quest 3 development build.
Test the synchronization between audio and visuals in both AR and VR environments.
Run user testing to compare the AR and VR visualizations on clarity, ease of interaction, and user preference.
Conduct the in-class activity:
Present a demo of both the AR and VR versions of the music visualization.
Allow classmates to interact with both setups for a few minutes each.
Ask users to identify repeated musical patterns or differences between the left- and right-hand melodies in both environments.
Conduct a follow-up discussion and gather feedback on usability and clarity.
Collect data using short surveys to compare user preferences and perceived visualization clarity in both AR and VR.
Performance Optimization:
Optimize the procedural generation and animation scripts for smooth performance on the Quest 3.
Profile and fine-tune the application to reduce latency and ensure a seamless user experience.
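One concrete optimization worth profiling: because the steps never move, their meshes can be merged into a single mesh to collapse hundreds of draw calls into a handful. The sketch below combines everything under one material, so per-hand coloring would need vertex colors or one combined mesh per hand; the ball and any highlighted steps should stay as separate objects.

    using System.Collections.Generic;
    using UnityEngine;

    // Bakes all static step meshes into one mesh on a target GameObject.
    public static class StepBatcher
    {
        public static void Combine(List<GameObject> steps, GameObject target, Material mat)
        {
            var combine = new CombineInstance[steps.Count];
            for (int i = 0; i < steps.Count; i++)
            {
                var mf = steps[i].GetComponent<MeshFilter>();
                combine[i].mesh = mf.sharedMesh;
                combine[i].transform = mf.transform.localToWorldMatrix;
                steps[i].SetActive(false); // originals no longer need rendering
            }
            // 32-bit indices in case the combined mesh exceeds 65k vertices.
            var mesh = new Mesh { indexFormat = UnityEngine.Rendering.IndexFormat.UInt32 };
            mesh.CombineMeshes(combine);
            target.AddComponent<MeshFilter>().sharedMesh = mesh;
            target.AddComponent<MeshRenderer>().sharedMaterial = mat;
        }
    }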
Final Adjustments:
Refine the visual details (e.g., lighting, textures) and user feedback mechanisms based on testing results.
Deployment:
Package the application for Quest 3 following Oculus’s guidelines.
Consider user documentation or tutorials to help new users navigate the experience.
Poster/presentation making
The proposed project clearly identifies deliverable additions to our VR Software Wiki.
Answer: 5/5
Explanation: The project will document the setup of the XR rig, procedural generation of 3D content, MIDI integration, and synchronization methods—all of which can serve as valuable wiki additions.
The proposed project involves passthrough or “augmented” in VR.
Answer: 4/5
Explanation: Although the focus is on a pure VR visualization, an optional passthrough overlay is planned as an additional point of comparison.
The proposed project involves large scientific data visualization along the lines of the "Scientific Data" wiki page and identifies the specific data type and software that it will use.
Answer: 3/5
Explanation: The project reframes musical data as a multi-dimensional dataset, treating MIDI files as scientific data with elements like pitch, timing, dynamics, and control changes. While it is not large-scale scientific data, it introduces a data-centric visualization approach akin to scientific analysis. Specifically, the MIDI data contains:
Note Information: Pitch (which note), duration (how long), and velocity (how hard the note is played).
Timing Data: When each note starts and ends.
Instrument Data: Specifies the instrument sound (e.g., piano, guitar).
Tempo and Timing: Beats per minute and time signatures.
Control Changes: Such as volume, pitch bend, or modulation.
The proposed project has a realistic schedule with explicit and measurable milestones at least each week and mostly every class.
Answer: 5/5
Explanation: The project timeline includes clear milestones with dates ensuring weekly progress and measurable deliverables.
The proposed project explicitly evaluates VR software, preferably in comparison to related software.
Answer: 4/5
Explanation: The project now includes an explicit AR/VR comparison to evaluate how each medium impacts visualization and user experience. This comparative analysis enhances the evaluation aspect.
The proposed project includes an in-class activity, which can be formative (early in the project) or evaluative (later in the project).
Answer: 4/5
Explanation: A demo session or in-class walkthrough of the visualization and interactive controls is planned as part of the project. Specifically, the activity will include:
Show a 2D music visualization (e.g., the Music Animation Machine) as a baseline for comparison.
Present a demo of both the AR and VR versions of the music visualization.
Allow classmates to interact with both setups for a few minutes each.
Ask users to identify repeated musical patterns or differences between the left- and right-hand melodies in both AR and VR environments.
Hold a follow-up discussion and gather suggestions for improving the visualization or interaction mechanisms.
The proposed project has resources available with sufficient documentation.
Answer: 5/5
Explanation: The project will utilize Unity, the Oculus Integration package, and existing MIDI parsing libraries, all of which have extensive documentation.
Reference
MIDI standard link