Created by Yanmi Yu, May 2025
AIive enables users to manipulate neural network parameters with virtual hands, providing real-time auditory feedback on parameter values. It integrates visualization, sonification, and direct manipulation in VR, offering an artistic and intuitive representation of AI.
This project presents a neural-network-based binaural sound propagation method for VR and AR applications. It generates acoustic effects for indoor 3D models, enhancing the realism of virtual environments.
This research introduces a real-time auralization pipeline that leverages three-dimensional Spatial Impulse Responses for multimodal research applications in VR requiring first-person vocal interaction.
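At their core, impulse-response-based auralization methods like the two above render a wet binaural signal by convolving a dry (anechoic) source signal with a per-ear impulse response for the current source/listener configuration. Below is a minimal sketch of that idea using NumPy, with short synthetic decaying-noise impulse responses standing in for measured or simulated Spatial Impulse Responses; the function name, signal lengths, and decay constant are illustrative assumptions, not details from either paper:

```python
import numpy as np

def auralize(dry, ir_left, ir_right):
    """Binaural auralization sketch: convolve a dry mono signal with
    left- and right-ear impulse responses (toy stand-ins for SIRs)."""
    return np.stack([np.convolve(dry, ir_left),
                     np.convolve(dry, ir_right)])

fs = 48_000
dry = np.zeros(fs // 10)
dry[0] = 1.0                              # unit click as the dry signal
t = np.arange(fs // 20) / fs
rng = np.random.default_rng(0)
ir_l = np.exp(-40 * t) * rng.standard_normal(t.size)  # toy decaying IR
ir_r = np.exp(-40 * t) * rng.standard_normal(t.size)

out = auralize(dry, ir_l, ir_r)
print(out.shape)  # (2, 7199): stereo, length len(dry) + len(ir) - 1
```

Direct convolution like this is far too slow for real-time use at interactive block sizes; production pipelines such as the one described above would use partitioned FFT convolution and interpolate between impulse responses as the listener moves.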
A scoping review by Bosman et al. (2023) examined 121 studies on audio's role in head-mounted display (HMD)-based virtual reality (VR). Findings indicate that while audio can significantly influence affective, cognitive, and motivational aspects of user experience, its impact on presence is inconsistent, and research on social experiences remains limited. The study highlights the need for standardized measures, especially concerning pleasantness, and calls for more descriptive research to fully understand audio's potential in VR environments.
This review highlights how advancements in hardware and software have enhanced the development of interactive soundscapes, emphasizing their role in enriching user experiences. It discusses various methods for combining sound and VR, including spatial audio techniques and music integration, and provides examples of musical VR projects to illustrate these applications. The review also addresses the challenge of balancing audio realism against computational efficiency, concluding with insights into future directions for sound design in VR.
This study investigated how ambient nature soundscapes and movement-triggered footstep sounds affect the sense of presence in virtual reality (VR). In two experiments, participants walked on a treadmill while experiencing a virtual park through a head-mounted display. Experiment 1 found that ambient soundscapes significantly enhanced presence and realism, whereas footstep sounds had minimal impact, likely due to low perceived synchrony. After improving the step-sound synchronization algorithm in Experiment 2, both audio elements contributed to increased presence and realism, with ambient soundscapes having a more substantial effect. These findings highlight the importance of well-integrated audio cues in enhancing VR immersion.
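The improved synchronization in Experiment 2 hinges on detecting footfalls quickly and exactly once per step. One common way to do this, sketched below, is peak detection on a vertical acceleration trace with a refractory period so a single footfall cannot trigger multiple sounds; the threshold values, signal source, and function name are assumptions for illustration, not details from the study:

```python
import numpy as np

def detect_heel_strikes(accel_z, fs, threshold=1.5, refractory_s=0.25):
    """Return sample indices of heel strikes, detected as local peaks in
    vertical acceleration above `threshold`, with a refractory period so
    one footfall triggers at most one footstep sound."""
    refractory = int(refractory_s * fs)
    strikes, last = [], -refractory
    for i in range(1, len(accel_z) - 1):
        if (accel_z[i] > threshold
                and accel_z[i] >= accel_z[i - 1]
                and accel_z[i] > accel_z[i + 1]
                and i - last >= refractory):
            strikes.append(i)
            last = i
    return strikes

# Toy trace: two simulated footfalls at 0.5 s and 1.0 s in a 2 s window.
fs = 100
t = np.arange(2 * fs) / fs
accel = 0.1 * np.random.default_rng(0).standard_normal(t.size)
accel[50] += 3.0   # footfall near t = 0.5 s
accel[100] += 3.0  # footfall near t = 1.0 s
print(detect_heel_strikes(accel, fs))  # → [50, 100]
```

In a real pipeline the sample playback latency matters as much as the detection itself: the study's finding that poorly synchronized footsteps added little to presence suggests end-to-end delay between detected strike and audible sound must stay below the threshold at which users perceive asynchrony.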
A study by Kamal et al. (2023) investigated how task difficulty in a virtual reality (VR) working memory (WM) environment affects the passive processing of irrelevant auditory stimuli. Sixteen young adults engaged in either an easy WM task involving nameable objects or a difficult task with abstract, non-nameable objects, all while wearing a 3D head-mounted VR device. During these tasks, irrelevant auditory stimuli were presented at intervals of 1.5 or 12 seconds. Event-related potentials (ERPs), specifically N1 and P2/P3a components, were measured to assess auditory processing. Findings revealed that ERP amplitudes were larger when auditory stimuli were presented slowly, regardless of task difficulty. However, higher performance on the easy WM task correlated with smaller ERP amplitudes, suggesting that even less demanding VR tasks can consume significant cognitive resources, thereby diminishing the processing of irrelevant auditory information. This highlights the immersive nature of VR and its potential to limit awareness of external auditory stimuli.
Researchers at the University of Oldenburg are utilizing virtual reality (VR) to investigate complex auditory perception processes. Through the AUDICTIVE program, funded by the German Research Foundation, they explore how VR can simulate real-life acoustic environments to study auditory cognition, attention, and memory. One project examines the realism of room acoustics in VR and the influence of visual perception on auditory experiences, aiming to understand how audio rendering in virtual settings affects social anxiety. Another project focuses on how visual cues in interactive audiovisual virtual environments impact auditory attention decoding and cortical tracking of speech, using mobile EEG technology in VR environments. These studies aim to enhance the ecological validity of auditory research by creating immersive, interactive virtual environments that closely mimic real-world listening situations.
Game technology is widely used for educational applications; however, despite the common use of background music in games, its effect on learning remains largely unexplored. This paper discusses how music played in the background of a computer-animated history lesson affected participants’ memory for facts. A virtual history lesson was presented to participants with different background stimuli (music or no music) to test the effect of music on memory. To test the role of immersion on memory and its possible relationship to the music, two different display systems (a 3-monitor display system or an immersive Reality Center) were used in the study. Overall, participants remembered a significantly higher number of facts using the 3-monitor display system, particularly if no background music was played in the second half of the history lesson. Conversely, for participants using the Reality Center, significantly higher recall of facts was found when participants listened to music in the second half of the history lesson. Cognitive load/overload and (un)familiarity with the technology are offered as explanations.