Subjective orientation in VR audio
  This project will establish a method for capturing listeners' experiences of spatial audio in VR. Listeners wearing a Head Mounted Display (HMD) will be presented with sounds and asked to indicate each sound's perceived position in both orientation and depth. This information will provide insight into how effective existing Head-Related Transfer Functions (HRTFs) are for binaural presentation, and will assist in identifying approaches that are effective for communicating auditory spatial cues. A VR testing environment and an initial method have already been developed by Dr Julian Villegas at the University of Aizu in Japan, and this funding would be used to establish the approach's reliability and extend the technique further in preparation for a joint grant application.

  Traditional methods of capturing spatial listening experiences have focused on physical anatomy, in the erroneous belief that hearing is identical to listening. Many factors affect auditory spatial perception: levels of listening, reproduction hardware, noise floor, presbycusis, general health, cognitive load, levels of distraction, type of content, and even the choice of task, amongst many others. The ability to identify where listeners perceive a sound as emanating from could also be utilised for diagnosing hearing impairments as well as head injuries. The technique could be important in emergency situations where spatial identification of sound sources is essential, such as firefighting or other evolving life-threatening scenarios where orientation is key. Both VR simulations and AR support systems could be tested under different cognitive loads to develop novel forms of sonification that improve reaction times, spatial orientation, and navigation.
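As an illustrative sketch only (not the project's actual method), one common way to score such a pointing task is the great-circle angle between the target direction and the listener's indicated direction, with a separate relative error for depth. The function names and the azimuth/elevation convention below are assumptions:

```python
import math

def angular_error(target_az, target_el, resp_az, resp_el):
    """Great-circle angle (degrees) between target and response directions.

    Angles are in degrees: azimuth in the horizontal plane,
    elevation above it (a common, but assumed, convention).
    """
    ta, te = math.radians(target_az), math.radians(target_el)
    ra, re = math.radians(resp_az), math.radians(resp_el)
    # Convert each direction to a unit vector on the sphere.
    v1 = (math.cos(te) * math.cos(ta), math.cos(te) * math.sin(ta), math.sin(te))
    v2 = (math.cos(re) * math.cos(ra), math.cos(re) * math.sin(ra), math.sin(re))
    # Clamp the dot product to guard against floating-point drift.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v1, v2))))
    return math.degrees(math.acos(dot))

def depth_error(target_dist, resp_dist):
    """Signed relative distance error (positive = listener overestimated)."""
    return (resp_dist - target_dist) / target_dist
```

For example, a listener who points 90° to the side of a frontal source scores `angular_error(0, 0, 90, 0) == 90.0` degrees, and a 2.5 m response to a 2 m source gives a depth error of +0.25.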

  • Start Date:

    1 June 2022

  • End Date:

    28 August 2023

  • Activity Type:

    Externally Funded Research

  • Funder:

    Arts & Humanities Research Council

  • Value:


Project Team