This project will improve the simulation of musical concert listening and ensemble performance experiences in VR. The innovation lies in integrating improved computational methods for acoustic scene rendering with improved empirical methods for measuring subjective ‘presence’. The project will work towards a biofeedback system that enables controlled regulation of users’ behavioural, cognitive and affective responses within (musical) VR environments.
The objective of this project is to develop an interdisciplinary scientific framework that allows the simulation, in VR, of aesthetically rich experiences of musical concert listening and performance in multi-user environments. The research will focus on four sub-goals:
- Improve the state of the art in the computational rendering of acoustic scenes in VR. The innovation lies in combining optimised sound-source positioning and room acoustics within dynamic multi-user environments.
- Develop a new empirical procedure to objectively measure, model and recognise the subjective feeling of presence, which is foundational to the user experience in VR. This procedure will take into account the behavioural, cognitive and affective dimensions of the experience.
- Develop a biofeedback control system that adjusts the accuracy of the auditory rendering of musical VR environments so as to optimise psychological presence in multi-user experiences.
- Design empirical experiments to test the efficacy of the biofeedback control system in modulating users’ behavioural, cognitive and affective responses towards optimal states.
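The closed loop implied by the third and fourth sub-goals can be sketched as follows. This is a minimal illustrative sketch, not the project's actual design: the presence estimator, the proportional controller, the gain and target values, and all function names are assumptions introduced here for illustration. A real system would replace the stand-in estimator with a model of presence built from behavioural, cognitive and affective measures.

```python
# Hypothetical sketch of a biofeedback control loop: a (stand-in) presence
# estimate drives a proportional controller that adjusts a single normalised
# rendering-fidelity parameter. All signals and names are illustrative.

def estimate_presence(signal_samples):
    """Stand-in presence estimator: mean of normalised samples, clipped to [0, 1]."""
    if not signal_samples:
        return 0.0
    mean = sum(signal_samples) / len(signal_samples)
    return max(0.0, min(1.0, mean))

def update_fidelity(fidelity, presence, target=0.8, gain=0.5):
    """Proportional step: raise rendering fidelity while presence is below target."""
    error = target - presence
    return max(0.0, min(1.0, fidelity + gain * error))

def run_loop(signal_stream, steps=10):
    """Run the closed loop over a sequence of per-step sample windows."""
    fidelity = 0.5  # initial rendering-accuracy setting (normalised)
    history = []
    for window in signal_stream[:steps]:
        presence = estimate_presence(window)
        fidelity = update_fidelity(fidelity, presence)
        history.append((presence, fidelity))
    return history
```

In this sketch, low estimated presence pushes rendering accuracy upwards, trading computational cost for immersion; the empirical experiments of the fourth sub-goal would test whether such an adjustment actually moves users towards the intended states.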
This research will be performed by an interdisciplinary team of researchers affiliated with the Institute of Psychoacoustics and Electronic Music (IPEM) and IDLab-MEDIA, both members of the Art and Science Interaction Lab consortium.