The Art & Science Interaction Lab is a modular research facility for realizing, analyzing, and testing AR and VR experiences. It is located in the brand-new building ‘De Krook‘ in the city center of Ghent. The 10 x 10 x 6 m room is equipped with state-of-the-art audiovisual equipment, motion tracking, and two EEG headsets. This infrastructure makes it possible to create, deploy, and measure a wide range of interactive AR and VR experiences.
Art & Science Lab – 360°
The Art & Science Lab is a collaboration between three Ghent University research groups (IDLab-MEDIA, IPEM and mict) and is funded under the medium-scale research infrastructure program governed by the Research Foundation – Flanders (FWO).
The Art & Science Lab serves as a central hub between several concert halls in Ghent: the facility is connected via 10 Gbps fiber to ‘De Vooruit’, ‘De Minard’ and ‘Het Wintercircus’, enabling interaction between these venues. This connection enables, among other things:
- Real-time streaming of multi-track audio from one facility to another (e.g., enabling a real-time experience in the Art & Science Lab based on a live stream from one of the concert halls);
- Real-time motion capture interactivity between facilities (e.g., creating interaction between two facilities by generating real-time avatars based on real-time motion capture);
Furthermore, the Art & Science Lab is connected directly to the Ghent University data center, enabling fast offloading of real-time data for storage or cloud-based processing.
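In essence, the inter-venue links above carry timestamped frames of multi-channel data. As a rough illustration of how a multi-track audio frame could be serialized for such a stream (the header layout and field sizes here are invented for the sketch, not the lab's actual protocol):

```python
import struct

# Hypothetical wire format: sequence number, timestamp (µs),
# number of tracks, samples per track, then interleaved-by-track
# 16-bit PCM. Invented for illustration only.
HEADER = struct.Struct("!IQHH")

def pack_frame(seq, timestamp_us, tracks):
    """Pack equally sized 16-bit PCM tracks into one datagram payload."""
    n_samples = len(tracks[0])
    payload = b"".join(struct.pack(f"!{n_samples}h", *t) for t in tracks)
    return HEADER.pack(seq, timestamp_us, len(tracks), n_samples) + payload

def unpack_frame(datagram):
    """Inverse of pack_frame: recover sequence, timestamp, and tracks."""
    seq, ts, n_tracks, n_samples = HEADER.unpack_from(datagram)
    body = datagram[HEADER.size:]
    fmt = struct.Struct(f"!{n_samples}h")
    tracks = [list(fmt.unpack_from(body, i * fmt.size)) for i in range(n_tracks)]
    return seq, ts, tracks
```

The sequence number and timestamp let the receiver detect loss and re-align tracks, which is what makes real-time cross-venue interaction feasible over a dedicated 10 Gbps link.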
Audio wave field synthesis
Within the Art & Science Lab, a dedicated audio wave field synthesis system has been installed and calibrated. Wave field synthesis makes it possible to project audio sources into 3D space, so that each audio track within a multi-track recording can be (re-)positioned in real time.
The system consists of:
- a Barco IOSONO core, conducting the wave field synthesis;
- 64 digital channels;
- 62 speakers;
- 9 amplifiers;
- acoustic insulation.
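To give a feel for the underlying principle, the following is a highly simplified sketch of point-source rendering: each virtual source maps to a per-speaker delay (distance divided by the speed of sound) and a 1/r amplitude. A production WFS renderer such as the IOSONO core additionally applies driving-function filtering and array windowing; the function and geometry here are purely illustrative.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def point_source_driving(source, speakers):
    """Per-speaker (delay in seconds, linear gain) for a virtual point source.

    Simplified delay-and-attenuate model: delay grows with distance,
    amplitude falls off as 1/r. Illustrative only, not the IOSONO API.
    """
    driving = []
    for sp in speakers:
        r = math.dist(source, sp)
        driving.append((r / SPEED_OF_SOUND, 1.0 / max(r, 1e-6)))
    return driving
```

Repositioning a track in real time then amounts to recomputing these delays and gains for all 62 speakers whenever the virtual source moves.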
Qualisys motion capture
A high-end Qualisys motion capture system allows (full) body tracking with infrared markers and cameras. The cameras can be repositioned within the room to accommodate a wide range of motion capture setups. Practical applications include, but are not limited to:
- full body skeleton tracking;
- high-detail tracking (e.g., tracking the fingers of a guitar or piano player);
- rigid body tracking.
The current motion capture system consists of:
- 10x daisy-chained Qualisys Oqus cameras;
- integration with Barco IOSONO;
- integration with the Unreal Engine.
In the future, the number of cameras will be increased to provide denser coverage and fewer camera occlusions.
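Rigid body tracking, mentioned above, boils down to recovering a rotation and translation from a set of labeled markers. The standard least-squares solution is the Kabsch algorithm; a minimal sketch (function names are my own, not part of the Qualisys software):

```python
import numpy as np

def fit_rigid_body(ref, obs):
    """Least-squares pose (R, t) such that obs ≈ ref @ R.T + t.

    Kabsch algorithm: ref holds the marker layout in the body frame
    (N x 3), obs the tracked marker positions in the room frame (N x 3).
    """
    ref = np.asarray(ref, float)
    obs = np.asarray(obs, float)
    c_ref, c_obs = ref.mean(axis=0), obs.mean(axis=0)
    # Cross-covariance of the centered marker clouds.
    H = (ref - c_ref).T @ (obs - c_obs)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against reflections (det(R) must be +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_obs - R @ c_ref
    return R, t
```

Because the solution is closed-form (one small SVD per body per frame), it comfortably runs at camera frame rates, which is what makes real-time avatar and object tracking practical.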
Mobile EEG
Within the Art & Science Lab, two high-end mobile (backpack) EEG systems make it possible to track and measure the impact of stand-alone as well as interactive experiences on the human brain, and to create unique immersive experiences based on real-time biofeedback.
Practical applications include, but are not limited to:
- Sonification by means of real-time EEG biofeedback
- Synchronized measurement of movement and neurophysiological activity for interactive (audio, VR, AR) experiences
The current EEG system consists of:
- 2x synchronized high-end 64 channel EEG systems
- 2x mobile EEG extension allowing free roaming
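As a minimal sketch of the sonification application above: the relative power of the alpha band (8–12 Hz) of one EEG channel is mapped to a 0..1 control value that could drive a synthesis parameter. The band edges, sampling rate, and function names are illustrative assumptions, not the lab's actual pipeline.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Total power of `signal` within [lo, hi] Hz, via an FFT periodogram."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

def alpha_to_gain(signal, fs):
    """Map relative alpha (8-12 Hz) power to a 0..1 control value."""
    alpha = band_power(signal, fs, 8.0, 12.0)
    total = band_power(signal, fs, 1.0, 40.0)
    return alpha / total if total > 0 else 0.0
```

In a real-time setting this would run on short sliding windows (e.g., one or two seconds) of the incoming EEG stream, so the sonification reacts with low latency to changes in brain activity.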
Compelling 4K projection
Projection will enable high-end immersive visualization within the Art & Science Lab. The ultimate goal is to immerse people in the environment by triggering all senses, of which vision is of utmost importance.
The visualization setup will consist of:
- acoustically transparent projection screen;
- high-end 4K projector;
- high-end GPU rendering enabling real-time visualizations.
Selected posts related to this infrastructure:
- Funding granted for imec.icon project ILLUMINATE (July 13, 2018): ILLUMINATE – Interactive streaming and representation for totally immersive virtual reality applications.
- Best Poster Extended Abstract Award at the 16th International Conference on Human-Computer Interaction (June 27, 2014), for the poster paper "BilliARt – AR Carom Billiards, Exploration of an AR Framework".