Art and Science Interaction Lab

The Art and Science Interaction Lab (“ASIL”) is a unique, highly flexible and modular “interaction science” research facility for designing, analysing and testing experiences and interactions in mixed virtual/augmented contexts, as well as for conducting research on next-generation immersive technologies. It brings together the expertise and creativity of engineers, performers, designers and scientists to create solutions and experiences that shape people's lives. The lab is equipped with state-of-the-art visual, auditory and user-tracking equipment, fully synchronized and connected to a central backend. This synchronization allows for highly accurate multi-sensor measurements and analysis.

The lab itself is a 10m x 10m room with a height of two floors (~6m), and is acoustically insulated. It is equipped with 82 individual speakers connected to a fully IP-based audio distribution system. The system can deliver highly realistic spatial audio projection by making use of the connected audio wavefield processor (Barco IOSONO). This audio system also allows for accurate recreation and simulation of room acoustics. In terms of visual modalities, the lab is equipped with 2 fully untethered HTC Vive Pro Eye and 2 tethered HTC Vive Pro devices, enabling free roaming across the full 10m x 10m area and multi-person VR. A 7m x 4m acoustically transparent projection screen, in combination with a 12,000 lumen 4K projector, delivers compelling and high-end immersive visualizations. Both the audio and visual systems are connected to a powerful state-of-the-art processing backend.

In terms of sensors, the Art and Science Interaction Lab is equipped with a Qualisys motion capture system consisting of 18 infrared and 5 RGB cameras, delivering high frame rate (>= 120fps) multi-person motion capture with < 1 mm³ accuracy. The system can also track fine-grained movements (e.g. finger movements while playing piano, or facial expressions). Furthermore, two untethered clinical-grade EEG headsets allow for dual 64-channel EEG measurements. The shielded cap with active electrodes allows the EEG headsets to be worn underneath a VR headset with minimal interference. Lastly, EMG, skin conductance and eye-tracking sensors complete the multi-sensory measurement system in the ASIL.

The Art and Science Interaction Lab team supports innovation in several key domains. In general, the team focuses on interaction research in virtualized environments, unraveling complex user interactions and experiences in order to design and create novel applications and interfaces. The application domains range from smart home appliances, health, safety and smart public places to more artistic and creative applications. Furthermore, the infrastructure is a key facility for fundamental research on virtual reality technologies (e.g. auralisation, virtual acoustics, 6-degrees-of-freedom VR, multi-person VR, …) and is connected via dark fiber to three concert halls in Ghent.

The team is an interdisciplinary consortium joining the expertise of three Ghent University research groups (IDLab, IPEM and mict) and has been co-funded under the medium-scale research infrastructure program governed by the Research Foundation Flanders (FWO). As such, the Art and Science Interaction Lab is a one-of-a-kind research facility targeting both industry and academia, delivering an interdisciplinary approach to measuring, analyzing and creating our next-generation appliances, interfaces and experiences.

Detailed technical overview

Interconnectivity

A 10Gbps dark fiber connection to ‘De Vooruit’, ‘De Minard’ and the future ‘Wintercircus’ allows for real-time shared experiences between the Art and Science Interaction Lab and three important cultural venues in Ghent. The Art and Science Interaction Lab is also directly connected over dark fiber to an off-site back-up facility.

Applications:

  • Real-time streaming of multi-track audio between facilities enables the (re-)creation of a cultural experience in the Art and Science Interaction Lab, as well as the creation of novel experiences based on real-time audio feeds.
  • Real-time motion capture allows for novel experiences in which motion captured in the Art and Science Interaction Lab can be used in real time in the cultural venues (e.g. to drive virtual avatars or experiences); a minimal streaming sketch follows this list.
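
The venue-to-venue links themselves are not described in detail here, so the sketch below is only a minimal illustration of the kind of low-latency state streaming involved: motion-capture frames are packed into small UDP datagrams and pushed to a remote endpoint at the capture rate. The endpoint address and frame format are assumptions; the real deployment runs over the dedicated dark-fiber link with professional AV-over-IP equipment.

```python
import socket
import struct
import time

# Hypothetical remote endpoint; not the real venue address.
VENUE_ADDR = ("remote-venue.example", 9000)


def pack_frame(timestamp: float, markers) -> bytes:
    """Serialize one motion-capture frame: a float64 timestamp followed by
    (x, y, z) float32 triplets for each tracked marker."""
    payload = struct.pack("<d", timestamp)
    for x, y, z in markers:
        payload += struct.pack("<3f", x, y, z)
    return payload


def stream_frames(get_frame, rate_hz: float = 120.0) -> None:
    """Push frames to the remote venue over UDP at the capture rate.
    UDP keeps latency low and tolerates occasional packet loss, which suits
    real-time avatar driving better than head-of-line blocking would."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / rate_hz
    while True:
        markers = get_frame()  # e.g. the latest frame from the mocap backend
        sock.sendto(pack_frame(time.time(), markers), VENUE_ADDR)
        time.sleep(period)
```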

Audio infrastructure

An immersive speaker system and a state-of-the-art sound processing backend allow for compelling auditory experiences. The entire lab is acoustically insulated, reducing sound reflections and distortions in order to deliver the most accurate auditory stimuli to the listeners.

Infrastructure:

  • 82 calibrated speakers (8 subs, 2 speaker rings and an overhead speaker set)
  • Barco IOSONO core (wavefield synthesis system)
  • Fully IP-based audio backend (Dante)
  • 10 amplifiers with built-in DSP for each individual channel
  • Unlimited XLR and Ethernet patch possibilities
  • Fully synced to a 48kHz clock
  • Available software/frameworks: Ableton Live, Max/MSP, Ambisonics, … (a first-order encoding sketch follows this list)
  • Direct connection to local student radio station Urgent
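
The list above mentions Ambisonics among the available frameworks. As a rough illustration of what such a workflow involves, the sketch below encodes a mono signal into first-order B-format (FuMa convention) from an azimuth and elevation. This is a minimal Python illustration, not the lab's actual rendering pipeline, and the test signal and angles are made up.

```python
import numpy as np


def encode_foa(mono: np.ndarray, azimuth_deg: float, elevation_deg: float) -> np.ndarray:
    """Encode a mono signal into first-order B-format (W, X, Y, Z), FuMa weighting.

    Azimuth is measured counter-clockwise from the front, elevation upwards.
    Returns an array of shape (4, n_samples).
    """
    az = np.deg2rad(azimuth_deg)
    el = np.deg2rad(elevation_deg)
    w = mono * (1.0 / np.sqrt(2.0))      # omnidirectional component (FuMa -3 dB)
    x = mono * np.cos(az) * np.cos(el)   # front-back
    y = mono * np.sin(az) * np.cos(el)   # left-right
    z = mono * np.sin(el)                # up-down
    return np.stack([w, x, y, z])


# Example: a 1 kHz tone placed 45 degrees to the left, slightly elevated.
fs = 48000
t = np.arange(fs) / fs
bformat = encode_foa(np.sin(2 * np.pi * 1000 * t), azimuth_deg=45, elevation_deg=10)
```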

Applications:

  • Accurate object-based projection of 3D audio in space using wavefield synthesis enables immersive auditory environments, bringing a myriad of application opportunities (such as recreation of multi-track performances, interactive sound objects that can be moved through space, …); a simplified rendering sketch follows this list.
  • (Re-)creation of real-life or simulated acoustic environments allows one to experience the acoustic properties of a different location (e.g. a concert hall, church, or outdoor setting) or to simulate the acoustic properties of future buildings or exhibitions.
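
The IOSONO core performs the actual wavefield-synthesis rendering; the sketch below only illustrates the basic intuition behind object-based projection: each loudspeaker is driven with a delay proportional to its distance from the virtual source and a distance-dependent gain. The speaker geometry, gain law and clamping are simplified assumptions for illustration only.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at room temperature


def point_source_driving(source_pos, speaker_positions, fs=48000):
    """Very simplified object-based rendering: per-speaker delay (in samples)
    and gain for a virtual point source. A real WFS processor such as the
    IOSONO core uses proper driving functions; this only conveys the idea."""
    source = np.asarray(source_pos, dtype=float)
    speakers = np.asarray(speaker_positions, dtype=float)   # shape (n, 3)
    distances = np.linalg.norm(speakers - source, axis=1)
    delays = np.round(distances / SPEED_OF_SOUND * fs).astype(int)  # samples
    gains = 1.0 / np.maximum(distances, 0.5)                # 1/r rolloff, clamped
    return delays, gains


# Example: a ring of 8 speakers with 4 m radius, source 2 m in front of center.
ring = [(4 * np.cos(a), 4 * np.sin(a), 1.8)
        for a in np.linspace(0, 2 * np.pi, 8, endpoint=False)]
delays, gains = point_source_driving((0.0, 2.0, 1.8), ring)
```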

Video and Mixed AR/VR infrastructure

Our installation caters to multi-user applications, with a focus on usability and freedom of movement. State-of-the-art virtual and augmented reality headsets are readily available in the lab. Furthermore, a high-resolution projection system is capable of delivering compelling group or single-user experiences.

Infrastructure:

  • 2x HTC Vive Pro Eye untethered VR headsets with built-in eye tracking
  • 2x HTC Vive Pro tethered VR headsets
  • Microsoft HoloLens AR glasses
  • 7m x 4m acoustically transparent projection screen
  • 12,000 lumen 4K projector with active stereo 3D and Extended Dynamic Range
  • High-end rendering compute power with the latest NVIDIA GPUs

Applications:

  • Delivery of immersive and interactive virtual and augmented reality experiences while maintaining the highest degree of free movement possible. The processing backend allows for real-time rendering and visualization of complex 6-degrees-of-freedom (6DoF) experiences.
  • Multi-user VR applications allow multiple users to interact naturally in a shared virtual environment; a minimal pose-sharing sketch follows this list.
  • High resolution, high dynamic range projection allows for immersive context creation without the use of a virtual reality headset.
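
A shared virtual environment requires each headset's pose to be distributed to the other clients. How ASIL handles this networking layer is not described here; the sketch below is only a minimal illustration of broadcasting a 6DoF head pose over UDP to a hypothetical relay on the lab network.

```python
import json
import socket
import time

RELAY = ("192.168.0.10", 7000)   # hypothetical pose relay on the lab network


def broadcast_pose(user_id: str, get_pose, rate_hz: float = 90.0) -> None:
    """Send this user's head pose (position + quaternion) to the relay so that
    other clients can place the corresponding avatar in the shared scene."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / rate_hz
    while True:
        position, rotation = get_pose()   # e.g. from the headset's tracking API
        packet = {
            "user": user_id,
            "t": time.time(),
            "pos": position,              # [x, y, z] in metres
            "rot": rotation,              # quaternion [x, y, z, w]
        }
        sock.sendto(json.dumps(packet).encode("utf-8"), RELAY)
        time.sleep(period)
```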

Motion capture

A large Qualisys motion capture setup allows full-body tracking of multiple users with an accuracy of < 1mm³, over an 81m² floor area and a configurable volume height of up to 5m. This allows detailed tracking of (multi-)user movement. Furthermore, real-time integration with both the audio and visualization systems allows for interactive audiovisual experiences driven by human movement; a minimal mapping sketch follows the applications list below.

Infrastructure:

  • Qualisys Oqus 7+ infrared cameras (14 fixed + 4 mobile units)
  • Qualisys Miqus video cameras (4 fixed + 1 mobile unit)
  • Synced with the ASIL 120Hz clock signal
  • High-end real-time processing backend allowing for real-time skeleton tracking and streaming
  • Tracking compatible with VR headsets

Applications:

  • Accurate measurements of human behavior (motion) in interactive scenarios.
  • Possibility of fine-grained motion measurements (e.g. finger tracking while playing an instrument, or facial expression tracking)
  • Real-time avatar creation in VR by mapping full skeleton data to digital avatars.
  • Accurate 6DOF tracking of objects
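
As an illustration of the real-time integration between motion capture and the audio system mentioned above, the sketch below forwards a tracked marker position to an audio engine (e.g. a Max/MSP patch) over OSC using the python-osc package. The OSC addresses, port and mapping are illustrative assumptions, not a fixed ASIL convention.

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical OSC endpoint of the audio engine (e.g. a Max/MSP patch).
client = SimpleUDPClient("127.0.0.1", 9000)


def on_mocap_frame(marker_xyz):
    """Map a tracked marker position (metres, lab coordinates) to sound-object
    parameters: x/y drive the virtual source position, height drives gain."""
    x, y, z = marker_xyz
    client.send_message("/source/1/position", [float(x), float(y)])
    client.send_message("/source/1/gain", min(max(z / 2.0, 0.0), 1.0))
```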

UX tracking sensors

The Art and Science Interaction Lab is the go-to facility for interaction and user-experience research. The infrastructure provides a wide variety of synchronized sensors capable of measuring different aspects of an experience and is backed by a strong team of analysts. This allows the detailed unraveling of user experiences and interactions.

Infrastructure:

  • 2x clinical-grade untethered EEG headsets (64 channels with active electrodes), which can be used in combination with our wireless VR headsets.
  • Eye trackers, both built into VR headsets and standalone.
  • Skin conductance sensors
  • Heart rate sensors
  • EMG sensors
  • Synced with the ASIL 120Hz clock signal (see the alignment sketch below)
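
The sensors are synchronized in hardware to the shared 120Hz clock, but for analysis the differently sampled streams still have to be put on one timeline. A minimal sketch, assuming simple linear interpolation onto the common 120Hz grid:

```python
import numpy as np

CLOCK_HZ = 120.0  # the shared ASIL clock rate


def align_to_clock(timestamps, samples, duration_s):
    """Resample an arbitrarily timed 1-D sensor stream onto the common 120 Hz
    timeline by linear interpolation, so streams recorded at different native
    rates (EEG, EMG, skin conductance, eye tracking) can be compared sample
    by sample."""
    grid = np.arange(0.0, duration_s, 1.0 / CLOCK_HZ)
    return grid, np.interp(grid, timestamps, samples)


# Example: a skin-conductance stream sampled at ~32 Hz, aligned to 120 Hz.
t_raw = np.arange(0, 10, 1 / 32)
raw = np.random.default_rng(0).normal(size=t_raw.size).cumsum()
grid, aligned = align_to_clock(t_raw, raw, duration_s=10)
```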

Applications:

  • Measuring physiological signals
  • Unraveling physiological processes within human interactions and experiences
  • Creating novel experiences with real-time biofeedback (see the sketch below)
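
Closing a biofeedback loop typically means deriving a feature from a physiological stream and feeding it back into the experience in real time. As a minimal sketch, the example below computes relative alpha-band power from a short EEG window and squashes it into a 0–1 control value; the sampling rate, band limits and scaling are assumptions, and the acquisition interface is left abstract since the lab's actual streaming backend is not specified here.

```python
import numpy as np
from scipy.signal import welch

FS = 500  # assumed EEG sampling rate in Hz


def alpha_feedback(eeg_window: np.ndarray) -> float:
    """Compute relative alpha-band (8-12 Hz) power for one EEG channel window
    and squash it into a 0..1 value usable as a real-time feedback parameter
    (e.g. controlling light intensity or reverberation in the virtual scene)."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=min(len(eeg_window), FS))
    alpha = psd[(freqs >= 8) & (freqs <= 12)].sum()
    total = psd[(freqs >= 1) & (freqs <= 40)].sum()
    ratio = alpha / total if total > 0 else 0.0
    return float(np.clip(ratio * 3.0, 0.0, 1.0))  # crude scaling into 0..1
```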

Selected posts related to this infrastructure:

  • ‘ExperienceDNA’ – we design the perfect user experience for your future innovations
    In an ever-digitizing society, we face a new wave of smart products. Interfaces are shifting, leading to a complex interplay of interactions people are not always aware of. To simulate these scenarios and automatically unravel the DNA of these new experiences, we developed a new tool called ‘ExperienceDNA’.  Discover everything you need to know in the ...
  • Funding granted for Interdisciplinary Research Project on improving musical group experiences in VR, using a biofeedback control system
    This project will improve the simulation of musical concert listening and ensemble performance experiences in VR. Innovation relies on the integration of improved computational methods for acoustic scene rendering, and improved empirical methods to measure subjective ‘presence’. The project will work towards a biofeedback system allowing controlled regulation of humans’ behavioural, cognitive and affective responses ...
  • Funding granted for imec.icon project ILLUMINATE
    ILLUMINATE – Interactive streaming and representation for totally immersive virtual reality applications Recent breakthroughs in capture and display technologies are leveraging highly immersive Virtual Reality (VR) applications. These emerging applications offer more Degrees-of-Freedom (DOF) to the users and thus make the experience much more immersive compared to traditional 2D visual content. This is largely pushed by ...
  • Best Poster Extended Abstract Award at 16th International Conference on Human-Computer Interaction 2014
    We are pleased to announce that we won the Best Poster Extended Abstract Award at the 16th International Conference on Human-Computer Interaction. The poster paper can be downloaded here: BilliARt – AR Carom Billiards, Exploration of an AR Framework. This paper presents a framework for processing visual and auditory textures in an augmented reality environment that enables ...
