Immersive Experiences

IDLab-MEDIA investigates the fundamental shortcomings of current immersive experiences. Research in this area covers the following aspects:

  • Representation and streaming of immersive scenes using both standardized streaming technology and innovative light representations;
  • Visual quality of immersive equipment;
  • The interaction between user behaviour and quality perception in interactive installations;
  • The Art & Science Interaction Lab, a modular research facility to bring, measure and test AR and VR experiences. The room is equipped with state-of-the-art audiovisual equipment and motion tracking, enabling us to (re-)create a multitude of AR and VR experiences.
  • 2021 IEEE Multimedia Prize Paper Award for SMoE Light Field Technology (July 9, 2021)
    We’re happy and honored to announce that our light field technology paper “Steered Mixture-of-Experts for Light Field Images and Video: Representation and Coding” has received the 2021 IEEE Multimedia Prize Paper Award. This is an annual award for one original paper in the field of multimedia published in IEEE Transactions on Multimedia ...
  • ‘ExperienceDNA’ – we design the perfect user experience for your future innovations (March 24, 2021)
    In an ever-digitizing society, we face a new wave of smart products. Interfaces are shifting, leading to a complex interplay of interactions that people are not always aware of. To simulate these scenarios and automatically unravel the DNA of these new experiences, we developed a new tool called ‘ExperienceDNA’. Discover everything you need to know in the ...
  • Funding granted for Interdisciplinary Research Project on improving musical group experiences in VR, using a biofeedback control system (June 18, 2020)
    This project will improve the simulation of musical concert listening and ensemble performance experiences in VR. Innovation relies on the integration of improved computational methods for acoustic scene rendering, and improved empirical methods to measure subjective ‘presence’. The project will work towards a biofeedback system allowing controlled regulation of humans’ behavioural, cognitive and affective responses ...
  • Paper accepted in MTAP: Random Access Prediction Structures for Light Field Video Coding with MV-HEVC (May 26, 2020)
    Our article with the title “Random access prediction structures for light field video coding with MV-HEVC” was published in Multimedia Tools and Applications. The article focuses on exploring prediction structures that can enable random access when compressing light field video with MV-HEVC. Light field video promises to deliver the required six degrees of freedom (6DoF) ...
  • PhD: Towards Image-Based Virtual Reality using Kernel-Based Representations (May 1, 2020)
    On April 28th, 2020, Ruben Verhack (IDLab-MEDIA) successfully defended his joint PhD in an entirely virtual setting. This joint PhD is a collaboration between the Communication Systems Lab at TU Berlin and IDLab at UGent. This research was performed under the supervision of Prof. Peter Lambert (UGent) and Prof. Thomas Sikora (TU Berlin): Steered Mixture-of-Experts for ...
  • PhD: Light Field Image and Video Coding for Immersive Media (March 18, 2020)
    Belgium declared a lockdown for the entire country as a result of the COVID-19 crisis on Wednesday (18 March), following the example of several European countries. A couple of hours later, Vasileios Avramelos from IDLab-MEDIA group was the first member of our faculty to publicly defend his PhD thesis in a 100% virtual online setting. ...
  • Paper accepted in IEEE Transactions on Multimedia: Representing and Coding Light Field Images and Video (October 14, 2019)
    Our paper, “Steered Mixture-of-Experts for Light Field Images and Video: Representation and Coding”, has been accepted in IEEE Transactions on Multimedia. Key observations: introduction of a novel representation method for any-dimensional image data, embedded in a strong Bayesian framework; multiple short conference papers have been presented on the subject, but no full paper was yet published; extra novelties ...
  • Article accepted for publication in the Journal of Real-Time Image Processing: Pixel-level parallel rendering for images and light fields (December 9, 2018)
    Our article with the title “Highly Parallel Steered Mixture-of-Experts Rendering at Pixel-level for Image and Light Field Data” was recently accepted for publication in the Journal of Real-Time Image Processing. In this article we describe our novel image approximation framework, namely Steered Mixture-of-Experts (SMoE), and its potential capabilities in coding and streaming higher dimensional image ...
  • Paper accepted at SPIE Optics + Photonics 2018: Light field video coding (September 1, 2018)
    We are pleased to announce that our paper “Steered Mixture-of-Experts for Light Field Video Coding” has been accepted. It will be published in the proceedings of SPIE Optical Engineering and Applications (Applications of Digital Image Processing XLI). The paper was presented at the SPIE Optics + Photonics conference, as part of the session on ...
  • Funding granted for imec.icon project ILLUMINATE (July 13, 2018)
    ILLUMINATE – Interactive streaming and representation for totally immersive virtual reality applications. Recent breakthroughs in capture and display technologies are enabling highly immersive Virtual Reality (VR) applications. These emerging applications offer more Degrees-of-Freedom (DOF) to the users and thus make the experience much more immersive than traditional 2D visual content. This is largely pushed by ...
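Several of the posts above build on Steered Mixture-of-Experts (SMoE), which models image data as a soft mixture of local linear regressors gated by Gaussian kernels. The following is a minimal one-dimensional sketch of that idea; the parameters (`mu`, `sigma`, `a`, `b`) are made up for illustration and are not taken from any of the papers.

```python
import numpy as np

# Hypothetical toy parameters for K = 2 one-dimensional "experts":
# each expert i has a Gaussian gating kernel (centre mu_i, width
# sigma_i) and a local linear model a_i * x + b_i.
mu    = np.array([0.25, 0.75])   # kernel centres in [0, 1]
sigma = np.array([0.20, 0.20])   # kernel widths
a     = np.array([1.0, -1.0])    # expert slopes
b     = np.array([0.0,  1.0])    # expert intercepts

def smoe_reconstruct(x):
    """Reconstruct the signal value at coordinate x as a softly
    gated mixture of the local linear experts."""
    # Unnormalised Gaussian gating response of each kernel at x.
    resp = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    # Soft gating: each expert's weight is its share of the total.
    w = resp / resp.sum()
    # Blend the experts' linear predictions with the gating weights.
    return float(np.sum(w * (a * x + b)))

# Near a kernel centre, that kernel's expert dominates the result.
print(smoe_reconstruct(0.25))  # ≈ 0.27, close to expert 0's prediction 0.25
```

Because each sample is reconstructed independently from the kernel parameters, this evaluation parallelises per pixel, which is the property exploited in the Journal of Real-Time Image Processing article above.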
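The MTAP article on random access prediction structures concerns which frames an MV-HEVC decoder must process before it can display a given light field view at a given time. As a toy illustration only (the simple IPPP-plus-inter-view structure below is an assumption for the sketch, not one of the paper's actual structures), consider:

```python
NUM_VIEWS = 3       # hypothetical: a small set of light field views
GOP_SIZE  = 4       # intra period: every 4th frame is a random access point

def references(view, t):
    """Return the (view, time) frames this frame predicts from.

    - At random access points (t % GOP_SIZE == 0) the base view is
      intra coded; dependent views predict only from the base view at
      the same instant, so no earlier frames are needed.
    - Elsewhere, every view predicts temporally from its own
      previous frame (IPPP style).
    """
    if t % GOP_SIZE == 0:
        return [] if view == 0 else [(0, t)]
    return [(view, t - 1)]

def decode_cost(view, t):
    """Count frames that must be decoded (transitively) to show (view, t)."""
    needed, stack = set(), [(view, t)]
    while stack:
        frame = stack.pop()
        if frame not in needed:
            needed.add(frame)
            stack.extend(references(*frame))
    return len(needed)

# Tuning in at a random access point is cheap; mid-GOP costs more.
print(decode_cost(2, 0))  # 2: the base-view intra frame plus this view's frame
print(decode_cost(2, 3))  # 5: must walk back to the random access point at t = 0
```

Shortening the intra period lowers the worst-case tune-in cost at the price of coding efficiency; exploring that trade-off for light field video is what the article investigates.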
