What will music creation and experience look like in the future? And what role will immersive technologies (XR, VR, 3D audio, virtual production, avatars, game engines, 3D scanning, etc.) play in that future? These are the central questions in a new research project by PXL-Music Research (PXL, Hasselt) and Digital Arts & Entertainment Research (Howest, Kortrijk).
Over the next three years, both research institutions will investigate these questions in close collaboration with stakeholders from the music and broader entertainment industries. Which tools help artists and their teams build a new production from A to Z on the canvas of 3D space? A few key concepts are central to this:
Artists and their ecosystem:
The project addresses the entire value chain of an artist’s career. This includes creative and performing musicians, labels, managers, bookers, publishers, venues, production partners, etc. It covers all sectors of the music industry (recorded, live, copyright). We’re also looking broadly at partners in the entertainment industry, such as art directors, stage and lighting designers, motion designers or 3D illustrators, technical suppliers, event organizers, and so on.
New production:
This term has a broad scope. If we assume that immersive audio will become the standard in the coming years, how should new releases be prepared across all phases—from composition to recording, mastering, and distribution? How should live shows be designed to offer audiences the extra experience that creates a memorable evening? How can marketing campaigns use immersive technologies like XR, VR, AR, AI, and 3D scanning to enhance fan engagement, generate more streams, and sell more tickets?
3D space:
This concept is crucial to the research project. We start from the belief that music creation and experience in the future will inevitably happen in 3D. First, our auditory perception of sound is inherently three-dimensional: in everyday listening we localize sources through interaural time and level differences and spectral cues arriving from every direction. The way we’ve grown used to listening to music (mainly left-right in stereo) is not natural, because it reduces all of that to level panning between two loudspeakers. An immersive recording and listening experience aligns much better with how our hearing naturally works.
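To make one of those cues concrete, here is a rough illustration rather than a project tool: the classic Woodworth spherical-head approximation of the interaural time difference (ITD) a listener receives for a source at a given azimuth. The head radius and speed of sound are assumed average values, not parameters from the research.

```python
# Minimal sketch: Woodworth spherical-head approximation of the interaural
# time difference (ITD) for a far-field source at a given azimuth.
# Constants are assumed averages, purely for illustration.
import math

HEAD_RADIUS_M = 0.0875     # average adult head radius (assumption)
SPEED_OF_SOUND_MS = 343.0  # speed of sound in air at roughly 20 degrees C

def itd_seconds(azimuth_deg: float) -> float:
    """Estimated ITD for a source at azimuth_deg (0 = straight ahead, 90 = hard right)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND_MS) * (theta + math.sin(theta))

if __name__ == "__main__":
    for azimuth in (0, 30, 60, 90):
        print(f"azimuth {azimuth:2d} deg -> ITD ~ {itd_seconds(azimuth) * 1e6:.0f} microseconds")
```

For a source straight ahead the difference is zero; at 90 degrees it reaches roughly 0.65 milliseconds. Binaural and object-based formats try to preserve cues of this kind, whereas plain level panning between two loudspeakers discards them.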
Our visual perception of our surroundings is likewise inherently 3D. So why is this so rarely reflected in the design of live shows? Visual support at concerts too often remains limited to ‘flat’ 2D visuals that are pre-programmed and don’t ‘interact’ with the music.
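One way to make visuals ‘interact’ with the music, offered purely as an illustration and not as a description of the project’s tooling, is an audio-reactive mapping: analyze the live signal block by block and turn it into control values for lights, LED walls, or game-engine visuals. In the sketch below, the block size, frequency band, and normalization are arbitrary assumptions.

```python
# Minimal sketch: map the bass energy of each incoming audio block to a
# 0-1 intensity value that a lighting rig or game-engine shader could read.
# Block size, band edges, and smoothing are illustrative assumptions.
import numpy as np

SAMPLE_RATE = 48_000
BLOCK_SIZE = 1024          # about 21 ms of audio per update
BASS_BAND = (40.0, 150.0)  # Hz, roughly kick-drum territory

def bass_intensity(block: np.ndarray, previous: float, smoothing: float = 0.8) -> float:
    """Return a smoothed 0-1 intensity based on the relative bass energy of one block."""
    spectrum = np.abs(np.fft.rfft(block * np.hanning(len(block))))
    freqs = np.fft.rfftfreq(len(block), d=1.0 / SAMPLE_RATE)
    band = (freqs >= BASS_BAND[0]) & (freqs <= BASS_BAND[1])
    energy = spectrum[band].mean() / (spectrum.mean() + 1e-9)  # relative bass weight
    raw = float(np.clip(energy / 4.0, 0.0, 1.0))               # crude normalization
    return smoothing * previous + (1.0 - smoothing) * raw      # smooth to avoid flicker

if __name__ == "__main__":
    # Fake input: a 60 Hz pulse standing in for a kick drum, plus a quiet block.
    t = np.arange(BLOCK_SIZE) / SAMPLE_RATE
    quiet = 0.01 * np.random.randn(BLOCK_SIZE)
    kick = np.sin(2 * np.pi * 60 * t) + quiet
    level = 0.0
    for name, block in (("quiet", quiet), ("kick", kick), ("kick", kick)):
        level = bass_intensity(block, level)
        print(f"{name:>5}: intensity = {level:.2f}")
```

In a live setting the blocks would come from the mixing console or a microphone feed, and the resulting intensity could drive brightness, particle density, or camera movement in real time.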
Immersive (3D) technology can also play a crucial role in artist-fan relationships, allowing artists to be closer to their fans (or vice versa) than ever before—without adding pressure to the artist’s schedule or private life.
Based on these insights, we plan to conduct research over the next three years and build use cases around various immersive technologies. This builds on the findings from the TETRA research project “Virtual Music Experiences” that ran from 2022 to 2024. That project developed several applications offering an initial glimpse into the wide range of possibilities immersive technologies bring to the music industry.
The applications ranged from an AR experience based on a song by the post-metal band Psychonaut, to XR controllers for live performances, to full visual and acoustic digital twins of concert halls and of live shows with immersive audio and visuals. These experiments gave us valuable insights into both the applicability and complexity of these technologies, and into the music industry’s interest in and willingness to adopt them.
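The internals of those prototypes aren’t detailed here, but a common, generic pattern behind this kind of live XR control is bridging controller data to show software over OSC (Open Sound Control). The sketch below uses the python-osc package; the host, port, OSC addresses, and parameter mappings are hypothetical examples, not values from the project.

```python
# Minimal sketch of an XR-controller-to-show bridge: forward controller
# readings as OSC messages that a DAW, media server, or lighting desk can
# map to parameters. Host, port, and addresses are hypothetical examples.
from pythonosc.udp_client import SimpleUDPClient

OSC_HOST = "192.168.1.50"   # machine running the audio/visual software (assumption)
OSC_PORT = 9000             # its OSC listening port (assumption)

client = SimpleUDPClient(OSC_HOST, OSC_PORT)

def send_controller_state(height: float, twist: float) -> None:
    """Map two normalized (0-1) controller readings to show parameters."""
    client.send_message("/show/reverb/mix", height)  # raise hand -> more reverb
    client.send_message("/show/light/hue", twist)    # rotate wrist -> color shift

if __name__ == "__main__":
    # Stand-in values; in practice these would stream from the XR runtime.
    send_controller_state(height=0.7, twist=0.25)
```

In a real setup the readings would stream continuously from the XR runtime, and the receiving software decides what each address controls, which keeps the performer’s gestures decoupled from any one audio or lighting product.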