XR Music Toy

Extended Reality (XR) is rapidly emerging as a transformative technology, gaining traction across industries—from entertainment to education. With major players like Apple investing heavily in XR hardware, such as the Apple Vision Pro, the potential for immersive experiences continues to grow.

At the VMX Project, we asked ourselves:
“What role can XR play in live music shows?”

To explore that question, we launched the XR Music Toy project—a research initiative focused on experimenting with XR in the context of musical performance. Our goal? To prototype small, creative ideas that merge music and extended reality in compelling ways.


Breaking Down the XR Music Toy

The XR Music Toy is built around three core components:

1. Unity-based XR Application

At the heart of the system is an XR application developed in Unity, one of the most accessible and widely used game engines for XR development. Unity allows for rapid prototyping, making it ideal for our experimental approach.

The application runs directly on an XR headset, enabling interactive and immersive experiences in real time.

2. OSC Communication Protocol

To enable communication between the XR headset and external devices, we use Open Sound Control (OSC)—a protocol known for its low latency, high precision, and flexibility. OSC is already popular in professional lighting and audio installations, making it a natural fit for XR-enhanced music performances.
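
To make this concrete, here is a minimal sketch of what sending headset state over OSC can look like. Python and the python-osc library stand in for the Unity application here purely for illustration; the addresses and port are hypothetical:

    # Minimal OSC sender sketch; the project's real sender is the Unity app.
    # Requires: pip install python-osc
    from pythonosc.udp_client import SimpleUDPClient

    # Hypothetical port; match whatever the receiving side expects.
    client = SimpleUDPClient("127.0.0.1", 9000)

    # Report which virtual block the performer is looking at (id 2 here).
    client.send_message("/xr/gaze/target", 2)

    # Stream a continuous control value, e.g. normalized head tilt.
    client.send_message("/xr/head/tilt", 0.37)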

3. Audio Processing with PlugData and Reaper

Audio is processed using PlugData, a visual audio programming environment, alongside Reaper, a digital audio workstation (DAW). This setup allows us to combine real-time data from physical instruments with OSC signals from the XR system to create dynamic and expressive audio effects.
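
PlugData can receive OSC messages directly, so no extra glue code is needed in practice. Purely to illustrate the kind of mapping involved, the sketch below (Python again, with the same hypothetical address) scales an incoming normalized XR value to a filter cutoff in Hz:

    # Illustrative receiving/mapping sketch; PlugData receives OSC natively,
    # so Python and the address below are stand-ins for illustration only.
    # Requires: pip install python-osc
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_tilt(address, value):
        # Scale a normalized 0..1 XR value to a filter cutoff in Hz.
        cutoff_hz = 200 + value * (8000 - 200)
        print(f"{address} -> cutoff {cutoff_hz:.0f} Hz")

    dispatcher = Dispatcher()
    dispatcher.map("/xr/head/tilt", on_tilt)

    # Listen on the same hypothetical port the headset sends to.
    BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()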


Blending the Physical and Virtual Worlds

XR enables a seamless blend between the real and virtual worlds. Using technologies like eye tracking, hand tracking, and spatial mapping, performers can interact with virtual elements in intuitive and expressive ways.

Example 1: Eye-Controlled Sampling

In this prototype, eye tracking is used to detect where the performer is looking. When the user looks at a virtual block and strums their guitar, a sample is triggered. This creates a subtle yet powerful interaction between physical action and virtual response.
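
The triggering logic can be sketched as follows, assuming the headset reports the gazed-at block and the audio side reports strum onsets over OSC. The addresses, ports, and the -1 convention for "no target" are all hypothetical:

    # Gaze + strum -> sample trigger; all addresses and ports are hypothetical.
    # Requires: pip install python-osc
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer
    from pythonosc.udp_client import SimpleUDPClient

    sampler = SimpleUDPClient("127.0.0.1", 9001)  # hypothetical sampler port
    gaze_target = None  # id of the block currently looked at, or None

    def on_gaze(address, target_id):
        # Assume the headset reports -1 when no block is in the performer's gaze.
        global gaze_target
        gaze_target = target_id if target_id >= 0 else None

    def on_strum(address, velocity):
        # Fire a sample only if the performer is looking at a block while strumming.
        if gaze_target is not None:
            sampler.send_message("/sampler/play", [gaze_target, velocity])

    dispatcher = Dispatcher()
    dispatcher.map("/xr/gaze/target", on_gaze)
    dispatcher.map("/audio/strum", on_strum)
    BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()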

Example 2: Immersive Visual Worlds

Here, the virtual world becomes a canvas. Using Unity, we can render imaginative visuals—lasers, planets, floating objects, and more—that respond to music or user interaction. This unlocks limitless creative potential for live shows and installations.
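
One simple way to make such visuals music-reactive is to compute a loudness envelope on the audio side and forward it over OSC for Unity to map onto a visual parameter. The sketch below illustrates the idea with a hypothetical address for a laser's brightness:

    # Loudness -> visual parameter; addresses and the mapping are hypothetical.
    # Requires: pip install python-osc
    import math
    from pythonosc.udp_client import SimpleUDPClient

    visuals = SimpleUDPClient("127.0.0.1", 9000)  # the Unity app listens here

    def send_level(samples):
        # Root-mean-square of one block of audio samples as a rough loudness value.
        rms = math.sqrt(sum(s * s for s in samples) / len(samples))
        visuals.send_message("/visuals/laser/brightness", rms)

    # Example: a quiet block followed by a loud one.
    send_level([0.01, -0.02, 0.015, -0.01])
    send_level([0.6, -0.7, 0.65, -0.5])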

Example 3: Fully Virtual Instruments

In this case, no physical instrument is needed at all. The instrument itself exists entirely in XR, and the user interacts with it through hand gestures or gaze. It opens up new paradigms for musical expression where hardware constraints disappear entirely.
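
As a sketch of what such an instrument's control logic might look like: hand-tracking coordinates from the headset are quantized to notes of a scale, so playing means moving a hand through virtual space. The scale, value ranges, and OSC address here are hypothetical:

    # Hand position -> note events; scale, ranges, and addresses are hypothetical.
    # Requires: pip install python-osc
    from pythonosc.udp_client import SimpleUDPClient

    synth = SimpleUDPClient("127.0.0.1", 9001)
    C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # one octave, as MIDI note numbers

    def on_hand_move(x, y):
        # x in 0..1 picks a scale degree; y in 0..1 becomes a velocity of 0..127.
        idx = min(int(x * len(C_MAJOR)), len(C_MAJOR) - 1)
        velocity = int(max(0.0, min(y, 1.0)) * 127)
        synth.send_message("/synth/note", [C_MAJOR[idx], velocity])

    on_hand_move(0.4, 0.9)  # plays F4 (MIDI 65) at high velocity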
