This experience explores the idea of natural music creation in XR through the use of hand tracking and physics-based interactions.
The ultimate goal is to create a product that seamlessly integrates musical instruments with little to no learning curve. By doing away with controllers, anyone can instinctively play melodies and explore sound design through various knobs and effects they can control directly with their hands.
This demo shows the project in its current state. One of the most recent things I've been exploring is the integration of interactive shaders and visual effects, which give rewarding visual feedback as another layer to the interaction.
The initial concept was to create a digital clone of one of my physical synthesizers in XR. Many analog synths can be fairly expensive, so I thought it would be great to be able to achieve a similar experience and sound in XR that could be more accessible for more people.
One of the best parts of a physical synth is its tactile nature. Because of this, I knew I wanted to explore hand tracking to get as close to the real thing as possible.
To make the synth work with physics-based interactions, the first step was creating a few keys in Unity using rigidbodies, colliders, and joints to get the behavior I wanted.
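As a rough illustration of that first step, here is a minimal sketch of what one physics-driven key might look like. The component name, spring values, and press threshold are all placeholders I'm assuming for the example, not the project's actual settings:

```csharp
using UnityEngine;

// Hypothetical sketch of a single physics-based key: a Rigidbody constrained
// by a ConfigurableJoint so it can only travel a short distance vertically,
// with a spring drive that returns it to rest after release.
[RequireComponent(typeof(Rigidbody), typeof(BoxCollider))]
public class SynthKey : MonoBehaviour
{
    public AudioSource voice;             // the sample for this key's note
    public float pressThreshold = 0.005f; // downward travel (m) that counts as a press

    Vector3 restPosition;
    bool pressed;

    void Start()
    {
        restPosition = transform.localPosition;

        // Lock every axis except vertical travel so a finger can only push the key down.
        var joint = gameObject.AddComponent<ConfigurableJoint>();
        joint.xMotion = ConfigurableJointMotion.Locked;
        joint.zMotion = ConfigurableJointMotion.Locked;
        joint.yMotion = ConfigurableJointMotion.Limited;
        joint.angularXMotion = ConfigurableJointMotion.Locked;
        joint.angularYMotion = ConfigurableJointMotion.Locked;
        joint.angularZMotion = ConfigurableJointMotion.Locked;
        joint.linearLimit = new SoftJointLimit { limit = 0.01f };

        // A spring drive pushes the key back up to its rest height.
        joint.yDrive = new JointDrive { positionSpring = 500f, positionDamper = 10f, maximumForce = 100f };
    }

    void FixedUpdate()
    {
        float travel = restPosition.y - transform.localPosition.y;
        if (!pressed && travel > pressThreshold)
        {
            pressed = true;
            voice.Play(); // note on
        }
        else if (pressed && travel < pressThreshold * 0.5f)
        {
            pressed = false; // hysteresis avoids retriggering from physics jitter
        }
    }
}
```

The hysteresis on release is one way to keep small physics oscillations from double-triggering a note.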
From there, I worked on creating knobs that could be mapped to essential controls such as the low pass filter, envelopes, and effects through a custom C# script.
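A knob script along these lines could read the knob's rotation and write it to an exposed mixer parameter. This is a sketch under assumptions: the parameter name, frequency range, and rotation range are illustrative, and it assumes the parameter has been exposed on the AudioMixer:

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Hypothetical knob: maps the knob's local Z rotation onto an exposed
// AudioMixer parameter, such as a low-pass filter cutoff.
public class SynthKnob : MonoBehaviour
{
    public AudioMixer mixer;
    public string exposedParam = "LowPassCutoff"; // must be exposed on the mixer
    public float minValue = 200f;                 // Hz at full counter-clockwise
    public float maxValue = 20000f;               // Hz at full clockwise
    public float maxAngle = 270f;                 // usable rotation range in degrees

    void Update()
    {
        // Convert the knob's local Z rotation into a 0..1 position.
        float angle = Mathf.Min(transform.localEulerAngles.z, maxAngle);
        float t = angle / maxAngle;

        // An exponential mapping feels more natural for filter cutoffs than a linear one.
        float value = minValue * Mathf.Pow(maxValue / minValue, t);
        mixer.SetFloat(exposedParam, value);
    }
}
```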
With the basic building blocks in place, I assembled a whole octave of the keyboard so I could test how well the hand tracking was working to play different notes.
Beyond the keyboard and knobs, the other critical component of my prototype was creating a signal flow that would route my audio appropriately and give the user the same level of control over the sound design that one would expect from any synthesizer.
This involved a combination of Unity's built-in audio mixer and effects with some of the more specific functionality, such as envelopes, incorporated through my own scripts.
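An amplitude envelope is one example of functionality Unity's mixer doesn't provide directly. A minimal sketch of a scripted ADSR envelope might look like the following; the stage timings are placeholders, and for simplicity this version always releases from the sustain level:

```csharp
using UnityEngine;

// Minimal sketch of an attack/decay/sustain/release (ADSR) amplitude
// envelope driven from Update(), scaling an AudioSource's volume.
public class AmpEnvelope : MonoBehaviour
{
    public AudioSource voice;
    public float attack = 0.01f, decay = 0.1f, sustain = 0.7f, release = 0.3f;

    enum Stage { Idle, Attack, Decay, Sustain, Release }
    Stage stage = Stage.Idle;
    float level, stageTime;

    public void NoteOn()  { stage = Stage.Attack;  stageTime = 0f; voice.Play(); }
    public void NoteOff() { stage = Stage.Release; stageTime = 0f; }

    void Update()
    {
        stageTime += Time.deltaTime;
        switch (stage)
        {
            case Stage.Attack:  // ramp 0 -> 1
                level = Mathf.Clamp01(stageTime / attack);
                if (level >= 1f) { stage = Stage.Decay; stageTime = 0f; }
                break;
            case Stage.Decay:   // fall 1 -> sustain (Lerp clamps t, so level holds at sustain)
                level = Mathf.Lerp(1f, sustain, stageTime / decay);
                if (stageTime >= decay) stage = Stage.Sustain;
                break;
            case Stage.Release: // fall sustain -> 0, then stop the voice
                level = Mathf.Lerp(sustain, 0f, stageTime / release);
                if (stageTime >= release) { stage = Stage.Idle; voice.Stop(); }
                break;
        }
        voice.volume = level;
    }
}
```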
To simplify the process, I decided to use looped samples instead of true oscillators, kept the synth monophonic, and limited it to a single oscillator (a saw wave).
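The looped-sample approach boils down to repitching one recording per key. The sketch below shows the idea, assuming a saw-wave loop recorded at the root note; the standard equal-temperament formula scales pitch by 2^(semitones/12):

```csharp
using UnityEngine;

// Sketch of a mono, looped-sample voice: one saw-wave loop is repitched per
// key via AudioSource.pitch, and mono behavior comes from cutting off the
// previous note before starting the new one. Illustrative only.
public class MonoSampleVoice : MonoBehaviour
{
    public AudioSource voice; // looping saw-wave sample recorded at the root note

    public void PlayNote(int semitonesFromRoot)
    {
        voice.Stop();                                         // mono: last note wins
        voice.pitch = Mathf.Pow(2f, semitonesFromRoot / 12f); // equal-temperament repitch
        voice.Play();
    }
}
```

The trade-off is that repitching a sample also stretches its timbre, which is part of why true oscillators would be an improvement later.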
To make the experience more enjoyable, I rescaled the keyboard, which greatly reduced the number of errors made while playing it. The rescaled synth felt more like hitting a set of drums than playing a keyboard, but it was much easier to play and, as a result, much more fun.
Although it was now much easier to hit the correct keys, another usability issue I was running into involved the overall physics setup. I was getting some strange behaviors and unwanted interactions between my hands, the table, and the keys. However, through some additional testing, I was able to reconfigure the colliders in Unity and get back to a setup that behaved as expected.
With the major usability issues out of the way, I was able to finally build a full setup that I could easily play. For the first time, I could hit all the notes I intended to, and the synth started to sound much more musical as a result.
To refine the prototype further, I focused on creating a more inspiring environment that could respond visually to the audio and the physical interactions with the synth. One of the main parts of this was a shader and particle effect that made the keys glow when hit. I used other shaders to complete the look of the synth and environment in a similar style.
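The key-glow feedback could be driven by a small script like this one. It assumes a material whose shader has an "_EmissionColor" property (as in Unity's Standard shader); the color and fade time are placeholders:

```csharp
using UnityEngine;

// Hypothetical glow feedback: when a key is struck, boost the material's
// emission and fade it back to black over a short time.
public class KeyGlow : MonoBehaviour
{
    public Color glowColor = Color.cyan;
    public float fadeTime = 0.5f;

    Material mat;
    float t = 1f; // 0 = just hit (full glow), 1 = fully faded

    void Start()
    {
        mat = GetComponent<Renderer>().material;
        mat.EnableKeyword("_EMISSION");
    }

    public void Flash() { t = 0f; } // call this when the key is pressed

    void Update()
    {
        t = Mathf.Min(1f, t + Time.deltaTime / fadeTime);
        mat.SetColor("_EmissionColor", Color.Lerp(glowColor, Color.black, t));
    }
}
```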
As a final touch, I added the ability to play a drum loop in the background that creates a rhythm to jam to and serves as the backbone for a song. In the future, I'd love to integrate a sequencer that can make this part of the experience more customizable and interactive.
In testing the build shown above with a few other people, it's clear there are still improvements to be made. For a larger person like me (I'm 6' 3"), all notes are within reach, but for shorter people, it's more difficult. This is also approaching the limit of the range in which hand tracking works.
On top of that, there's a noticeable amount of latency that affects how well a user can stay in time with the beat, and there are a number of improvements I could make to the sound as well.
Although there are still many improvements that can be made, I've learned a lot throughout the testing process so far, and I'm excited to apply those learnings towards the next iteration.