Interactions for VR Music Instruments
This article was originally published on my personal website.
First, I have to admit that I do have a basic grounding in music: I finished a seven-year piano course. I've also handled guitars and a ukulele, but never played a full song on either. Yes, I'm ashamed of that. Anyway, I consider knowing the order of the notes a good foundation for building VR music instruments.
My main goal was to explore hand interactions. I also wanted to see how immediate, pleasurable sound feedback affected the experience as a whole.
To keep the whole process efficient, I had to sacrifice something. I knew I would never release or share this Unity project, so the code and the setup only had to be good enough for testing and fast iteration. I also decided not to make any custom models and to use only Unity's default 3D objects for all the scenes.
With these limitations in mind, I started building my first test: a set of shapes arranged like piano keys that made sounds once I touched them. I started with basic cylinders as placeholders, but once I found a bell sound on this website, I knew that cylinders made of some magical metal would produce exactly such sounds.
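I never shared the project, but the core of each key was trivial: a trigger collider plus an AudioSource holding the bell sample. A minimal sketch of the idea (the class and names here are illustrative, not taken from the actual project):

```csharp
using UnityEngine;

// Attach to a cylinder whose collider is marked "Is Trigger" and
// whose AudioSource holds the bell sample. The finger (or hand
// anchor) needs a collider and a Rigidbody so Unity raises the
// trigger event.
[RequireComponent(typeof(AudioSource))]
public class BellKey : MonoBehaviour
{
    AudioSource bell;

    void Awake()
    {
        bell = GetComponent<AudioSource>();
    }

    void OnTriggerEnter(Collider other)
    {
        // PlayOneShot lets quick repeated touches overlap naturally.
        bell.PlayOneShot(bell.clip);
    }
}
```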
After a few tests and some troubleshooting, I had the first prototype. Usually, I don't play my experiences much: I test, find things to improve, and go back to Unity. This time was slightly different. I spent quite a lot of time learning simple piano songs from YouTube and trying to play them on my virtual instrument.
The first takeaway was about the precision of finger tracking. I hadn't expected to place interactive elements as close together as guitar strings, but even real piano-key sizes produced a lot of misses, so I rejected the idea of making the instrument compact and stopped worrying about scale.
Lighting up an object on touch feels quite cool and natural. For me, such details are like the hover effect on clickable elements in a flat UI: something you don't explicitly notice, but once the feedback is missing, you feel that something is off.
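In Unity, this kind of touch feedback is just a few lines on top of the same trigger events; a rough sketch using the Standard shader's emission (again, illustrative names):

```csharp
using UnityEngine;

// Glow-on-touch "hover" feedback: brighten the Standard shader's
// emission while a finger is inside the trigger, dim it on exit.
public class TouchGlow : MonoBehaviour
{
    public Color glowColor = new Color(1f, 0.9f, 0.5f);
    Material mat;

    void Awake()
    {
        mat = GetComponent<Renderer>().material; // per-object instance, not shared
        mat.EnableKeyword("_EMISSION");
    }

    void OnTriggerEnter(Collider other)
    {
        mat.SetColor("_EmissionColor", glowColor);
    }

    void OnTriggerExit(Collider other)
    {
        mat.SetColor("_EmissionColor", Color.black);
    }
}
```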
Adding physics and making the cylinders hang made the experience more realistic. Swinging is simple, expected behaviour for such objects, and it only reinforces the whole musical experience.
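A hanging cylinder is essentially a Rigidbody plus a hinge joint attached to the world; something along these lines (a sketch, with made-up tuning values):

```csharp
using UnityEngine;

// Hang a cylinder so it swings when touched. The cylinder needs
// a Rigidbody; with no connectedBody, the hinge attaches to the
// world at the anchor's starting position.
[RequireComponent(typeof(Rigidbody))]
public class HangingCylinder : MonoBehaviour
{
    void Start()
    {
        var joint = gameObject.AddComponent<HingeJoint>();
        joint.anchor = new Vector3(0f, 1f, 0f);  // top of the cylinder, local space
        joint.axis = Vector3.forward;            // plane the cylinder swings in
        joint.useSpring = true;                  // settle back to rest after a touch
        joint.spring = new JointSpring { spring = 10f, damper = 1f, targetPosition = 0f };
    }
}
```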
To add a little bit of magic, I added some shiny particles floating around. They didn't change anything fundamental, but they definitely gave this simple experience a more finished look.
My main takeaway was:
It’s so much fun to play with!
My next idea was to test more interesting interactions. Using physics and joints in Unity, I attempted to replicate guitar strings. After some unsuccessful tries, I ended up with a version that felt fine.
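The version that felt fine was essentially a chain of small rigidbodies linked by spring joints and pinned at both ends. This isn't the actual project code, but the approach looks roughly like this:

```csharp
using UnityEngine;

// A rough "string": a row of small rigidbodies linked by spring
// joints, with both ends pinned. Pushing a middle segment with a
// finger makes the whole chain wobble like a plucked string.
public class PhysicsString : MonoBehaviour
{
    public int segments = 12;
    public float spacing = 0.05f;
    public float spring = 500f;
    public float damper = 5f;

    void Start()
    {
        Rigidbody previous = null;
        for (int i = 0; i < segments; i++)
        {
            var seg = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            seg.transform.localScale = Vector3.one * 0.02f;
            seg.transform.position = transform.position + Vector3.right * spacing * i;

            var rb = seg.AddComponent<Rigidbody>();
            rb.useGravity = false;
            // Pin the first and last segments so the string has fixed ends.
            rb.isKinematic = (i == 0 || i == segments - 1);

            if (previous != null)
            {
                var joint = seg.AddComponent<SpringJoint>();
                joint.connectedBody = previous;
                joint.spring = spring;
                joint.damper = damper;
            }
            previous = rb;
        }
    }
}
```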
To me, it looked like drops of water on a spider's web. I couldn't resist and quickly restyled the scene as a night forest with moonlight. Such basic changes improved the whole experience and proved that adding even a simple story is powerful.
I tried to make the active component as large as was logically possible, but I still had minor tracking issues. It was still so much fun.
I also added the tricks that have an immediate effect on users: lights and particles make the experience feel more natural.
I liked how changing the environment changed the whole experience of the music, so I decided to play with it a bit more and build a truly epic instrument. For this one, I modelled a huge cave out of cubes in Unity, used sounds suited to such a scale, and played a little with the lighting effects.
The scene was still basic but so much fun. I spent so much time learning different songs that my shoulders hurt. At that moment, I realized that the ergonomics of my instruments really suffered, and that there is a good reason why real instruments are shaped the way they are.
I liked the epicness of the scene so much that I wanted to push it even further. I decided to build what I called the mega bass: something like drums, but really, really huge. When the user presses small levers, huge hands or sticks beat enormous drums down in the valley. I also planned to add something like an elevator that would take the user up high to see all the drums from above.
After a few experiments, I realized that this idea was contradictory at its core. To convey the epicness of the drums and levers and show their scale, they had to behave like large objects: a huge hand has to travel a much longer distance than a small one to strike a drum. I already had a minimal gap between the touch and the produced sound, though I have to admit it's noticeable only on video. Adding extra seconds of delay on top of that simply ruined the whole experience.
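The conflict is easy to see in code. If the lever has to start a visible travel animation before the hit, the sound can only come after the travel time, so a sketch of the lever handler (hypothetical names, not the real project) bakes a ~1.5-second delay right into the interaction:

```csharp
using System.Collections;
using UnityEngine;

// Why the mega bass couldn't work: the sound has to wait for the
// giant hand's travel animation, so the user's touch and the hit
// are separated by the whole travel time.
public class DrumLever : MonoBehaviour
{
    public Transform giantHand;
    public Transform drum;
    public AudioSource drumSound;
    public float travelTime = 1.5f;  // looks epic, feels broken

    void OnTriggerEnter(Collider other)
    {
        StartCoroutine(SwingAndHit());
    }

    IEnumerator SwingAndHit()
    {
        Vector3 start = giantHand.position;
        for (float t = 0f; t < travelTime; t += Time.deltaTime)
        {
            giantHand.position = Vector3.Lerp(start, drum.position, t / travelTime);
            yield return null;
        }
        // The sound fires ~1.5 s after the touch: immediate feedback is gone.
        drumSound.Play();
    }
}
```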
After hitting this dead end, I decided to go in the opposite direction. What if, instead of being far away and large, the drums were small and literally at one's fingertips?
It was definitely the most straightforward scene, but probably the most fun to play. Unfortunately, I have no clue about drumming and therefore wasn't able to play anything recognizable.
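Mechanically, it's the piano scene in miniature: the only new part is giving each tracked fingertip a tiny collider so it can strike the small pads. A sketch, assuming your hand-tracking SDK exposes fingertip transforms (on Quest, for example, via the OVRSkeleton bones):

```csharp
using UnityEngine;

// Give each tracked fingertip a tiny collider and a kinematic
// rigidbody so it can strike the small drum pads floating in
// front of the user (each pad plays its sample on trigger enter,
// like the BellKey sketch above).
public class FingertipStick : MonoBehaviour
{
    public Transform[] fingertips;  // filled from the hand-tracking rig

    void Start()
    {
        foreach (var tip in fingertips)
        {
            var col = tip.gameObject.AddComponent<SphereCollider>();
            col.radius = 0.008f;     // roughly fingertip-sized

            var rb = tip.gameObject.AddComponent<Rigidbody>();
            rb.isKinematic = true;   // follows tracking, ignores physics forces
        }
    }
}
```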
You would expect that at least such an experience would be ergonomically perfect. Right?
It’s really not.
I spent a lot of time jamming. I learned to make each gesture as clean as possible so as not to confuse hand recognition. Also, for the best tracking, I had to keep my palms facing me. That's not a big deal on its own, but holding the position for prolonged periods becomes uncomfortable. Eventually, I rested my wrists on the table, but I still felt the discomfort of twisted wrists after long jam sessions 😅
So, to recap my takeaways:
1. Ergonomics are important
2. Instant feedback is mandatory
3. Music in VR is fun
4. The environment is powerful