VR music series 3: Trois Machins de la Grâce Aimante by Rob Hamilton

By sandris murins · Published in 25 composers · 19 min read · Apr 12, 2021

Read my interview with Rob Hamilton on his piece Trois Machins de la Grâce Aimante. The piece enables string quartet musicians to use VR headsets to play Coretet, a virtual-reality instrument that translates traditional string gestures into a digital interface. Rob Hamilton, a composer and researcher, uses Coretet to create innovative compositions for virtual-reality double bass, exploring the convergence of sound, music and interaction in his creative work. The text version of the interview was created by Estere Bundzēna.

Can you briefly summarise your piece?

The piece Trois Machins de la Grâce Aimante is a virtual reality string quartet. In many ways, it is like a traditional string quartet and not at all like a traditional string quartet. The piece is performed and displayed in virtual reality with each of the performers wearing a virtual reality head-mounted display and using Oculus Touch game controllers to control VR instruments. These are all versions of an instrument called Coretet, which is a VR instrument system and networking platform that I built specifically for this piece. The instrument itself has been used for other performances and pieces as well.

What is the background of your piece?

For a long time, nearly 20 years, I have been focused on this idea of using advanced gaming technologies in the realm of electro-acoustic and electronic music composition and instrument design. I have been very interested in how to harness the power of these fluid and graphically immersive enactive technology systems to control and create music. I did a series of pieces dating back to around 2005–2008 using more standard game engines, hacking existing game technologies like the Quake III engine, the Unreal Development Kit and the Unreal Engine to create immersive game worlds that were inherently musical.

This piece itself was an attempt to get away from an idea in electronic music that a lot of us are very comfortable with: when using electronics, we can create any kind of sound in any way, with any kind of gesture or control system. On the one hand, it is fantastic and beautiful, and it allows us to be extremely creative using technology. On the other hand, it is kind of easy. If you can do anything, there is no restriction, and for composers and creators, restriction often actually increases our ability to be creative. For this project specifically, I wanted to try something I had not done before: I wanted to create a virtual model of traditional instruments, or at least traditional instrument performance practices, to the extent that a performer who has tacit knowledge, an ability for a certain performance style or instrument on a physical, mental and conceptual level, could transfer their abilities to this virtual implementation of the same concept. To such an extent that they would inherently be better at it than a non-musician would be. Essentially, the idea was to create a suite of VR instruments descended from bowed string instruments like the violin, viola, cello and eventually a double bass, and build them in such a way that they could co-mingle in virtual space and perform together. I believe there is no better way to do that than with a traditional string quartet: four performers who are very much connected musically, conceptually and physically in the same space.

Watch the full interview:

How did you create VR instruments and how do they differ from traditional instruments?

By nature, things are different when they are representations of an actual physical object. To create a playable violin in VR without adding an extra accoutrement of physical devices, we have to create an abstraction of that instrument. In VR, we have no physical tactile interface for our body, we have no string on which our bow will be pressing, and there is no feeling of bow pressure, which itself is the inherent driver of a bowed string sound. Similarly, we have no tactile feel for where our finger is interfacing with the neck of the instrument and which string we are pressing on. Therefore, it cannot be the same thing. We have to take the affordances offered to us in this technological platform of VR and try to find corollaries, metaphors through which we can use these affordances, without physical touch, to drive the parameters of a system that would traditionally be driven with touch. I made some efforts to make the instrument as lifelike as possible. The strings themselves are physical models of bowed strings driven by the Synthesis Toolkit (STK) created by Gary Scavone and Perry Cook. It is a wonderful set of physical models of how a string is actuated using certain parameters. Then I had to come up with different ways to drive that signal without that tactile and haptic touch and feel. What I ended up creating was an abstraction of an instrument. Certain parameters of the instrument feel very lifelike: as I move my hand up and down the neck of the instrument I am controlling a continuous pitch and finger pressure on each string, while I make a bowing gesture with my right arm, holding a virtual bow. It feels very much like I am bowing, and the gestures themselves translate very well. Other parameters have to be abstracted, so the model itself borrows from reality but has to differ in many ways.
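The bowed-string synthesis itself lives in STK, but the general idea of a delay-line physical model can be illustrated with the much simpler Karplus-Strong plucked-string algorithm. This is a rough stand-in for intuition only, not the STK bowed model used in Coretet, and every name here is illustrative:

```python
import random

def karplus_strong(freq, sr=44100, dur=0.5, decay=0.996):
    """Minimal Karplus-Strong plucked string: a delay line filled with
    noise, repeatedly averaged (a one-pole lowpass) so it slowly loses
    energy and settles into a pitched, string-like tone."""
    random.seed(0)                       # reproducible "pluck"
    n = int(sr / freq)                   # delay-line length sets the pitch
    buf = [random.uniform(-1, 1) for _ in range(n)]
    out = []
    for _ in range(int(sr * dur)):
        out.append(buf[0])
        new = decay * 0.5 * (buf[0] + buf[1])  # lowpass + energy loss
        buf = buf[1:] + [new]
    return out

tone = karplus_strong(440)  # roughly an A4 string pluck
```

A bowed model such as STK's replaces the one-shot noise burst with a continuously driven bow-string friction junction, which is what allows parameters like bow velocity and pressure to shape the tone in real time.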

Watch the musical piece Trois Machins de la Grâce Aimante:

Watch a presentation on Coretet:

Who commissioned the piece and how long did it take to make?

This Coretet project and the commission for Trois Machins de la Grâce Aimante came from Marko Ciciliani and his group at the IEM (Institute of Electronic Music and Acoustics) in Graz, Austria. Marko is a wonderful composer and researcher, and he headed up a project called GAPPP, which focused on gamified musical systems and performance practices. Initially, Marko asked me to come out and view some of the other works that were being created and discuss some opportunities. One day, while walking around the snowy streets of Graz after a session in the studio, I came up with the idea for something that I wanted to commit a few years to. The idea was to do something crazy: to work on the networking aspect of performers in VR, which traditionally has been difficult; it is not an easy challenge. I thought to do it by building an instrument, a model for which it inherently makes sense to put this effort into networking. By the time I walked back to the IEM studios, after having many coffees around the city of Graz, I had a fully formed idea in my head: to write a string quartet, build a series of instruments together as one project for GAPPP, and perform it with the performers there. Over the next two years, we realized that. I built the software, and I composed one movement of a three-movement string quartet, including a graphical score of the kind you would see in many contemporary works. Over the two years, I rehearsed with performers in Graz, and we premiered the piece at the IEM studio there; it was wonderful. Regarding the technologies necessary to put on the piece, it is fairly hardware-intensive. Each of the four performers wears an Oculus Rift headset and has their own laptop to drive it, because the standard Rift requires a dedicated machine, plus two tracking towers and Oculus Touch controllers so that they are tracked within the VR space. On top of that, there is a network, an audio server and a networked game server. Essentially, six computers, a network switch and lots of power.

What is the central message of the piece?

The design of the instrument and the composition of the work were inherently connected. The composition itself explores the abilities of the Coretet instruments. The Coretet instrument can be reconfigured with a finger touch. The neck of the instrument can be a continuous fingerboard, similar to what we would have on a cello or violin, or it can be set so that there are detents, or virtual frets, like those you would find on a guitar or some other instrument. Similarly, those frets can be moved so that the scale of the instrument neck is not at all what one would expect. The neck can be divided into a simple octave, a pentatonic scale, a whole-tone scale and a whole number of different configurations. The piece itself was composed to showcase these different configurations of the instrument, leveraging the way that the instrument itself could guide the performers on what notes they could play, and to move between that and pure freedom of expression, where they can play any note that they want. All this is guided by a traditionally printed graphical score that communicates gesture and density, and the different ways that the performers can play together are learned in rehearsal. The performance is done without a score because, in this case, wearing headsets makes it very difficult to read the score without additional technology.
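The configurable fingerboard described here amounts to a quantization step between hand position and pitch. A minimal sketch of that idea, assuming a normalized neck position and a two-octave span; the scale tables, function name and default open note are illustrative assumptions, not Coretet's actual code:

```python
# Hypothetical scale tables: degrees of each scale within one octave.
SCALES = {
    "pentatonic": [0, 2, 4, 7, 9],
    "whole_tone": [0, 2, 4, 6, 8, 10],
}

def neck_to_midi(pos, scale=None, open_note=55, span=24.0):
    """Map a normalized neck position (0.0 at the nut, 1.0 at the end)
    to a MIDI pitch. With scale=None the fingerboard is continuous
    (fretless); with a scale name, the position snaps to the nearest
    virtual fret allowed by that scale."""
    semitones = pos * span
    if scale is None:
        return open_note + semitones          # continuous pitch
    degrees = SCALES[scale]
    # Virtual fret positions: the scale degrees repeated each octave.
    frets = [o * 12 + d
             for o in range(int(span // 12) + 1)
             for d in degrees if o * 12 + d <= span]
    nearest = min(frets, key=lambda f: abs(f - semitones))
    return open_note + nearest
```

Moving the virtual frets, as described above, would simply mean swapping in a different degree table.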

Are there more things you can do with this instrument rather than with a traditional one?

Yes and no. The beauty of our traditional physical instruments is that one single solitary note can fill up an entire room. When dealing with technology systems, while it is very easy to make loud sounds, complex systems and sounds that are generated from a single touch or a single motion, that same idea of capturing attention through resonance and the physical attributes of a wooden resonating chamber is so hard to duplicate. In a sense, I feel like I am always operating at a loss, always chasing that beautiful original sound. The Coretet instruments sound like bowed strings, but they sound like processed electronic bowed strings versus the purity of an acoustic instrument.

What are the advantages of this instrument?

The advantage is that the instrument itself has no restrictions based on physics. The instrument's range can be infinite, and there are sections of the piece where the performers actually play above the Nyquist frequency for the current sampling rate of the software. It glitches, and non-linear noises are introduced that are not controllable. The ability to have these bursts of energy triggered by performers, yet not inherently controllable, is one of the wonders of dealing with electronic instruments. We can blend chance, randomization and non-linear, stochastic processes with control. It is a back-and-forth, and the ability to sculpt both our sounds and the way that instruments are played dynamically, in real time, is much harder to achieve with a physical instrument. The piece itself plays off of those abilities.
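The glitching above the Nyquist frequency is the classic aliasing effect: a sampled system cannot represent frequencies above half its sampling rate, so anything above folds back down into the audible band. A small sketch of the effect (the numbers are illustrative, not taken from the piece):

```python
import math

def sample_sine(freq, sr, n):
    """Sample a sine of the given frequency at sample rate sr."""
    return [math.sin(2 * math.pi * freq * i / sr) for i in range(n)]

sr = 44100                                 # Nyquist frequency: 22050 Hz
above = sample_sine(30000, sr, 64)         # a tone above Nyquist
folded = sample_sine(sr - 30000, sr, 64)   # its alias at 14100 Hz

# Once sampled, the two are indistinguishable (up to a sign flip):
matches = all(abs(a + b) < 1e-9 for a, b in zip(above, folded))
```

Driving a physical model past this boundary adds the non-linearities of the model itself on top of the folding, which is where the uncontrollable noise bursts come from.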

Is the piece available in video format only?

The piece itself has been played in concert several times. It is performed like a traditional concert: the quartet is arranged on a concert stage with a projection of the virtual space behind them, which shows the audience at least a two-dimensional view of what is happening in the game space that the performers themselves see. The sound is spatialized with one channel per instrument. I have performed the Coretet instruments using more channels than that, which is a whole other interesting world to explore. We have done the piece several times: it was performed twice for audiences in Graz with the same ensemble, and in Mexico City and in Marseille, in the south of France, by two other wonderful ensembles. The ability for this piece to be performed like a traditional work has already been shown. Due to its technology-heavy footprint, I, as the engineer/composer, have to be there to set up the equipment, read the ensemble, let them rehearse, show them how the piece works and work with them to perform it. It is a lot of work, but it works quite well.

What is the sensorial experience for the audience you aimed for?

In these kinds of pieces, interestingly, the audience is forced to focus on two different spaces simultaneously, while listening to one aural space that combines the two. Something interesting happens when the performers are in a VR space, or just in a game space. In a piece like Trois Machins de la Grâce Aimante, we make an effort to superimpose these two spaces. The audience sees four performers and their instruments, seated in a small circle, much as a traditional string quartet would be. In the physical space, in front of the audience, the performers are seated in the same pattern, but they are actually facing away from one another, because they cannot physically see one another anyway. Part of that constraint is that they need to face the tracking towers for the Oculus Rift controllers, but it creates a really interesting concept. In the real world, the performers, while situated near one another, are not looking at one another, so it is a play on their positioning; in virtual space, they all see one another just fine. Actually, there are moments in the piece where one of them conducts the others in different gestures, to create simultaneous entrances and attacks. The communication between the performers is never questioned; there is a play between what is seen on the screen and what is presented to the audience in the real world.

What kind of software did you use?

The instrument itself, the VR and the control systems for the Coretet instruments are all built using the Unreal Engine, the high-end commercial game engine built by Epic Games and used for games like Fortnite, so it is a very “battle-tested” gaming environment. The audio for the system is all built using Pure Data, an open-source dataflow programming language written by Miller Puckette. The communication between the two systems is done using Open Sound Control (OSC): data from each Coretet instrument is streamed in real time to the Pure Data audio engine, which hosts the STK models of the bowed strings along with several effects and processes that go into the piece.
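To make the OSC link concrete, here is a minimal encoder for an OSC 1.0 message as it would travel over the wire from the game engine to the audio engine. The address name and parameter are hypothetical; Coretet's actual address space is not documented here:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per OSC 1.0."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message carrying float32 arguments: padded
    address, padded type-tag string, then big-endian floats."""
    tags = "," + "f" * len(args)
    return (osc_pad(address.encode())
            + osc_pad(tags.encode())
            + b"".join(struct.pack(">f", a) for a in args))

# A hypothetical Coretet-style control message:
msg = osc_message("/coretet/bow/pressure", 0.72)
```

In practice a packet like this would be sent over UDP each frame (for example with `socket.sendto`) and unpacked on the Pure Data side, where the values drive the STK string parameters.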

What was the process of composing?

I think of the composition strategy as being similar to software design. It becomes an iterative process. The instrument in its initial state was fairly full-featured before I started composing, but having the system and learning how to play it are two very different things. As I explored improvising and learning how one could play this instrument that I had built, I started to come up with different gestures and feel which kinds of musical sounds and possibilities would work with the Coretet instruments. Through iteration, I started composing short studies. Those studies themselves started to focus on the different capabilities of the instrument, as if I as a composer had never played a violin; as if I had picked up the violin without any prior knowledge and tried to compose a piece for it. As I sat and played, I started to figure out different performance strategies and practices that sometimes matched traditional instruments and sometimes did not. While working my way through my instrument, I started to write these studies, and the studies themselves became different sections of what eventually became the second movement of the string quartet, which is what we initially performed in Graz. As I worked on new features of the instrument and explored them as an improviser and performer, I started to get a feel for what real performers might enjoy doing, and then I started drawing out my scores.

I think it is fair to say that if a designer or a software engineer just built an instrument and never played it, it might be the worst instrument ever created. We have to know, at least roughly, how these systems are going to work, and which gestures, sounds and performance practices are possible and good, whatever good means in this context, before we can actually say an instrument is finished or mature in any respect.

Can you imagine the piece done with other technological means or is VR the most important aspect of it?

In some sense, it is. VR is a fascinating technology, and while it has been around for a long time, the commercial push towards VR started with the original Oculus DK1 headsets, which put it into the hands of people like me, who would not otherwise be working in a lab with a high-end VR system. What you get in VR is a sense of depth, which is something you cannot really capture to the same extent using a gamepad and a screen. The depth of field and the ability to use that third dimension to control the system are very different. The performers can move around the instrument as they choose; I do it in some solo pieces. I am very much moving around the instrument to get different gestures and to create different sounds, to have a different frame of view and range of motion. All of that would be lost if we were not in VR, so the concept of depth is frightfully important when building instruments in VR; it represents a paradigm shift from working in two-dimensional systems.

Did the musicians get to test the instrument as well?

To a certain extent. The primary development and testing of the instrument was mostly me. As part of the GAPPP project, Marko and his team invited me to the studios at Graz several times, where I presented the instruments to the performers. Often, though, we would have maybe three days to learn the instrument, rehearse with it and explore it before a concert presentation. The amount of time that the performers had before any given concert to explore the instrument was not enough to feed back into the development cycle, so most of the initial testing and understanding of the instrument just came from me playing with it. I am a performer. I have played many instruments my whole life, but I was never a bowed string performer per se (I was a bad viola da gambist), though I enjoy the instruments. As a composer, I have certain knowledge of bowed string instruments, but not to the extent that the performers do, so I was lucky that many of the assumptions and design choices I made seemed to translate well for my performers. Through the GAPPP project and the interactions with its performers, I got to test out my ideas many times as the instruments matured, and I made changes to them to better accommodate the performers. All the performers in Austria were absolutely wonderful to work with during the whole project, as was the GAPPP team.

Have you made other pieces with this instrument?

I have. After the initial string “coretet”, I created a new piece using a version of the instrument that was a double bass. The piece Elegy (Ready, Set, Rapture) was commissioned by a wonderful double bassist named Jeremy Baguyos, whom I have worked with over the years. He premiered it at the International Society of Bassists conference. I perform that piece as a soloist myself too. It is a solo work for a double bass with an extended range, greater than the cello, the viola and the violin. Typically, I perform the piece with somewhere between 16 and 32 channels of sound, where each string is given its own channel and its own speaker in space, and different effects are added to the sound as well. It is an improvisatory piece that also introduced new features to the Coretet instrument, including plucked strings using a new sound model. Most recently, I have been working on a new iteration of the piece using the bass instrument with composer Matthew Goodheart, one of my colleagues at Rensselaer Polytechnic Institute, in which the Coretet instrument is used to drive transducers attached to physical gongs in physical space. We did a bunch of work at the EMPAC centre at Rensselaer, using their ambisonic speaker arrays and this system of driving transducers in real time. There is new work currently underway using the Coretet instrument in that manner.

Is this instrument open source?

In theory yes, but in practice it is very difficult at the moment. It requires work to set up and configure. However, I am happy to work with composers who would be interested. One of my goals would be to get a version of the quartet instrument packaged in a way that would be easier for people to get it on the Oculus store or the Unreal store. Another aspect is that when moving things from an experimental artistic world into an inherently commercial one, a lot of concessions have to be made. The audio engine cannot be a separate piece of software, it has to be bundled into the game engine itself, so that the technical complexities are not there and to make it easy for people to use. It is something that I have begun to work on. The goal is to get this instrument to the extent that you could just download and play it. The full Coretet multi-user setup currently still requires several machines and some technical configuration, so it is not exactly open source, but I am happy to share it.

Did you work in a team?

It was mostly a team of one. I have a dual background in traditional music composition and software engineering. I built the entire instrument myself, but the models were made by my long-term collaborator Chris Platz, with whom I have worked for over a decade. The three-dimensional models that are used to build the quartet instruments in real time come from an earlier VR piece that he and I created called Carillon, a VR performance of a futuristic networked instrument, a concert piece that we have performed over the years. The models that he created were what I recontextualized to build the Coretet instruments. The graphical 3D models were created by Chris, but I do all the programming, design, development and audio.

Would working in a team be easier?

Designing instruments is a fascinating idea, and I would, by no means, consider myself an expert in it. I do like to think of this as digital luthiery, it is similar to the idea of sculpting and creating an instrument, but with very different concerns and limitations than physical luthiery. It is a challenge nonetheless, so the technical skills are perhaps daunting to many. I think many people would approach this type of project with a team, which would work just perfectly, especially people with different skill sets like audio engineering, composing, game design and development. I have been fortunate in life to have played those roles in many different situations over the years, so for me it tends to work very well. I work in small teams to get prototypes up and running and not spend lots of time discussing and planning, but instead trying out new things and experimenting.

What have you learned or discovered while composing the piece?

I think the greatest discoveries of this project have been simultaneously how similar it can feel to a real instrument and yet how far away it is from being that real instrument. It was not something I quite expected. When I first had a playable prototype, I got extremely excited, because it worked. I had to make many choices to track the bowing gestures, so when I felt like I was bowing a string, it was very exciting. Then I realized how incredibly complex that last step was, to make it sound less like an electric violin and more acoustic, beautiful, warm and resonating. The complexity and the issues around it felt overwhelming. I would love to continue to make these instruments more and more flexible, giving them more reality and more features. But all of the details require a huge amount of work. I have already done all the foundational work, so that is exciting. I feel like I was successful in a lot of my initial attempts, but then realizing how much work would be necessary to make this sound feel like a real violin is daunting, to say the least.

Would you say some barriers cannot be overcome?

There are a fair number of barriers to this kind of work, but as technology becomes a bigger part of our lives, younger designers, developers, composers and artists are inherently more comfortable with technology than many other generations. The confluence of these skill sets of people who are comfortable as software engineers and have a depth of knowledge and interest in musical systems will only increase because the technology platforms themselves are becoming a skill set that younger artists pick up naturally. They do not necessarily have to decide to be a software engineer as computers take over every aspect of our lives, but they are generally more exposed to it than earlier generations. I think we will see more artists interested in these worlds whether it is gaming, VR, augmented reality or other XR systems that are still coming down the pipe.

What are your suggestions for young artists?

I am a huge proponent of artists being able to understand, control and generate technology systems. Learning early how to make music with software and code is a skill set that any musician should want in the 21st century, whether that means taking your skills as a traditional performer or composer and sharing them in real time across networks using audio streaming systems, or leveraging technology platforms to create instruments. In a perfect situation, the kind of students and researchers I love to work with are multifaceted people who have deep knowledge in all of these areas. Being able to reach levels of complexity in both technology and musical systems is helpful and necessary. That does not mean that engineers and designers need to work on their own, as I often do, but it gives you the ability to work on these systems without relying on someone else’s contribution. I encourage all of my students to learn more about technology and music and to bring them together, because we can do a lot of work without having to rope others in.

Source: Rob Hamilton

Composer and researcher Rob Hamilton explores the converging spaces between sound, music and interaction. His creative practice includes mixed and virtual-reality performance works built within fully rendered networked game environments, procedural music engines and mobile musical ecosystems. His research focuses on the cognitive implications of sonified musical gesture and motion and the role of perceived space in the creation and enjoyment of sound and music. Dr. Hamilton received his Ph.D. from Stanford University’s Center for Computer Research in Music and Acoustics and currently serves as an Associate Professor of Music and Media in the Department of Arts at Rensselaer Polytechnic Institute in Troy, NY.
