Behind the Screens: Marije Baalman
An interview with livecoder and performer Marije Baalman.
Marije Baalman is an artist who works on the border between art and technology. Although she started as a music performer, her work has become more diverse over the years: it has grown more conceptual, and while sound remains an important medium, she no longer restricts herself to sound alone. Theatricality and concept have come to play a more important role.
In this interview we’d like to talk about her practices and tools, her work in the community, and the way she copes with the drastic changes in her practice brought about by the corona crisis.
What was your first encounter with live coding, and what are your sources of inspiration?
That goes a while back. I guess my first encounters were in 2003. That year I also met Ge Wang, who had just made the first release of ChucK, at the International Computer Music Conference (ICMC) 2003. At that time I did not own a laptop, so it was not something I was interested in. Then in 2004, I took SuperCollider classes with Alberto de Campo, who was part of Powerbooks Unplugged, an ensemble of livecoders who viewed the laptop as the new guitar and in their performances always sat amidst the audience, using only the laptops’ built-in loudspeakers. In 2005 there was also a Toplap night at c-base in Berlin during the Club Transmediale (now CTM) festival, with performers like Craig Latta and the duo klipp av (Nick Collins and Fredrik Olofsson), and perhaps also Powerbooks Unplugged. I don’t recall exactly, and I can’t seem to find the programme online anymore. At conferences like ICMC there were already several papers and performances featuring livecoding; I remember an incredibly hot venue in Barcelona in 2005, where Ge Wang and Nick Collins held a livecoding battle between ChucK and SuperCollider.
I was in close contact with Powerbooks Unplugged, specifically Alberto de Campo, and I even once performed together with them in New York, in an off-Broadway venue. That was quite hard, as it required quite a bit of practice, also to get into their performance practice. But it was inspiring. Their approach to the laptop is something that has inspired me: to really see the laptop as an object, as a musical instrument, and not something that just happens to be there, hidden on stage on big tables with a mess of cables, amplified with a huge PA (public address system, ed.). That is something I have taken along in my own livecoding performances.
The first time I used code as an interface for a performance was in Schwelle II, a solo dance performance with interactive sound and light. I developed the sensing system and did the interactive sound design for this show. During the show, I was more or less the performer in the back of the room, playing together with the dancer, Michael Schumacher.
There, code as an interface was just a way of controlling the cue list during the show, while still having the flexibility to change things on the fly. The most dangerous thing I did during one of the performances was live debugging. Luckily it worked out and the system didn’t crash; that would not have been cool with a couple of hundred people in the audience. Afterwards, it turned out it was also not the cleverest fix for the bug.
But the excitement of coding in front of an audience is interesting. I’ve given presentations to people who work in ICT, and they are quite nervous about the idea of doing something like that. It’s the fear of making mistakes. But making mistakes is an essential part of coding, of problem solving: it is like taking different viewpoints on a problem, and that is part of the process. Process is performance. The risk of failure is what makes performance interesting.
Then in December 2008, a friend of mine, Kassen, asked me to do a livecoding performance at De Vinger in The Hague. I was a bit sceptical, but took up the challenge. I started thinking about how I could make it essentially about livecoding, and decided to use only the microphone as the source sound, manipulated through livecoding. Later on I also added other ‘sensors’ in the laptop to control the sound, mapped to the sound processing through the code. At the time, it was possible to read the internal accelerometer on the hard drive and use its data. So it became a real laptop performance, where the laptop is on the lap. This is the performance ‘Code LiveCode Live’.
What I’ll play for the 10 Minute Challenge is an adapted version of this performance. Modern Linux kernels no longer let me access the accelerometer data, but on my current laptop I can read the touchpad and get interesting data out of that sensor.
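The idea of mapping raw sensor readings to sound-processing parameters can be sketched roughly as below. This is a hypothetical illustration, not Marije’s actual code: the touchpad value range and the cutoff-frequency target are assumptions, and the scaling function mimics SuperCollider’s linlin.

```python
# Hypothetical sketch: scaling a raw touchpad coordinate to a
# sound-processing parameter, in the spirit of mapping laptop
# sensors to livecoded sound. Ranges are assumed, not measured.

def linlin(value, in_min, in_max, out_min, out_max):
    """Linearly map value from one range to another (like SuperCollider's linlin)."""
    value = max(in_min, min(value, in_max))  # clip to the input range
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

# e.g. map a touchpad x position (0..3000 on a hypothetical device)
# to a filter cutoff frequency in Hz
cutoff = linlin(1500, 0, 3000, 200.0, 8000.0)
```

In a performance, such a mapping would typically be rewritten live, changing which sensor drives which parameter.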
I think it is also a fitting performance for this Covid-19 time, in which our lives are locked into the laptop so much, through working at home and having mostly online meetings and screenings.
Which platform do you use and why?
My main tool is SuperCollider. Since I first saw it in 2001/2, when Alo Allik demonstrated it and gave some classes about it at the Institute of Sonology (Joel Ryan also gave some demonstrations), it appealed to me more than the other environments that were shown back then. When I bought a laptop around 2004, I specifically wanted to use SuperCollider, and I was in the lucky position that Alberto de Campo was a guest professor at the TU Berlin, where I was, so I could learn from him. After 16 years of using it, I am very familiar with it, and it is still my main tool.
The sensor platform that I first developed in Montreal (with a team of wonderful collaborators), the Sense/Stage MiniBee, is also a core element that I use often in my performances. I still develop the platform further, sometimes based on other artists’ needs, but development is driven mainly by my own practice.
If necessary, I also use other tools, or build them myself. That was one of the reasons to choose Linux, as it is much easier to build your own software there than on other operating systems. I use only open source software for my artistic work, as it gives me the flexibility to change the tools into what I need them to be, or to create small software modules that I can interconnect as needed.
I mostly use Python or C++, making use of open source libraries for particular tasks (such as OpenCV for computer vision, or liblo for OSC messaging). Often these are just wrappers that make particular programs generate OSC data for use in SuperCollider, or that translate from one protocol to another.
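A minimal sketch of the kind of wrapper described above might look like this: packing sensor values into an OSC message by hand and sending it to SuperCollider over UDP. The address pattern `/touchpad` is an assumption for illustration; 57120 is sclang’s default listening port. In practice a library such as liblo or python-osc would handle the encoding.

```python
# Minimal hand-rolled OSC sender (illustrative sketch, not Marije's code).
# The address pattern "/touchpad" is a made-up example.
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message carrying 32-bit float arguments."""
    msg = osc_pad(address.encode())                      # address pattern
    msg += osc_pad(("," + "f" * len(args)).encode())     # type tag string
    for value in args:
        msg += struct.pack(">f", value)                  # OSC data is big-endian
    return msg

# Fire-and-forget UDP send; sclang listens on port 57120 by default.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/touchpad", 0.42, 0.8), ("127.0.0.1", 57120))
```

On the SuperCollider side, an OSCdef registered for the same address would receive these values and map them onto synthesis parameters.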
How has live coding influenced your way of making things?
Livecoding brings a fluency in making programs on the fly, creating interactions between concepts. It allows you to iterate quickly on concepts and try things out as you are doing them. This is useful not only when performing live, but also during rehearsals for other types of work, where the code will be pretty fixed in the end, but where you need to be able to make changes quickly during development. In the projects where I don’t have this possibility (like the projects I did with kites, or an installation like Just Noticeable Difference, where I programmed vibrations to be experienced in a dark space), I really miss it, and I face another challenge: programming, trying out a piece, remembering what needs to be tweaked and how, and then trying it again. That is a much slower process.
The pieces I make that involve, or focus on, livecoding all tend to be conceptual. I have specific questions that I want to explore with them, and they always reflect in one way or another on livecoding as a practice:
- Code LiveCode Live, described above.
- Wezen — Gewording is about the link between gesture and sound, and uses code to explore these relationships.
- Etudes pour le LiveCoding a une Main explores the use of a different interface for coding: the one-handed Twiddler keyboard, and places practising the livecoding instrument in the classical context of etudes for learning an instrument. It also raises questions such as: is playing back a script from a previously livecoded performance still a livecoding performance? By studying one such performance, by Fredrik Olofsson in 2007, I learnt a lot about his thought process during that performance. In my own performance I also engage in a dialogue with him, back in 2007, reacting in my comments to his comments to the audience.
- GeCoLa — the system used in ‘the machine is learning’ — looks at using gestures to write code.
In a sense with these performances I am breaking down the laptop, which seems to be the core of many livecoding performances.
In 2010, for the Linux Audio Conference held in Utrecht, I was asked to host some kind of competitive livecoding event. Livecoding competitions were something of a thing in the Netherlands back then (organised by Marcel Wierckx), but I wasn’t really fond of the concept: music is about collaborating rather than competing. So I took my own approach and created a kind of Olympic games, where participating is more important than winning, and combined the event with an idea from Alberto de Campo, who had made a kind of ‘Oracle’ that would prompt participants with a set of units from which to create a new sound or pattern.
So in a sense, although I use livecoding, I am also critical of it as a performance practice.
Find more of Marije’s work here:
— — —
This article is part of the 10 Minute Livecoding Challenge by Creative Coding Utrecht and Netherlands Coding Live — a series of events where digital artists and live coders create a piece in ten minutes.
The 10 Minute Live Coding Challenge is sponsored by Stimuleringsfonds Creatieve Industrie.