SANDWAV: Manipulating Sound with Sand

Tyler Gumb
5 min read · Nov 13, 2016

--

SANDWAV is a sand-based interface for manipulating music

Existing interfaces for manipulating and creating music focus on precision at the expense of a simple, organic interaction that is accessible to everyone.

Our aim was to create an interface that would allow users to find pleasure and escape in manipulating sound. To achieve this we combined the tactile experience of playing with sand with an exploration of sound. SANDWAV allows you to control music by reshaping sand.

You begin by placing sand onto the white elevated platform. You are free to sculpt the sand on the platform as you wish and hear sounds change and new music emerge. When you are finished, clear the platform of sand and the sounds you created will fade away.

Inspiration

Our initial inspiration came from the idea of a zen garden. We wanted to capture the calm, relaxing interaction with stone and sand and augment it with music.

First Iteration

We lined the bottom of a cardboard box with five Hall effect sensors (sensors that respond to the intensity of a magnetic field) and then covered them with sand. Sand ripples were shaped around the sensors to hint at the points of interaction.

We covered magnets in clay to create “rocks” to test the interaction.

Using serial communication, the signal from the Hall sensors was sent from the Arduino to Processing.

We coded each sensor to trigger a sound when a magnet came close enough. Once a sound is playing, the Hall sensor value is also mapped to a number between 0.25 and 3 to adjust its playback rate.
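For the curious, the Processing side of this looked roughly like the sketch below. It is a sketch rather than our exact code: the serial port index, message format, trigger threshold, and file names are stand-in assumptions.

import processing.serial.*;
import processing.sound.*;

Serial arduino;
float[] hall = new float[5];               // latest reading from each Hall sensor
SoundFile[] samples = new SoundFile[5];
boolean[] playing = new boolean[5];
int nearThreshold = 600;                   // assumed: above this, a magnet "rock" is close

void setup() {
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');               // fire serialEvent() once per full line
  for (int i = 0; i < samples.length; i++) {
    samples[i] = new SoundFile(this, "sample" + i + ".wav");  // hypothetical file names
  }
}

// assumes the Arduino prints the five readings as one comma-separated line per loop
void serialEvent(Serial port) {
  String line = port.readStringUntil('\n');
  if (line == null) return;
  String[] parts = split(trim(line), ',');
  if (parts.length != hall.length) return; // ignore incomplete lines
  for (int i = 0; i < hall.length; i++) hall[i] = float(parts[i]);
}

void draw() {
  for (int i = 0; i < samples.length; i++) {
    if (hall[i] > nearThreshold && !playing[i]) {
      samples[i].loop();                   // start the sound once a magnet is close enough
      playing[i] = true;
    }
    if (playing[i]) {
      // a closer magnet means a stronger field and a faster playback rate
      float rate = map(hall[i], nearThreshold, 1023, 0.25, 3.0);
      samples[i].rate(constrain(rate, 0.25, 3.0));
    }
  }
}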

We did four rounds of user testing on this prototype and found it had some issues. First, the relationship between changing the rate of a sample and triggering it to play was not clear to most users; in the demo we showed, only one rock could change the rate of the sound, and the rest were used for adding sounds. Second, the range of the Hall effect sensors was very limited. Third, some users instinctively reshaped the sand to see what would happen, but nothing did, which left them confused.

We realized our goal was to create an interface that had maximum reactivity. The user should not have to play detective to discover the points of input in the device.

Second Iteration

Through this user testing and experimentation, we decided to radically change our approach. Instead of the Hall sensors, whose range was far too limited, we decided to experiment with light sensors. We predicted that putting light sensors under a sheet of semi-transparent white acrylic and shifting sand on top of the surface would be a way to achieve interactivity through the sand.

Our second iteration: a cardboard box containing the Arduino, with light sensors attached to the bottom of the white acrylic

This setup proved effective, but we still had to iterate on exactly how to use the signal in the Processing code.

Our first thought was to have any covered sensor trigger a single track, with all of the sensor values averaged together and mapped to change the playback rate of that one track. The effect produced by a single track was a little simplistic.
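In code, that approach averaged the five light readings into a playback rate for one track. A rough sketch (the covered threshold and file name are stand-ins, and the serial reading is the same as in the sketch above, now filling a light[] array):

SoundFile track;                  // loaded in setup() as new SoundFile(this, "track.wav")
boolean playing = false;
float[] light = new float[5];     // latest light readings, filled from serial as before
int coveredBelow = 300;           // assumed: below this, sand is covering the sensor

void draw() {
  boolean anyCovered = false;
  float sum = 0;
  for (int i = 0; i < light.length; i++) {
    if (light[i] < coveredBelow) anyCovered = true;
    sum += light[i];
  }
  if (anyCovered && !playing) {
    track.loop();                 // the first covered sensor starts the track
    playing = true;
  }
  if (playing) {
    // the average brightness across all five sensors drives the playback rate
    float rate = map(sum / light.length, 0, 1023, 0.25, 3.0);
    track.rate(constrain(rate, 0.25, 3.0));
  }
}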

We attempted to add Hall sensors alongside the light sensors to make the interaction richer, but their range was too small to reach through the sheet of acrylic.

We were all sure that having five separate tracks, each triggered and manipulated by its own light sensor, would be too cacophonous, but we tested it anyway. To our surprise, it produced a very compelling, avant-garde soundscape. Every time we ran it, different combinations of sounds would emerge and surprise us, and, most important of all, the interface was highly reactive. Most movements of sand on the surface produced a noticeable change in the sound.
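The core of the five-track version is just a per-sensor loop. Again a sketch of the idea rather than our exact code (the threshold and rate range are assumptions, and the serial reading is unchanged from the sketches above):

SoundFile[] tracks = new SoundFile[5];  // one track per sensor, loaded in setup()
boolean[] playing = new boolean[5];
float[] light = new float[5];           // latest light readings, filled from serial
int coveredBelow = 300;                 // assumed covered/uncovered threshold

void draw() {
  for (int i = 0; i < tracks.length; i++) {
    if (light[i] < coveredBelow && !playing[i]) {
      tracks[i].loop();                 // each covered sensor starts its own track
      playing[i] = true;
    }
    if (playing[i]) {
      // less light over a sensor (more sand) slows its track; more light speeds it up
      float rate = map(light[i], 0, 1023, 0.25, 3.0);
      tracks[i].rate(constrain(rate, 0.25, 3.0));
    }
  }
}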

The response from the users we tested it on was much more positive than it had been for our previous setups.

One issue remained: how to make the interface easy to understand without instruction or a description from us. We designed three different patterns for the white panel and ran five rounds of user testing on them. According to the feedback, most participants thought the circles (left image in the figure below) looked like targets, which made them want to place sand on them.

Polishing

The final touches were getting the program to loop without error and allowing users to end the program by wiping the platform clear of sand.

We encountered an error in the Processing sound library where audio files did not loop and produced the error message “ERROR: /node/set: Synth 1 not found.” We spent a long time trying to figure out what was going on until we found a forum post explaining that the error occurs with stereo files. Converting all the files to mono solved the problem.
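A minimal way to hit the problem we were seeing looks roughly like this (the file name is hypothetical):

import processing.sound.*;

SoundFile track;

void setup() {
  // with a stereo WAV, this loop produced "ERROR: /node/set: Synth 1 not found"
  // when the file reached its end; the same file converted to mono loops cleanly
  track = new SoundFile(this, "track1.wav");  // hypothetical file name
  track.loop();
}

void draw() { }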

The program was written to fade out all the sounds once every sensor had been triggered at least once and all sensors had then stayed uncovered for three seconds. We added these requirements so the fade-out would not be triggered if a user inadvertently swept sand off the sensors for a moment while playing.
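A sketch of that logic, added on top of the five-track loop above. The three-second window is what we built; the variable names and fade speed here are illustrative:

boolean[] everTriggered = new boolean[5];  // has each sensor been covered at least once?
int uncoveredSince = -1;                   // millis() when the platform last became clear
float fadeLevel = 1.0;

void checkForFadeOut() {                   // called at the end of draw()
  boolean allTriggered = true;
  boolean allUncovered = true;
  for (int i = 0; i < light.length; i++) {
    if (light[i] < coveredBelow) {
      everTriggered[i] = true;             // this sensor has now been used
      allUncovered = false;                // ...and is currently covered
    }
    if (!everTriggered[i]) allTriggered = false;
  }
  if (allTriggered && allUncovered) {
    if (uncoveredSince < 0) uncoveredSince = millis();
    if (millis() - uncoveredSince > 3000) {
      // the platform has stayed clear for three full seconds: ramp the volume down
      fadeLevel = max(0, fadeLevel - 0.01);
      for (int i = 0; i < tracks.length; i++) tracks[i].amp(fadeLevel);
    }
  } else {
    uncoveredSince = -1;                   // sand came back; reset the timer
  }
}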

SANDWAV was displayed at the SVA IxD Open House 2016.

Technical Notes:

Schematic of Wiring
