Pulse Technical Interview

Iain Nash
6 min read · Jun 5, 2019


This article is a companion to Pulse: A Raw Bioimmersive Experience, a project created in collaboration with Lennie Zhu.

The technical implementation of this idea quickly became complex for several reasons:

  1. As the experience was self-guided, we needed a user control interface.
  2. We wanted to collect basic information (participants’ names and phone numbers) with the aim of gathering feedback and inviting participants to a future series of related installations.
  3. People needed to be guided through setting up the biofeedback heart-rate sensors themselves.
  4. Lighting intensity and heartbeat volume needed to be controlled as part of the experience flow of the installation, both outside and inside the canopy.
  5. We needed a relatively simple way to deploy all of the applications, servers, and additional input/output components.

Information collection

An iPad-based interface was chosen because it allowed for simple data collection and gave participants a familiar way to guide themselves through the experience. The iPad displayed a web interface that communicated over WiFi with the show computer, a full desktop machine that coordinated the installation and used desktop programs to control the lighting and synthesize the multichannel heartbeats.

The iPad showed a static web application that communicated with a Node.js-based controller. This control program acted as the installation’s nervous system: it handled the signaling that created the experience flow and coordinated the sound, lighting, and data programs over the MIDI protocol. The web application collected user feedback, provided the timing that triggered the different events in the experience, and walked users through an explanation of the installation and a calibration step to make sure they were getting a good reading from the heart rate monitor.


Using a new framework, Vue.js, to prototype the installation worked out well. Its integrations and plugins, Vuex and Socket.io in particular, made real-time socket communication between the coordination program and the interface easy. Lennie started prototyping the interface wireframe and worked with Iain to code the interactions within the interface. This project required a fusion of physical installation design and construction with technical synthesizer design, user experience design, and application development.
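As a rough sketch of how that integration can look, the Vuex store owns the Socket.io connection and translates socket events into mutations. The event names, server address, and store shape below are illustrative rather than the exact production code:

```
// Sketch of the iPad app's Vuex store (Vue 2 / Vuex 3 era); names are illustrative.
import Vue from 'vue';
import Vuex from 'vuex';
import io from 'socket.io-client';

Vue.use(Vuex);

// The coordinator runs on the show computer on the local WiFi network.
const socket = io('http://show-computer.local:3000');

const store = new Vuex.Store({
  state: {
    connected: false,
    phase: 'intro',               // intro -> contact -> calibrate -> gaze -> done
    signal: { A: null, B: null }, // latest BPM per sensor, for the calibration screen
  },
  mutations: {
    setConnected(state, value) { state.connected = value; },
    setPhase(state, phase) { state.phase = phase; },
    setSignal(state, { sensor, bpm }) { state.signal[sensor] = bpm; },
  },
  actions: {
    submitContact(context, participants) { socket.emit('contact_info', participants); },
    startShow({ commit }) { socket.emit('show_start'); commit('setPhase', 'gaze'); },
    endShow({ commit }) { socket.emit('show_end'); commit('setPhase', 'done'); },
  },
});

// Socket events become store mutations, so components only watch the store.
socket.on('connect', () => store.commit('setConnected', true));
socket.on('disconnect', () => store.commit('setConnected', false));
socket.on('sensor_signal', (reading) => store.commit('setSignal', reading));

export default store;
```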

The user’s experience began with an introduction and a screen to share contact information, allowing us to continue the piece with willing participants. After entering their information, participants moved to a screen that guided them through setting up and checking the signal from the earlobe-attached heart rate sensors. Once the sensors were set up, the screen faded in the prompt to gaze into each other’s eyes for four minutes and faded in the heartbeat before fading to black. A final touch was a skip control, in case participants left the experience early (four minutes is a long time in this day and age!).
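The skip behavior is simple to sketch: the gaze phase is just a cancellable timer that tells the coordinator the show is over, either after four minutes or when the skip button is tapped. The action names here match the hypothetical store above:

```
// Sketch of the gaze-phase component logic (Vue 2 options API).
const GAZE_DURATION_MS = 4 * 60 * 1000;

export default {
  data() {
    return { gazeTimer: null };
  },
  methods: {
    startGaze() {
      // Ask the coordinator to start the show, then arm the four-minute timer.
      this.$store.dispatch('startShow');
      this.gazeTimer = setTimeout(() => this.endGaze(), GAZE_DURATION_MS);
    },
    skip() {
      // Participants left early: cancel the timer and end right away.
      clearTimeout(this.gazeTimer);
      this.endGaze();
    },
    endGaze() {
      this.$store.dispatch('endShow');
    },
  },
};
```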

Biofeedback heart rate sensors

The next step was finding a way to capture live heart-rate data with an easy, intuitive process for participants. Setting up on-body microphones correctly would have been difficult, and we wanted a solution that let participants move around a bit without being jarred by additional sounds. After searching for and trying multiple options, clip-on earlobe heart-rate monitors proved to be the most robust yet elegant solution. Other options we evaluated included building custom bracelets and writing a custom Apple Watch app to synchronize heart rates. We opted for a wired controller to allow for easier connection and debugging, and ended up using older-model heart rate sensors interfacing over a standard Grove connector and an Arduino interface board. The Arduino communicated with the control program over USB serial.
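On the computer side, reading the sensors boils down to parsing lines from a serial port. A sketch using the serialport package, assuming the Arduino prints one "sensor:bpm" pair per line; the device path, baud rate, and line format are assumptions:

```
// Sketch of the serial side of the coordinator (serialport v8-era API).
const SerialPort = require('serialport');
const Readline = require('@serialport/parser-readline');
const { EventEmitter } = require('events');

const heartbeats = new EventEmitter();

// Device path and baud rate depend on the board and the OS.
const port = new SerialPort('/dev/tty.usbmodem14101', { baudRate: 115200 });
const parser = port.pipe(new Readline({ delimiter: '\n' }));

// Assumed line format: "A:72" or "B:68", one BPM reading per sensor per line.
parser.on('data', (line) => {
  const [sensor, bpm] = line.trim().split(':');
  const beatsPerMinute = Number(bpm);
  if (sensor && !Number.isNaN(beatsPerMinute)) {
    heartbeats.emit('beat', { sensor, bpm: beatsPerMinute });
  }
});

port.on('error', (err) => console.error('Serial error:', err.message));

module.exports = heartbeats;
```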

The second component was the Pure Data based synthesizer, built on a project called LovelyHeart that synthesizes a realistic-sounding heartbeat from ECG signal patterns. That project only synthesized one heart rate, so after learning Pure Data’s interface, I cloned the signal chain to make two independently controlled heart rate synthesizers with volume and BPM controlled via MIDI. I used a four-channel composite audio output, which I then split into two different output devices (one was the Bluetooth audio device and the other the analog system audio output). One issue I ran into with the software was that the output settings needed to be configured before enabling the synthesizer; otherwise the second two channels were never output.
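From the coordinator’s point of view, the synthesizer is just a MIDI device with a BPM and a volume control per voice. A sketch of that mapping using the easymidi package; the controller numbers and channel are placeholders, not the patch’s actual assignments:

```
// Sketch of the MIDI control surface for the two heart-rate voices (easymidi).
const easymidi = require('easymidi');

// Open a virtual MIDI output that the Pure Data patch can listen to.
const midiOut = new easymidi.Output('Pulse Synth Control', true);

// Placeholder CC assignments: one BPM and one volume controller per voice.
const CC = {
  A: { bpm: 20, volume: 21 },
  B: { bpm: 22, volume: 23 },
};

function setHeartRate(voice, bpm) {
  // CC values are 0-127, which conveniently covers resting heart rates.
  const value = Math.max(0, Math.min(127, Math.round(bpm)));
  midiOut.send('cc', { controller: CC[voice].bpm, value, channel: 0 });
}

function setVolume(voice, level) {
  midiOut.send('cc', { controller: CC[voice].volume, value: level, channel: 0 });
}

module.exports = { setHeartRate, setVolume };
```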

Show control and timing were coordinated by messages sent from the front-end iPad application to the main Node.js coordinator application. The coordinator used MIDI to communicate with the Pure Data synthesizer and the QLC+ lighting cue control app, a USB serial port to communicate with the Arduino, and a websocket connection to communicate with the iPad. Node.js was a great fit here because it made managing many connections at the same time relatively simple: event handlers simply notified the other modules, each of which handled its own connection state and logging.
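Put together, the coordinator is mostly event plumbing. A condensed sketch, reusing the hypothetical serial and synth helpers above (the lighting helpers are sketched in the lighting section below); the module names and event names are illustrative:

```
// Condensed sketch of the coordinator's event plumbing.
const { Server } = require('socket.io');
const heartbeats = require('./serial-heartbeats'); // EventEmitter from the serial sketch
const synth = require('./midi-synth');             // setHeartRate / setVolume helpers
const lighting = require('./midi-lighting');       // QLC+ cue helpers, sketched below

const io = new Server(3000, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  console.log('iPad connected');

  socket.on('show_start', () => {
    lighting.showStart();
    synth.setVolume('A', 100);
    synth.setVolume('B', 100);
  });

  socket.on('show_end', () => {
    lighting.idle();
    synth.setVolume('A', 0);
    synth.setVolume('B', 0);
  });
});

// Heart rates flow to the synthesizer (and the calibration screen) at all times,
// whether or not a show is running.
heartbeats.on('beat', ({ sensor, bpm }) => {
  synth.setHeartRate(sensor, bpm);
  io.emit('sensor_signal', { sensor, bpm });
});
```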

The installation was loosely coupled, allowing it to keep running even when not every system was connected. When the Arduino’s updates stopped coming through, the previous heartbeat timing was preserved. Additionally, skip functionality was added to allow the four-minute segment to finish early. A mock MIDI heart rate generator was also written to make UI and system testing easier, since the different modules could fail gracefully or be left disconnected.
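The mock generator is what made the rest of the system testable without wiring anyone up. A sketch, exposing the same "beat" events as the hypothetical serial module so the MIDI routing downstream stays unchanged; the drift model is arbitrary:

```
// Sketch of a mock heart-rate source with the same interface as the serial module.
const { EventEmitter } = require('events');

function createMockHeartbeats() {
  const emitter = new EventEmitter();
  const state = { A: 72, B: 68 };

  setInterval(() => {
    for (const sensor of Object.keys(state)) {
      // Random-walk a couple of BPM either way, clamped to a plausible range.
      state[sensor] = Math.max(55, Math.min(110, state[sensor] + (Math.random() * 4 - 2)));
      emitter.emit('beat', { sensor, bpm: Math.round(state[sensor]) });
    }
  }, 1000);

  return emitter;
}

// Swap this in for the real serial module when no Arduino is connected.
module.exports = createMockHeartbeats;
```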

Lighting & volume control

Lighting cues were set using the QLC+ app to differentiate between a currently running “show” (when people were listening to each other’s heartbeat) and an idle state, when people were reading the overview, were outside the canopy, or were switching places. When the “show_start” message arrived, the coordinator sent a MIDI message to toggle a virtual lighting cue. Afterwards, another message cleared all cues and then toggled the “idle_state” cue, which was programmed to brighten all of the lights.
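QLC+ can map incoming MIDI notes to virtual console buttons, so from the coordinator the cue changes look like short note taps. A sketch with easymidi; the note numbers are placeholders for whatever the virtual console is actually mapped to:

```
// Sketch of the lighting-cue helpers (note numbers are placeholders).
const easymidi = require('easymidi');

const midiOut = new easymidi.Output('Pulse Lighting Control', true);

// Assumed mappings to buttons in the QLC+ virtual console.
const NOTES = { show: 60, clearAll: 61, idle: 62 };

function tap(note) {
  // Press and release the mapped button: note-on followed by note-off.
  midiOut.send('noteon', { note, velocity: 127, channel: 0 });
  midiOut.send('noteoff', { note, velocity: 0, channel: 0 });
}

function showStart() {
  tap(NOTES.show);     // dim to the running-show look
}

function idle() {
  tap(NOTES.clearAll); // clear any running cues
  tap(NOTES.idle);     // then brighten everything for the idle state
}

module.exports = { showStart, idle };
```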

For volume control, a MIDI control message was sent on “show_start” to turn the synthesizer volume up for both participants, and after the experience ended, MIDI control messages were sent to turn the synthesizer volume back down. At all times, the heart rates published by the sensors over USB serial were routed as MIDI messages to the synthesis application.

Deployment

During the show, the desktop applications and servers ran on a Mac Mini placed off to the side of the installation. Wires ran from the Mac into a suspended cloth-covered electronics enclosure containing the Arduino and the audio/power connectors for the headphones. Two string lights illuminated the inside of the canopy, a few static wash lights were set up, and a large incandescent lamp outside let participants know when the four minutes were over and when people were able to enter. Setup was accomplished by following a set of directions on the Mac Mini’s desktop, using the iPad to VNC into the Mac and start all of the applications that worked together to run the experience. The installation had no dependency on the internet, only using WiFi to connect the iPad to the local web server running the front-end application and the socket server.

Next Steps

The next steps for this project are to improve 1) the wired hardware and 2) the dependency on a desktop environment. By making the installation more modular and mobile, and by moving the controller code to a mobile device, we can set up the installation more easily and continue iterating with a more wireless, lightweight hardware setup.
