Pulse: a pressure sensor for touch screen devices

Concept

There is an inherent need for pressure as a form of input on touchscreen devices, but until a pressure sensor can be developed that is small, transparent, and energy efficient enough to build into the screen itself, an external solution is needed. Pulse is a wirelessly connected, pressure-sensitive button with a transparent center that can interact with touch screen devices. Pulse functions as a standalone component and uses MIDI software to connect to the user's device, so that it can be mapped to an assortment of functions.

Pressure sensitivity has numerous use cases on touch screen devices, so our team decided to focus on creating a MIDI controller to demonstrate the functionality. MIDI pad controllers such as the Akai MPK are widely used in both studio and live performance settings to create and perform music, so we started our ideation by exploring some of the tradeoffs of those devices. MPKs can be expensive, so to cut back on cost we decided to use conductive carbon paper as our pressure sensor and to take advantage of the hardware already present in touch screen devices.

Using conductive carbon paper in our phicon, we could send electricity through the device; the paper's resistance changes with the pressure applied to the surface, producing a measurable change in voltage. That change can be translated accurately into use cases such as the pressure from a paintbrush or the velocity of a drumstick on a drum head.
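As a rough illustration of this sensing principle (not our exact circuit), a carbon-paper sensor can be read as one leg of a voltage divider. The supply voltage, fixed resistor value, ADC resolution, and at-rest resistance below are all illustrative assumptions:

```python
VCC = 3.3          # supply voltage in volts (assumed)
R_FIXED = 10_000   # fixed divider resistor in ohms (assumed)
ADC_MAX = 1023     # full-scale reading of a 10-bit ADC (assumed)

def adc_to_voltage(reading: int) -> float:
    """Convert a raw ADC reading into the divider's output voltage."""
    return VCC * reading / ADC_MAX

def sensor_resistance(v_out: float) -> float:
    """Solve the divider equation Vout = VCC * R_FIXED / (R_sensor + R_FIXED)
    for the carbon paper's resistance."""
    if v_out <= 0:
        return float("inf")  # no output voltage; treat as open circuit
    return R_FIXED * (VCC - v_out) / v_out

def normalized_pressure(reading: int, r_rest: float = 100_000) -> float:
    """Map the sensor's resistance (which falls as pressure rises) onto a
    0..1 pressure estimate. r_rest is the assumed at-rest resistance."""
    r = sensor_resistance(adc_to_voltage(reading))
    return max(0.0, min(1.0, 1.0 - r / r_rest))
```

With this wiring, pressing harder lowers the paper's resistance, raises the divider's output voltage, and drives the normalized pressure toward 1.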

For our implementation we created an iPad app that recognizes the phicon (physical icon). The hollow space in the center of the phicon allows an icon or a color to be displayed, so that a single phicon can serve multiple use cases.


Product Demo

Ideation

We started ideation through discussions about our end goals. We knew that we wanted to work with music and create a product that lets people easily create and perform music, live or in the studio. We were heavily influenced by the MIT paper “Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms” by Hiroshi Ishii and Brygg Ullmer. We recognized that current technology is primarily visual and has lost much of the haptic feedback of physical interfaces, so we decided to combine visual and haptic touch in our design.

After clarifying our goals for our end product, we decided that we wanted to create a pressure-sensitive music controller. In order to come up with the designs for this controller, we immediately started ideation through sketching.

The drawing on the left shows the pieces on top of an iPad screen. The pieces are hexagonal and contain a compartment that allows them to be charged over USB. They are velocity/pressure sensitive on top and Bluetooth enabled.

The pieces can be arranged just like a drum kit. The user can arrange the pieces as they like! Pieces are programmable and can be assigned any function.

Sketch of the UI interface and the phicon hardware
Sketch of the UI interface with play, stop, mic, repeat and volume.
Sketch of the UI interface and the phicon hardware working together.

Goals

Our main goal was feasibility. We built our prototype as a test to see whether the carbon paper layout inside our phicon ring would function correctly. Because each phicon is quite cheap to produce, we believe that if this technology were developed further, it would offer an affordable way to create music. We therefore wanted to test whether these cheap materials (carbon paper, acrylic plastic, an inexpensive non-conductive material, and aluminum foil) could come together to produce a phicon that sends signals to elicit sound upon physical touch.

Implementation

Keeping with our theme of controlling velocity for musical instrument sound generation, we built a fully functional phicon. The phicons were fabricated from laser-cut acrylic, conductive carbon paper, a thick piece of plastic harvested from the trash, a piece of aluminum foil, Scotch tape, and some wire. Each phicon was wired into the trigger inputs of a Roland SPD-S sampling drum machine, which translated the pressure from the phicon into a MIDI message routed through the USB port on an Apple MacBook Pro to Ableton for sound generation. Sound was then routed out of the MacBook's headphone jack into a four-channel mixer and on to a pair of powered desktop speakers.
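The translation step the SPD-S performs in this chain, turning a pressure reading into a MIDI note-on message whose velocity tracks how hard the phicon was struck, can be sketched roughly as follows. The channel and note numbers are illustrative; the byte layout follows the standard MIDI 1.0 wire format:

```python
def pressure_to_velocity(pressure: float) -> int:
    """Map a normalized pressure (0..1) onto MIDI velocity (1..127).
    Velocity 0 would mean note-off, so a registered hit gets at least 1."""
    pressure = max(0.0, min(1.0, pressure))
    return max(1, round(pressure * 127))

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a raw MIDI note-on message: status byte 0x90 | channel,
    followed by the note and velocity data bytes (7 bits each)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# e.g. a firm hit mapped to the General MIDI acoustic snare (note 38) on
# the drum channel (channel 10, which is index 9 in zero-based numbering)
msg = note_on(channel=9, note=38, velocity=pressure_to_velocity(0.8))
```

A software sampler such as Ableton's Drum Rack receives these three bytes and scales the triggered sample's loudness by the velocity value.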

Ableton setup.

Phicons were initially cut from a 12” × 12” × ¼” piece of clear cast acrylic. The patterns were created in Illustrator and imported into Rhinoceros before cutting on the CoMotion VLSI 4.60 laser cutter.


Phicons were assembled as a sandwich. Solid acrylic pieces were used for the top and bottom. In between were an acrylic cutout of the shape (hexagon or circle), a conductive carbon paper ring, a plastic insulation ring, and an aluminum foil ring.

A look inside a rectangular version of the pressure sensor

The Finished Phicon

Once the phicons were assembled, they were wired into the drum machine's trigger input and routed to a MacBook Pro running Ableton for testing.

Evaluation

We believe that our end product was successful in proving the feasibility of our design. The pressure-sensitive aspect of the phicon, however, wasn't fully proven: the MIDI drum pad that we wired the phicon into interpreted the change in voltage from the pressure sensor simply as a hit on the pad. Mapping the full range of pressure sensitivity would have required a programmable MIDI interface, which was not available due to time constraints. Future iterations of Pulse could use Wi-Fi or Bluetooth connectivity instead of being wired into a MIDI device.

Analysis

During our product demo for the HCDE showcase, we used a Wizard of Oz setup that faked the connectivity between the iPad and the phicon. Most of the people who interacted with the interface believed that the two devices were connected and sending sound information, which helped us gauge the effectiveness of our prototype. Users liked the ability to tap the phicon to synthesize sound, and wanted more of the controls to change based on the position of the phicon on the device.

In our product demo video we showcased the ability to move the icons around the screen to control the MIDI software, which could be a future implementation. For future iterations of Pulse, extending the ability to perform touch-based gestures on the surface of the screen could give users more ways to manipulate the underlying text or UI elements.

In addition to using Pulse as a MIDI controller, there are many other applications that could be explored. One is accessibility: a user could place the phicon on top of the screen to enlarge and interact with text and UI elements on the touch screen device. Another is integrating Pulse with apps like Photoshop and Sketch to provide pressure-sensitive input for content such as brush strokes.