GESTURIZE: Directional Touch Sensitivity Layers for Fashion Tech

RAY LC
LOOMIA Creator Lab
Dec 25, 2018 · 7 min read

LOOMIA Creator Lab Fellow Ray LC is experimenting with ways to make clothing that can detect different gestures haptically and send the information to an off-board 3D environment. Follow his updates here.

Insights.

Gestures form the medium between what we physically interact with and the virtual world that interaction can create. When we make gestures, we communicate our thoughts to others; in the virtual world, gestures now let us communicate with the machines we have created.

Recognizing gestures for on-wall display.
Gestures are how we interact with the environment.

Wearable technology provides a way to bridge the gap between our ideas of primitive movement and the digital imprint we leave on our devices. Our two hands express these ideas of movement differently, producing asymmetric movements with different speeds and nuances. Can we use garments instead of computer vision to detect gestures, resulting in a more organic and physical experience? I explore the use of garment-based gestures to control a 3D environment, with applications in VR and games.

Gesture control patent from Google presages use of gestures to connect physical and digital.
Bridging the gap between physical and digital using gesture recognition. Is this possible on a garment?

Technology.

LOOMIA’s technology can translate between movements and a digital form, be it on screen or as light pulses. We can leverage LOOMIA technology to help translate gestures into a digital imprint. I imagine two basic interaction modes: (1) simple tapping (with LED feedback) and (2) physical gestures.

Setting up a circuit with Arduino Metro to read tapping and gestures from a LOOMIA layer.
1. Demo of tapping-evoked LED activation with LOOMIA tech.
2. Demo of physical gesture converted to digital readout using LOOMIA tech.

Gestures are recognized using the Adafruit MPR121 capacitive touch module, with fabrication and custom code by David Choi. However, getting the gesture recognition to work on larger surfaces turned out to be difficult, as larger conductive surfaces are harder to cover reliably.
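To make the tapping demo concrete, here is a minimal sketch of how the Arduino Metro could read a LOOMIA pad through the Adafruit MPR121 breakout over I2C and light an LED while the pad is touched. The pin number, electrode number, and wiring are illustrative assumptions, not the project's actual setup.

```cpp
// Minimal tap-to-LED sketch (assumed wiring): LOOMIA pad on MPR121
// electrode 0, LED on pin 13 of the Metro.
#include <Wire.h>
#include <Adafruit_MPR121.h>

Adafruit_MPR121 cap = Adafruit_MPR121();
const int LED_PIN = 13;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  // 0x5A is the MPR121's default I2C address
  if (!cap.begin(0x5A)) {
    while (1);  // halt if the touch module is not found
  }
}

void loop() {
  uint16_t touched = cap.touched();         // bitmask of the 12 electrodes
  bool tap = touched & (1 << 0);            // electrode 0 = the tap pad
  digitalWrite(LED_PIN, tap ? HIGH : LOW);  // light the LED while touched
}
```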

Prototype 1: Asymmetric Forms.

Given those constraints, I set out to create a gesture set that could be captured by the technology, and to create a garment that has “handedness,” that is, one whose use depends on which hand is using it. For example, in drumming, the dominant hand usually plays out the main theme while the non-dominant hand provides the rhythm. Perhaps a garment can also provide this asymmetry in its use, so that the left and right hands have different grammars, like tapping vs. gesture-making.

Prototype asymmetric top for gesture integration.

For the gesture component, I experimented with a gesture set based on simple primitives that can express various ideas of dance-like and artistic movements. The plan is to use a LOOMIA layer to translate gestures into digital primitives, which can range from an emoji on a display to a particle system in Unity3D. The idea is to use different gestures to generate different types of particles.

Evolution of a gesture for artistic expression in 3D environment.

I thought of embedding a tab layer in a piece of clothing that is stretchable and uses the two hands differently. However, the layer does not sit flat on the garment, which makes it hard to pick up the gesture points, and the gestures don’t translate well into 3 dimensions. I wanted to see how the interaction layer looks on the garment, but it’s quite big. Tapping works, but only near where the lights are, not on the main tab. I needed a more three-dimensional garment to evoke the idea of following the body’s contour while making 2D gestures in 3D space. It would be used for dance-like or live-painting-like performances, where one hand produces the rhythm while the other draws out shapes interpreted as gestures.

Prototype 2: Simplification.

Visualization of prototype 2 in CLO3D; note the simplification of forms.

For the next prototype, I redesigned the garment to have a simpler appearance, with the large lower piece for gesture interactions and the upper-left piece for simple tapping, so we have both kinds of movement-based interaction: gestures and tapping. However, I still wanted a more 3D-looking garment that evokes the idea of movement. In redesigning it, I realized that putting the head through the sleeve hole and turning the collar into a shoulder hole actually makes the garment amazingly 3D on top while preserving its flatness on the bottom for gesture recognition. I tested the tapping part with models, and the tech seems to work.

Wearing the garment asymmetrically with head through the right arm hole produces a 3D look.
The old neck hole is now for the shoulder. Gesturing is still available to the right hand.

The gesture set for the second prototype is simplified as well, as we found that the capacitive recognition technology had trouble with complex gestures. We narrowed it down to five simple primitives, with a more complex circular motion as a sixth to try if possible.

Prototype 2 in construction: playing with different places to fit arm, neck, waist for comfort and look.
Simplified gesture set for prototype 2, fitting a single gesture panel layer, used to evoke particles in fireworks.
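As a thought experiment on how such simple primitives might be read off the panel, here is a minimal classification sketch that labels a stroke from the first and last pads touched. It assumes the twelve MPR121 electrodes are arranged as a 4-wide by 3-tall grid under the gesture layer; that layout, and the specific primitive set, are illustrative assumptions rather than the actual LOOMIA electrode arrangement.

```cpp
// Hypothetical classification of a directional primitive from the first and
// last electrode touched during a stroke, assuming a 4x3 grid of the 12
// MPR121 electrodes (an assumption for illustration).
enum Gesture { NONE, TAP, SWIPE_LEFT, SWIPE_RIGHT, SWIPE_UP, SWIPE_DOWN };

const int GRID_W = 4;  // electrodes per row in the assumed grid

Gesture classify(int firstPad, int lastPad) {
  int dx = (lastPad % GRID_W) - (firstPad % GRID_W);  // column change
  int dy = (lastPad / GRID_W) - (firstPad / GRID_W);  // row change
  if (dx == 0 && dy == 0) return TAP;                 // start = end: a tap
  if (abs(dx) >= abs(dy)) return (dx > 0) ? SWIPE_RIGHT : SWIPE_LEFT;
  return (dy > 0) ? SWIPE_DOWN : SWIPE_UP;
}
```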

Creative Technology (Code).

In order to interpret gestures from the gesture recognition layer and use them to spark 3D motion, I decided to test a wireless Arduino system: two appropriately small but powerful Teensy boards, each coupled to an XBee transmitter/receiver through a specialized adapter. This keeps the smallest profile for wireless communication embeddable in clothing while still being powerful enough to adapt in the future. There will be a pair of XBees coupled to Teensys, one on the garment side and one on the computer side for interpreting the gestures. The Teensys would be powered by small portable batteries.

XBee transmitters for delivering gesture hits to the computer wirelessly.
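On the garment side, the firmware can stay very small. A minimal sketch, assuming the XBees are configured in transparent (AT) mode so they behave like a wireless serial cable, and assuming the XBee sits on the Teensy's Serial1 hardware UART at 9600 baud (both assumptions for illustration): each recognized gesture is sent as a single byte.

```cpp
// Garment-side Teensy sketch (assumed wiring and baud rate): one byte per
// recognized gesture goes out over the UART to the XBee, which relays it
// wirelessly to the computer-side XBee.
const long XBEE_BAUD = 9600;   // assumed XBee serial rate

void setup() {
  Serial1.begin(XBEE_BAUD);    // Serial1 = Teensy hardware UART to the XBee
}

void sendGesture(uint8_t gestureCode) {
  Serial1.write(gestureCode);  // e.g. 1 = swipe left, 2 = swipe right, ...
}

void loop() {
  // ... run the touch classification here and call sendGesture() on a hit ...
}
```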

On the computer side, I went with a custom-made fireworks display system in Unity3D that interprets each gesture as firing off (or terminating) certain sparks in a 3D environment. In the video below, I generated artificial gestures that evoke different particle effects in Unity. I also generated terrain in Gaia and Gena to go with the particles.

Prototype of gesture-generated particle effects in Unity3D based on interpretation of garment gesture layer content passed to the computer via XBees coupled to the Teensy boards.
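On the receiving end, the computer-side Teensy only needs to relay what its XBee hears up to the PC. A minimal relay sketch, assuming the XBee again sits on Serial1 and that the Unity scene reads the Teensy's USB serial port (baud rates are illustrative assumptions):

```cpp
// Computer-side Teensy relay (assumed setup): gesture bytes arriving from
// the XBee on Serial1 are forwarded over USB serial, where the Unity
// fireworks scene can read them from the serial port.
void setup() {
  Serial.begin(115200);  // USB serial to the computer / Unity
  Serial1.begin(9600);   // hardware UART from the XBee (assumed baud rate)
}

void loop() {
  if (Serial1.available()) {
    Serial.write(Serial1.read());  // pass each gesture code straight through
  }
}
```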

With the coding in good shape, I went back to the layer technology side and realized that an even simpler set of gestures and commands would be necessary to get reliable effects. The tapping effect also adds complication, so we will focus on demonstrating the most novel aspect: the gesture recognition. The idea is to have a 3D-looking garment for 2D gestures in 3D space, and to produce a system of rhythmic and shape gestures for dynamic interaction.

Using LOOMIA layers coupled to Unity3D through a wireless Arduino system, I envision a performance interaction garment with different motifs for each hand, one that displays both movement rhythms on one’s own body and gesture-based interactions on a screen or projection: a type of gesture interaction embodied in our own asymmetric actions.

Prototype with interaction.

The final product with the scene displayed on the computer. Ideally, we would connect to a VR headset and do full-body tracking to get the whole experience of tapping and gesturizing on your clothing, generating different patterns of particles with different spatio-temporal dynamics in the scene. The XBees are hidden in a pocket on the inside of the garment, so the wearer can actually move around in space while gesturizing the particles if she wishes.

In this project we were working with components that were still being developed at the time of construction, but with the release versions coming now, they will be more effective and robust, and will generate more reliable and satisfying interactions with the virtual environment. For further refinement, I would use a smaller panel that gives reliable gesture recognition, add it to a shoulder or accessory patch, let the model move around and dance in space, and generate a projected VR image that also lives in an abstract data-based world. The XBee and accelerometer setup on the Teensy will allow us to create a garment that projects the wearer’s creative movements at both the micro (gesture) and macro (rotation, locomotion) scales onto the virtual environment, in the form of particles, their locations, and their varied forms.

We would be able to hang our emotional expressions on our sleeve, literally.



RAY LC explores our own stories about the way we adapt to technologies. He founded the Studio for Narrative Spaces: https://recfro.github.io/