Defining and validating our design decisions through prototyping

How we used a quintessential part of the human-centered design process to get closer to our final concept.

Alina Peng
Design Intelligence
6 min read · Aug 10, 2018


After weeks of research and ideation, our team came up with Thea (stay tuned for the case study, video, and microsite), a concept for an artificially intelligent, on-the-go navigation assistant for the blind and visually impaired community.

Thea first interprets a user’s natural speech, similar to voice assistants like Siri or Alexa, and then provides non-intrusive audio and vibrational feedback to communicate granular directionality.

In the design process, we went through a phase of prototyping to unpack our research and generate a tangible representation of our product concept. This enabled the transition from initial ideation and conceptualization to actual creation and solidification. Through multiple prototyping iterations, we determined how Thea would look and feel from a user’s behavioral perspective. Ultimately, we were able to deepen our understanding of Thea’s potential, address gaps in our design, and, to some extent, bring Thea to life.

Prototyping the form factor

Initially, we had difficulty determining Thea’s form, so we made paper prototypes of different types of wearables. Some of our rudimentary concepts included vibrational necklaces and belts that would provide 360° positioning and a circular patch that could offer cardinal directionality. By constructing models with low-fidelity materials, we were able to make adjustments quickly and inexpensively. As a result, we gained an experiential view of Thea’s visual attributes.

Image: A pile of our rudimentary paper prototypes, including paper gloves, belts, and necklaces.

We eventually decided on a haptic pad packaged in the form of rectangular strips. These pads could be placed on any part of the body that the user deems optimal for feeling the haptic vibrations. Users would ideally wear multiple pads that allow for left and right signals to be distributed on separate, wider areas of the body.

To build a model of the pad, we played with materials like silicone and kinesiology tape. The tape — made of cotton fiber and polymer elastic and typically used by athletes — provided structure without restricting the body’s range of motion. Likewise, silicone is flexible and comfortable and offers mechanical resiliency that can support vibration.

Image: A rotating 3D model of Thea’s pad form factor.

Prototyping the haptic language

After we decided on the haptic pad shape, we then had to figure out the nuances of the vibrational pulses by creating a pseudo haptic language. The language’s pulses should help orient users and prompt them to turn in a certain direction, rotate a number of degrees, or walk a specific distance. We wanted Thea to communicate information in a way that is intuitive and also adheres to a user’s body movements. In order to prototype our haptic language and determine how exactly these commands would be communicated, we employed some body storming techniques.

Image: Darshan and fellow interns testing an early haptic prototype made of three buttons and three vibrational motors.

We first took a trip to Visions, a rehabilitation and community center for the visually impaired, to meet with a youth group. There, participants tested and identified the optimal part of the body for the haptic pad. Although we designed Thea to be rather open-ended in terms of where on the body it could be placed, we found that when the pads were worn on both shoulders, users had the easiest time navigating. With this feedback, we later conducted another body storming session in the office.

For the internal session, one of us would walk around blindfolded. Another person would walk behind the blindfolded person to provide directions while tapping on his or her shoulders, essentially simulating Thea’s haptic language. These taps let us physically experience what Thea’s vibrations would be like — we steered the user to different areas of the office through turn-by-turn steps. Body storming allowed us to put ourselves in the user’s shoes, while simultaneously testing our concept.

Image: Lauren testing haptic feedback by tapping on the shoulders of another person, simulating a haptic language.

We decided the haptic language should be a series of quick consecutive pulses that cascade down the pad. This provided the most intuitive directionality for turning left and right. We also determined that a steady, rhythmic pulsing would indicate a forward motion, and a rapid pulsing would indicate the need to stop.

Image: A person placing a shoulder pad onto their skin. Thea’s quick, consecutive pulses indicate directionality.

Interacting with the youth group and with our office surroundings revealed flaws in our design and validated our key decisions. Ultimately, body storming propelled us to take our idea out of abstraction.

Refined lo-fi prototyping

Once we had determined Thea’s form factor and haptic language, we wanted to create a more refined prototype by actually engineering the vibrations onto the pads. This would act as a backbone for Thea, representing the concept’s sensory skeleton.

Image: A refined prototype of Thea’s vibrations.

We assembled a quick circuit with vibration motors, wires, and an Arduino, the open-source electronics platform. This circuit enabled us to easily illustrate the cascade of pulses and showcase complex design interactions.

Image: A schematic of our vibration prototype using an Arduino Uno circuit board.

Building these prototypes ultimately allowed us to continuously refine and validate our design decisions, making Thea inherently more valuable to the visually impaired community. Iterating across several levels of fidelity proved to be a critical part of the human-centered design process, bringing us closer to the final product concept with every step.

Every summer, interns at Moment (which is now part of Verizon) solve real-world problems through a design-based research project. In the past, interns have worked with concepts like autonomous vehicles, Google Glass, virtual reality in education, and Voice UI.

For the 2018 summer project, the premise is to design a near-future product or service that improves mobility for people with disabilities using granular location data and other contextual information. Darshan Alatar Patel, Lauren Fox, Alina Peng, Chanel Luu Hai and Alexis Trevizo are interns at Moment/Verizon in New York. Darshan is pursuing an MFA in Interaction Design from Domus Academy in Milan, Lauren is an incoming junior at Washington University in St. Louis pursuing a BFA in Communication Design, Alina is pursuing a BA in Philosophy, Politics and Economics (PPE) with a Design Minor at the University of Pennsylvania, Chanel is pursuing an MFA in Design & Technology at Parsons School of Design, and Alexis is pursuing a BS in Integrated Digital Media at NYU. They’re currently exploring the intersection of mobility challenges and technology in urban environments. You can follow the team’s progress this summer on Momentary Exploration.
