Augmented Body

Sojung Pak
23 min read · Aug 29


Exploration and Conceptualization: Research into sensory perception, interactive design principles, augmentation techniques, and methods of altering human senses.


Sensory Perception:

The ability to process stimuli in the environment -> how we learn and gain information about the environment using our senses in order to identify and respond to the stimuli

  • top-down (our experience) vs. bottom-up processing (sensory input)
  • sensation is a physical process (signal), while perception is psychological (response)
  • attention and motivation determine what we sense/perceive
  • sensory adaptation: we no longer perceive stimuli that are constant
  • cultural factors and personality also play a role
  • so what are our senses? vision, hearing, smell, taste, touch, pain, temperature, light, mechanoreception (vibration/pressure), etc.
  • and what are the relationships between the senses—how do they combine to inform the body? synesthetic experiences?


Interactive Design Principles:

Visibility: the more visible an element is, the more likely users are to notice it and know how to use it

Feedback: making it clear to the user what action has been taken and what has been accomplished

Constraints: limiting the range of interaction possibilities for the user to simplify the interface and guide the user to the appropriate next action

Mapping: having a clear relationship between controls and the effect they have on the world

Consistency: having similar operations and similar elements for achieving similar tasks

Affordance: an attribute of an object that allows people to know how to use it


Augmentation Techniques:

Human augmentation is the ability to perform actions, whether physical or mental, with the help of tools that practically integrate into our bodies — pushing the limits of our natural capabilities.

  • increase in our natural abilities
  • replication (of ability, ex. prosthetics)
  • supplementing (enhancing existing ability)
  • exceeding (abilities that go beyond human limits)


Methods of Altering Human Senses:

  • use the brain’s neuroplasticity (its ability to constantly change throughout an individual’s lifespan) to expand and augment our senses
  • chemical haptics: a system of wearable patches that stimulate the skin through five different chemicals they contain
  • sensory illusions: overstimulation of the senses (color, weight, smell, auditory, tactile)
  • sensory rivalry: one stimulus inhibits the perception of another
  • techniques like amplification, dampening, distortion, isolation, substitution, synesthesia, biofeedback, wearable technology, chemical manipulation, and mindfulness and meditation



Need to decide on what sense I am focusing on so I can narrow my interaction, but I think I want to work on creating some prototype/wearable technology that enhances/replaces a sense for someone who has an inhibited sense—something with a very clear function. I also need to consider how it would be used in a group setting and in what context that would be necessary.

Sensory processing disorder (SPD): extreme reactivity to tactile, auditory, or visual stimulation, living in a state of perpetual fight, flight, or freeze in reaction to stimuli -> what if the technology could detect changes in bodily response and use other senses to bring the body back to a state of homeostasis?

How can other senses replace/replicate the same bodily response as another sense? I'm focusing on techniques of substitution.

  • vision, hearing, smell, taste, touch, pain, temperature, light, mechanoreception (vibration/pressure)
  • pain: touch and mechanoreception (vibration/pressure) -> not sure if I want to focus this project on pain because to trial it you would have to inflict pain in some capacity
  • taste and smell are highly connected to one another
  • hearing and mechanoreception (vibration/pressure) (ie. soundwaves to listen to music) -> want to do more research on how the experience is different for the deaf (leaning into the idea of a synesthetic experience where one sense is understood through a different sense, and creating a unique experience for the hearing impaired while educating others on what that experience is like). I think vibration is a common way for the hearing impaired to experience sound, but it isn't well understood by people who are accustomed to being able to hear.
  • I think it would also be helpful to look at existing examples of technology to get a better sense of limitations for what can be done and gaps in current research


  • 7Up Concert for the Deaf (I'm also thinking about having hearing participants go into the experience with that ability inhibited, to better understand the synesthetic experience of those who are hearing impaired). This includes a wearable vest with vibrations and visualizations of the music.
  • Haptic Suits
  • Sound to Paintings
  • Maybe it goes beyond just a wearable device to make a full exhibit/ spatial experience

The most common form of synesthesia, researchers believe, is colored hearing: sounds, music or voices seen as colors. Most synesthetes report that they see such sounds internally, in “the mind’s eye.” Only a minority, like Day, see visions as if projected outside the body, usually within arm’s reach.

Reading this, I can also see how projections can be incorporated into visually representing audio.

“We tend to think of our experiences, and especially the visual system, as being bottom-up,” he remarks. “But there are many instances where meaning goes back down and influences our lower-order perception of the world. Synesthesia is just one very rare and exceptional example of that.”

Synesthesia is an example of how we use our prior knowledge and experiences to inform our bodily response and understanding of the stimuli. Even without music, speech sounds can be converted to vibrations to communicate.


In this form of synesthesia, specific sounds can produce tactile sensations in various parts of the hearer’s body. Different types of music might provoke different levels of pressure, or some spoken words might feel prickly while others seem smooth or soft.

Some characteristics of auditory-tactile synesthesia:

  • It is consistent.
  • The degree of focus and relaxation is important.
  • The quality of the sound also has an impact.
  • The tactile effect can be cumulative.
  • It can sometimes be accompanied by a perception of color.
  • It often coexists with auditory-visual synesthesia.

Based on this, I do think I want to focus on the auditory -> visual and tactile synesthesia.

The Experience:

I want to replicate the synesthetic experience, which is rare for most people, but would enhance the experience for others. The first sense would be vibration, and the second sense would be visual. I'm unsure whether it makes sense to simply replicate the synesthetic experience, since it's something that already exists but is rare for most people. But because my idea encourages more interaction with the external environment and with other people in a way most people haven't experienced before, I think it does make sense.

I was also reminded of videos I watched where deaf people try to replicate or predict sounds with their mouths. I think this could also be an interesting concept to see the difference between expected sounds and actual sounds. Similar to this, most people don’t visualize or feel sound as those with synesthesia do, so they have expectations as to what it could visualize/feel like.

  • making sounds yourself -> interacting with the environment and the objects in the environment
  • listening to sounds -> vibrations
  • music? -> layering sounds together
  • creating sounds reflects on the outside environment? maybe you can see certain animations (ar-style?, outward projections) in the world around you while feeling the sounds on your body
  • If it’s something wearable, it would have to be full-body to replicate the full synesthetic experience OR if this is too difficult, focusing on the tactile experience of just the ears (ie. ASMR) -> do more research on types of wearables
  • Have people first predict or try out what they think it would feel/look like, and then actually feel/see what people with synesthesia feel/see? There could also be an opportunity to see what other people thought—people with and without a hearing impairment—in order to see those differences and better understand the experience of either group.

Things to consider:

  • What is the external environment? Is it an installation-type space with screens around for the projections? Is it inside or outside? Is the device transportable so that it could take place in any area? What interactive objects/artifacts are placed in the environment?
  • How is the technology working? Is it a wearable device? Is it built into the physical environment? What senses are being blocked and what senses are being amplified?
  • What is the overall experience (story) and what is the purpose? What should users leave with?

Readings (09/5)

Dallas Taylor Video: What Silence can teach you about sound

John Cage’s 4'33" seems more like a performance and experience rather than a musical piece. I think my first thought hearing of this piece was, how can it show any musicality or talent given that playing “rest” is something anyone can do?

Thinking of anechoic chambers, I think rooms like this strike me as a form of punishment, the same way prisoners are placed in rooms of solitude with nothing to do or see to pass the time. Rooms like these are forms of punishment because they go against what we need as humans to survive and be healthy. However, silence can also be used as a form of meditation and calmness. I think there’s a reason why so many people are drawn to certain sounds to help them relax or go to sleep—for instance, people listen to sounds of heartbeats and nature. Headphones these days have built-in soundproof features because people want to block out outside sounds.

Because I am looking into how other senses can replace sound, this video was useful in thinking about the experience of hearing. I'd like to focus on thinking about certain analogies to create and replicate the experience of hearing sounds around us. What does listening to a heartbeat remind me of? When we hear our heartbeats, we're simultaneously feeling that thumping in our chest—the motion that produces the sound is also important. We also associate sounds with certain feelings and moods; these associations are how we learn about the world around us. I can sometimes rely solely on sound to recognize people or animals, such as by footsteps and laughter. We also form "expectations" using our prior knowledge, so even sounds we hear for the first time can be made sense of. Our bodies are accustomed to responding automatically to noises. If I hear a loud thump that catches me off guard, I'll be alert and possibly scared. As I was watching this video in the studio, the main sounds around me were the two fans in the room. The sound of the fans kept changing as they rotated left and right, showing how spatial perception can also affect sound.

Lupton/Lipps Part One

This article in particular was helpful in learning about existing technologies and thinking about various perspectives. I was especially intrigued by the aspect of “how things shape us”, whether it is as extensions of our bodies or providing a response. The idea is that when we interact with the environment around us, we also leave our traces in the environment from that interaction. I also found out that deaf people use reflective spaces to see what is happening around them. I think this would be an interesting aspect to consider in my design ideas.

I was also immediately drawn to Liron Gino's listening device. I was struggling to understand how a full-body vibration experience might work without a full-body suit, but I was impressed by how this device has removable and attachable modules that make the experience customizable. I think something like this would also be useful for my experience, where people could guess where sounds could be experienced on the body. This could be done on their own body or on the environment around them—the exhibit space itself could become a reflection of the human body.

Explore existing experiential design projects, wearable technologies and artifacts, and artistic experiments that have successfully impacted sensory perception.

Other than the examples I’ve listed already, I decided to look into more examples of wearable technologies that transform sound into vibrations. I also wanted to find examples of technology that experimented with how sound can be associated with color and other visual components.

  • Neosensory wrist device: sends vibrations via the nervous system to the brain (haptic feedback), example of sensory substitution
  • Woojer: “Oscillating Frame” that accurately reproduces frequencies as physical vibrations, wearable strap that goes around chest/torso
  • Ontenna: Hairpin that uses vibration to relay sounds to the wearer
  • Oh!: record-shaped speaker, converts sound into visual patterns that are emitted via LED lights
  • There seem to be many examples of “wearable technologies” that are worn around the wrist or as a vest. Most use built-in microphones to detect sounds from the outside environment that are then translated into vibrations. Similar to audio visualizations, the soundwaves directly transfer over to physical vibrations, though the frequency must be rescaled. I think I need to better understand how people with synesthesia experience sound. There are many examples of sound and visual synesthesia, as many visual artists will document their experiences, but sound and touch are much harder to understand. It’s also important to note that experiences are not identical, and those with synesthesia will also have varied experiences.
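
The bullet above notes that soundwave frequencies must be rescaled before they can be felt as vibrations. As a minimal illustrative sketch of that rescaling (the audible range of 20 Hz–20 kHz and a haptic range of 50–250 Hz are assumptions for illustration; real devices differ), a logarithmic mapping matches how we perceive pitch:

```python
import math

# Assumed ranges -- real devices differ; these numbers are illustrative.
AUDIBLE_LOW, AUDIBLE_HIGH = 20.0, 20_000.0   # Hz, approximate human hearing
HAPTIC_LOW, HAPTIC_HIGH = 50.0, 250.0        # Hz, skin is most sensitive near 250 Hz

def audible_to_haptic(freq_hz: float) -> float:
    """Map an audible frequency onto the haptic range on a log scale,
    since pitch perception is roughly logarithmic."""
    freq_hz = min(max(freq_hz, AUDIBLE_LOW), AUDIBLE_HIGH)
    # Position of the pitch within the audible range, 0.0-1.0 (log scale)
    t = (math.log(freq_hz) - math.log(AUDIBLE_LOW)) / (
        math.log(AUDIBLE_HIGH) - math.log(AUDIBLE_LOW))
    return HAPTIC_LOW + t * (HAPTIC_HIGH - HAPTIC_LOW)

print(audible_to_haptic(20))      # lowest audible -> 50.0
print(audible_to_haptic(20_000))  # highest audible -> 250.0
print(audible_to_haptic(440))     # concert A lands in the lower-middle of the range
```

The log scale means an octave of pitch always covers the same slice of the vibration range, which keeps low and high sounds equally distinguishable on the skin.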

Brainstorm and sketch out initial concepts for the physical artifacts, mapping out how each technique can achieve the desired sensory effects.

Initial Brainstorming

I started out by mapping my initial ideas and thoughts to get a sense of what ideas I wanted to explore. I focused on how sound can be represented with touch, replicating auditory-tactile synesthesia. It was easier for me to find the purpose of what I wanted to create first, so I knew exactly who my audience was and the potential uses of my technology. Since I was focusing on the absence of sound, I could focus on the hearing impaired as well as the general public. I thought it would be insightful to compare what both groups could gain from my design, and utilize that for the overall takeaway experience.

My second sense will most likely be visual since auditory-tactile synesthesia is very closely related to associations with color. I'm still a little confused about how the two senses layer together to create an elevated experience, but because sound is being represented in two different senses, it creates a more nuanced understanding of the associations we hold with sound. I'm not sure if each sense should be separate for two different groups of people, or if everyone has the same experience and it can become a group experience. Given my research on synesthesia and how it requires top-down processing, I wanted to play with the idea of "associations" and prior expectations because it replicates the mental processing of people with synesthesia.


Idea 1

I wanted to dig deeper into having the physical space/environment become a reflection of the human body. That is, how can the body map out to the environment using sensors, cameras, and other technology? In this way, the interaction that happens in the body individually is directly related to what is happening in the environment. For this idea, I wanted to utilize modules that could attach to the body and the physical environment. The ones on the body would have motors that form vibrations, while the ones in the environment would have built-in microphones. Therefore, any sound captured by the microphone would translate into vibrations onto the body. I wanted the modules to be customizable in the sense that you could choose where to place them on your body, as well as interact with anything in the environment—this could be a physical object that produces sound such as an instrument, a person, or a general area that amplifies sound from interactions with other objects. For this idea, the visual sense could potentially be incorporated by having the modules light up with different colors and intensities of light.
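
A rough sketch of the microphone-to-vibration logic for these modules might look like the following. The hardware values are assumptions for illustration (a 10-bit microphone reading of 0–1023, as on a typical Arduino analog pin, and an 8-bit PWM motor signal of 0–255), and the noise floor and smoothing factor are placeholder choices:

```python
def mic_to_pwm(sample: int, floor: int = 40) -> int:
    """Map a raw microphone reading (0-1023, assumed ADC range) to a
    motor PWM duty cycle (0-255), ignoring readings below a noise floor."""
    if sample < floor:
        return 0
    # Rescale the usable range onto the full PWM range
    return min(255, (sample - floor) * 255 // (1023 - floor))

def smooth(prev: float, new: int, alpha: float = 0.3) -> float:
    """Exponential moving average so the vibration ramps rather than jumps."""
    return (1 - alpha) * prev + alpha * new

level = 0.0
for sample in [10, 200, 800, 1023, 30]:   # fake microphone readings
    level = smooth(level, mic_to_pwm(sample))
    print(round(level))
```

The smoothing step matters for the body: without it, every transient spike in the room would hit the skin as a jolt rather than a swell.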

Further things to explore for this idea:

  • How would other senses be isolated?
  • How many modules could be placed on the body at once?
  • How can this experience be collaborative?

Idea 2

Building off the previous idea, I thought about bringing in an interactive component, such as having users predict what sounds may feel like. Using different levels of pressure and textures instead of vibrations, the tactile input can be used as a communicative tool to understand other users. I wanted to have a tangible device that inputs different hand pressures and rhythms to represent the vibrations, similar to an instrument. It could convey certain feelings and moods based on the hand pressures and beats. I took inspiration from a drum, where you can utilize your hands to interact. Since there are two sides to a drum, one could act as the input and the other could act as the output to potentially share with others. The output could even emit light, but other users could place their hands on it to feel the input.

It could essentially serve as a communicative device to have a conversation without speaking or hearing, fully requiring touch.

Further things to explore for this idea:

  • Is it about an existing sound in the environment, or do you have to produce the sound?
  • How does it associate sound with location?

Idea 3

For my last idea, I wanted to think about how the interaction itself could leave a trace on the physical environment, especially for others to see and interact with. I referenced a few science experiments that used touch to understand sound, such as the use of balloons to feel vibrations and amplify touch. For this to work, it would be most effective to have a silent environment to pick up the smaller ambient sounds. I think I gravitated towards a controlled environment for this idea, where physical artifacts could be placed around the room that represent different sounds that you could feel, perhaps even making the sounds yourself. I still need to figure out how this technology could potentially work.


Talking with Luca, I realized I'd been so focused on a collaborative experience with multiple people that I hadn't focused enough on the individual experience. There should be different levels of interaction, so I need to think about the micro-interactions. Maybe the individual experience focuses on touch, diminishing sound in the physical environment to focus on the ambient or smaller sounds picked up by the microphone in the module. When another person comes in, it forces engagement with louder and more distracting noises. The vibrational response between the two could be clearly different, where you're forced to experiment with what sounds are picked up.

Also, considering how this experience could be relevant for all people and not just those who are hard of hearing, why would everyone want to use this device? I’m conflicted on whether I should remove/diminish the aspect of hearing or focus on the tangible and visual response of sound in the environment as a learning experience of the associations we make with certain sounds.

My talk with Haeyoung also ended with the same conclusion—that I needed to clearly differentiate the individual and two-person experiences. I was thinking too much from a group perspective when the max would probably be two people. Haeyoung also advised me to think about having the interaction on a smaller scale, as it doesn't necessarily need to be in a large environment.


Based on this feedback, I went back to the drawing board a little and tried to think broadly using my research. There’s definitely more that can be done with sound and touch, but I think my attempt at eliminating sound was limiting my ideas. For instance, I could consider using the perception of sound at different distances.

Some intriguing ideas that came to mind were something like 8D audio, where you could feel the sound. If there were something you could move around the body, you could experience the sound at different levels based on direction and proximity to its origin. Another is using color filters to transform our sight. For example, overlaying a red color filter over our eyes makes it harder to pick out red objects in the room.

While I was looking into technologies that isolate sound such as noise barriers and sound baffles, I randomly thought of two images:

1 — A seashell. There's a "hidden" sound located inside of the object that seems to match the location it came from. I thought this was an interesting observation of how sound is trapped inside objects. The hard, curved surfaces reflect sound waves, so ambient noise resonates inside the shell, producing the ocean-like sound we hear.

2 — A stethoscope. The doctor's instrument for listening to sounds coming from inside the body. Similar to the seashell, there are "hidden" sounds that aren't easily perceivable from a distance or without technology.

I then thought about how we associate memories with our senses, with sound being one of them. Another consideration would be being able to hear various materials.

I decided to focus on one sense at a time, with the first being sound. Rather than eliminating sound to focus on touch, I thought about how sound could exist because of the interaction. For example, using a stethoscope-like device, users could hear “hidden” sounds. These sounds could be those that trigger memory or be associated with materiality (ie. hear water moving inside a water bottle or ticking of a clock). Similar to this article, we pick up sounds we can’t normally hear. The two-person interaction could bring in the sense of touch, where the sounds are now felt. Sounds can be combined to create vibrations and something like proximity can play a role in how strong the vibration is.
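
For the idea that proximity plays a role in how strong the vibration is, one simple model to prototype against would be inverse-square falloff, mirroring how sound intensity decays with distance. The near limit and strength scale here are assumptions, not measurements:

```python
def vibration_strength(distance_m: float, max_strength: float = 1.0,
                       near_limit: float = 0.1) -> float:
    """Scale vibration strength by inverse-square distance, clamped so the
    strength caps out at max_strength once you are within near_limit metres."""
    d = max(distance_m, near_limit)
    return max_strength * (near_limit / d) ** 2

print(vibration_strength(0.05))  # inside the near limit -> full strength 1.0
print(vibration_strength(0.2))   # twice the near limit -> quarter strength
print(vibration_strength(1.0))   # a metre away -> about 1% of full strength
```

The clamp at the near limit avoids the strength blowing up as the sound source touches the sensor, which would otherwise make the closest interactions indistinguishable.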

I was also influenced by a few other articles I came across in my research that focused on the relationship between sound and touch.

Essentially, sound enhances the processing of the sense of touch. When either is triggered individually, the same regions of the brain activate, so we are automatically thinking about the second sense. In this way, the interaction could happen in the opposite direction, where touch comes first and sound comes second. Things could become clearer with sound, though we rely on both to understand what something is. For this interaction, I want to focus on the idea that "a sensation of touch can be produced or changed by sound, even if we are not actually being touched". The addition of sound can change our perception of touch as we interact with objects.


Lupton/Lipps Part 2

I think this reading was particularly helpful in thinking about our current projects from new perspectives. Although we're only looking to alter two senses for this project, thinking about all of the senses when designing other experiences is crucial to forming an immersive, memorable user experience. It's also important to think about the reactions of people in our experience, whether bodily or mental. They can react through feelings, emotions, and even the triggering of certain memories. The last page about color also reminded me of Olafur Eliasson's "Your Blind Passenger", which obstructs the user's vision but utilizes color as a wayfinding element.

Beyond Multi-sensory

I enjoyed how this article pointed out the senses beyond the five we normally consider:

  • thermoception (sense of temperature)
  • nociception (pain)
  • equilibrioception (balance)
  • proprioception (body awareness)

We engage with our environment in multiple ways to make sense of the world, so the experiences we design can actively play a role in how we may control a user’s thoughts or emotions. As this article states, I think many of the “immersive” experiences that are commercially successful lack a larger social component for users to learn from the experience. I would be interested in creating an experience that has the potential to be more user-driven so that everyone comes out with their own personal experience.


I eventually took some time to go back to researching and brainstorming a list of ideas. I landed on playing with sound and visuals, based on a few articles I read about spatial stereotypes we hold in terms of sound.

I was especially interested in how visuals may play a role in sound, as pitch and color can both be mapped along a frequency spectrum.

Brainstorming ideas


I decided to do some super basic prototyping to test out the light's shape and size by cutting holes in paper cups and placing my phone's flashlight inside to create a makeshift flashlight. I knew I definitely wanted the shape to be wider than it is tall. I took inspiration from a barcode scanner to create a wide rectangular shape. The more circular shape feels less like the user would move in a horizontal or vertical direction because there are no constraints in the shape. The four-sided nature of a rectangle would be more likely to prevent someone from rotating the light source.

Aimee’s very P drawings

I also consulted Aimee, who is in products, about what the actual device's shape would look like. I wanted a form that points towards the front but also motions in a way that people would feel inclined to move it vertically up and down. We came up with a few examples of existing devices to work off of: a scanner, a gun, a hair dryer. While I want to make a hand-held device, I also thought of objects that we lift up and down, such as a weight, as a reference:

I sketched out a couple of different options for the shape, considering the placement of features such as a button to turn it on/off, a grip, the scanner, the speaker, and a visual element that shows how high or low you are holding the device to give a visual benchmark. I thought of a thermometer for visual similarity, but I wanted to place it so that you could look at it in front of you rather than on top.

Form Considerations

After discussing the overall shape in class with Haeyoung, I used Rhino to create the 3D model of the device. I went through a few different iterations and landed on the final one (pictured below). I had a hard time figuring out how to extrude the way I wanted the shapes to look since I hadn’t been on Rhino in a while. Also, working with curves was just painful in so many ways.



I definitely think that the visual sense impacts us more at the moment, but we remember past experiences better from the other senses like smell. I’d like to think that it's because of the strong associations we make in terms of our emotional response—for example, pleasant smells induce positive emotions as was said in the reading. I thought it was also interesting how the reading mentioned responses based on frequency, rather than different types of sound. While this specific reading spoke mostly of smell, I would be interested in seeing if the same elements could apply to sound—can sound do anything to provide a sense of place and be used in spatial judgments? Would certain pitches trigger people to think a room is more pleasant from formed associations?

An Exploration of Sensory Design

I actually found and read this article when I was doing my initial research, which is where I found inspiration for some of my ideas. I especially took an interest in how the visual and auditory senses relate as a result of this article because the experience can be altered after the visual sense is introduced into an auditory experience (McGurk and Ventriloquism effects). It’s also important to note that because visual cues are so strong in terms of giving us information, simply giving cues with the other senses could lead to different interpretations of the cue. For this reason, you can’t expect people to react in a certain way with complete certainty.


Physical Prototyping

Arduino was a pretty smooth process—I knew exactly what I needed to do and had experience working with most of the parts (the exception was the button and speaker but they were relatively easy to figure out with a few Google searches). I do wish I knew how to correctly build everything though because I had so many wires and I still do not know what a resistor does, so I am very glad I did not break anything in the process (except one ESP32 but we do not speak about it).

Working Demo


I also tested out a couple of different materials for the rectangular cut-out on the front for the light. I initially envisioned a semi-transparent material, but quickly realized that all of them (even the thinnest tracing paper) would diffuse the light, and I wasn't able to achieve the rectangular shape projection I wanted. Instead, I opted to laser cut a small opening out of a fully opaque, hard material—this ended up working best because the opening was small enough that you couldn't see the behind-the-scenes Arduino parts inside the device, while the projected shape was still large enough to allow for layering in a two-person experience.

Testing out different materials for the front: corrugated plastic, posterboard, no cover, tracing paper, milk jug, laser cut with hole


Designing For All Five Senses

Designing for a multisensory experience is definitely important for brands—we hold so many associations, beyond just visuals, that help us identify what products do. The feedback we get also helps us understand that a product is working correctly. Just as senses help us make sense of the world through emotions, we can also believe things are stronger or work more effectively through the senses. Even though this project is more about creating an experience than a product for a brand, we can see how designing for the senses is applicable to the real world.

Yes, But Why

Olafur Eliasson’s focus on recreating natural phenomena is exactly why anyone can understand and partake in his installations, especially in this era where we are often out of touch with the natural world. It’s especially notable that he designed the experiences so that people could leave with a sense of reflection and understanding. They’re given an answer to the “so what?” question of why this experience was important and how they may carry what they learned in the real world. His works also reflect simplicity and a sense of control for the participants. Everyone interacts with it in their own interpretation of what is presented to them, which makes it interesting to observe how everyone reacts and takes in the experience.