Understanding Conceptual Frameworks for Tangible Interaction

Because tangible user interfaces and tangible interaction are still a very new field of research, I feel the terms are used interchangeably and in vastly different contexts. As a student just starting to learn about the field, I have felt this confusion myself. Moreover, every time I look at a tangible interface project I am impressed and intrigued, but also thrown off, not knowing how to process or analyze it. A solid conceptual framework that can "unpack why tangible interaction works so well for the users" (Hornecker & Buur) is therefore necessary. This post is a study note to myself; I will summarize four different conceptual frameworks.


Reality-Based Interaction: A Framework for Post-WIMP Interfaces (2008) by Jacob, Girouard, Hirshfield, Horn, Shaer, Solovey, and Zigelbaum

The notion of Reality-Based Interaction (RBI) is a unifying concept that ties together different interaction styles, including newly emerging HCI techniques. New interaction styles are diverging from the traditional WIMP ("windows, icons, menus, pointing device") and direct-manipulation interfaces. With advances in computer technology, post-WIMP interaction styles such as virtual reality, augmented reality, tangible interaction, and mobile interaction are becoming more and more pervasive. The paper suggests that these seemingly distinct technologies share an underlying commonality: the new interaction styles build on users' pre-existing knowledge of reality, linking the real, non-digital world with the digital one. Post-WIMP interfaces can therefore be tied together and analyzed through the proposed RBI framework.

How users interact with computers and the digital world has evolved from typing command lines to WIMP, and now to post-WIMP interaction techniques. This evolution of HCI techniques has moved "interfaces closer to real world interaction…increas[ing] the realism of interface objects and allow[ing] users to interact even more directly with them." Here, the term real world refers to the physical, non-digital world. The RBI framework is based on the following four themes from the real world, which play a prominent role in the new interaction styles:

  • Naive Physics (common sense knowledge about the physical world): new interaction styles commonly "simulate or directly use properties of the physical world," such as gravity, friction, and velocity (see the sketch after this list)
  • Body Awareness & Skills (awareness of one's own physical body and coordination): skills such as understanding the relative position of one's body and coordinating movements commonly play an important role in whole-body interaction
  • Environment Awareness & Skills (a sense of one's surroundings and the skills to navigate and manipulate within them): natural cues such as depth and sense of direction facilitate our spatial understanding; these cues can be applied to interaction styles that immerse users in a new "environment," such as VR
  • Social Awareness & Skills (awareness of others and the skills to interact with them): since many emerging interaction techniques exploit social awareness and co-located collaboration, users' skills for social interaction, verbal and non-verbal communication, and collaboration are considered
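
To make the Naive Physics theme concrete, here is a minimal sketch (my own illustration, not from the paper) of inertial "flick" scrolling, where simulated velocity and friction make a touch interface feel like a pushed physical object:

    # Hypothetical sketch: inertial "flick" scrolling, in which simulated
    # velocity and friction (naive physics) carry the content after a gesture.
    def scroll_positions(start, flick_velocity, friction=0.95, dt=0.016):
        """Yield scroll offsets after a flick until the motion dies out."""
        position, velocity = start, flick_velocity
        while abs(velocity) > 0.5:       # stop once the movement is imperceptible
            position += velocity * dt    # velocity moves the content each frame
            velocity *= friction         # friction gradually slows it down
            yield position

    # Example: a flick of 1200 px/s decelerates to a stop over a few seconds.
    for offset in scroll_positions(start=0.0, flick_velocity=1200.0):
        print(round(offset, 1))

Because the deceleration mimics friction, users can predict how far the content will travel by drawing on everyday physical intuition rather than learned interface rules.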

The paper also suggests that this trend toward RBI is encouraging, as it reduces the mental effort required to learn and use new systems. However, a useful interface will rarely just mimic the real world; it will also include unrealistic features and commands, since the great power of computer interaction is precisely to go beyond the real world. Relying on realism alone therefore involves tradeoffs against qualities such as the following:

  • Expressive Power: the functionality of a system; designers may give up some realism so that the interface stays simple yet expressive
  • Efficiency: depending on the design target (expert vs. novice users), efficiency may matter more than realism
  • Versatility: the system can be used to perform a variety of tasks
  • Ergonomics: avoiding fatigue and repetitive stress injuries
  • Accessibility: strict realism may prevent some users from interacting at all
  • Practicality: cost, technological limitations, size, etc.

Getting a Grip on Tangible Interaction: A Framework on Physical Space and Social Interaction (2006) by Hornecker and Buur

One single rigid definition cannot fully encompass the "broad range of systems and interfaces" that tangible interaction involves. Instead, Hornecker and Buur share the most commonly discussed views on tangible interaction:

  • Data-centered View: “utilizing physical representation and manipulation of digital data” (pursued in CS and HCI)
  • Expressive-Movement-centered View: “the bodily interaction with objects, exploiting the ‘sensory richness and action potential of physical objects’” (pursued in Industrial Product Design)
  • Space-centered View: "combining physical space and objects with digital displays or sound installations;" full-body interaction and using the body as an interaction device are common features (influenced by Arts and Architecture)

While scholars agree that TUIs are well suited for collaborative scenarios, many previous attempts at conceptualization (such as the views above) focus only on defining terms, categorizing systems, and evaluating for individual use. As emphasized in Dourish's book on embodied interaction, Where the Action Is, it is important to understand the social and cultural context of interaction. Hornecker and Buur therefore point out this gap and argue the need for a framework for "analyzing and understanding the social aspects of tangible interaction and design knowledge on how to design so as to support social interaction and collaboration." They propose a framework for tangible interaction structured around four themes: Tangible Manipulation, Spatial Interaction, Embodied Facilitation, and Expressive Representation. By offering themes to focus on rather than a rigid classification scheme, the framework provides different perspectives that can highlight different aspects of the same object.

  • Tangible Manipulation: the "bodily interaction with physical objects." Areas to focus on are Haptic Direct Manipulation (the tactile appeal), Lightweight Interaction (modularity of the procedure and rapid feedback), and Isomorph Effects (the metaphoric relationship between actions and their effects)
  • Spatial Interaction: tangible interaction is physically embedded in space and exploits "intuitive human spatial skills;" "spatiality" is an inherent property of tangible interfaces and actions. Concepts to focus on are Inhabited Space (the meeting point of objects and people), Configurable Materials (the effects of shifting and configuring the space), Non-Fragmented Visibility (providing visual references and involving everyone), Full-Body Interaction (the whole body as an interactive device), and Performative Action (communicating through body movement, performativity)
  • Embodied Facilitation: because tangible interaction involves physical space, it is important to analyze how a given structure "facilitates, prohibits and hinders some actions, allowing, directing and limiting behavior." Focus points are Embodied Constraints (the physical set-up and its effect on constraining user behavior), Multiple Access Points (engaging all users and allowing them into the interaction), and Tailored Representation (offering cognitive access, inviting use)
  • Expressive Representation: tangible interaction is a physical representation of digital functions and data. Aspects to focus on are Representational Significance (meaningful physical and digital representations), Externalization (aiding cognition and speech as props), and Perceived Coupling (a clear and natural link between action and effect)

A Taxonomy for and Analysis of Tangible Interfaces (2004) by Fishkin

To address the difficulty of comparing and contrasting different research approaches to Tangible User Interfaces (TUIs) under previous definitions and taxonomies, Fishkin suggests a new taxonomy that uses "embodiment" and "metaphor" as its two axes.

First, Fishkin begins by defining TUIs. Ishii and Ullmer define TUIs as user interfaces that "augment the real physical world by coupling digital information to everyday physical objects and environments." Building on this definition, Fishkin characterizes TUIs in an even broader scope, suggesting that a TUI is any system with the following interaction sequence (sketched in code after the list):

  1. "some input event occurs"
  2. “computer system senses and alters its state”
  3. “feedback/output event via change in the physical nature.”
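
As a rough illustration of this sequence (my own sketch, not code from the paper), consider a hypothetical tangible brick whose tilt is sensed and whose LED changes in response; the brick, its tilt sensor, and its LED are invented stand-ins:

    # Hypothetical sketch of the three-step TUI sequence described above.
    import random

    def read_tilt_sensor():
        # Stand-in for real hardware: pretend to read a tilt angle in degrees.
        return random.uniform(-45, 45)

    def set_led_color(color):
        # Stand-in for real hardware: the physical feedback channel.
        print(f"LED is now {color}")

    def tui_loop(steps=5):
        for _ in range(steps):
            tilt = read_tilt_sensor()                       # 1. an input event occurs
            state = "active" if abs(tilt) > 10 else "idle"  # 2. the system senses it and alters its state
            set_led_color("green" if state == "active" else "off")  # 3. feedback via a physical change

    tui_loop()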

Although Fishkin's definition solves the problem of narrow definitions that exclude some TUI systems, it is so broad that it offers little focus or analytical clarity. Fishkin therefore adds a two-dimensional taxonomy with embodiment and metaphor as its dimensions: "the higher the levels of these attributes in a system, the more tangible it is."

Embodiment: the tie between the input and the output; there are four levels:

  • Full: the state of the system is fully embodied in the device itself; the output device is the input device (e.g., sculpting clay: the user pushes the clay and sees the result in the same clay)
  • Nearby: the output takes place near, or directly proximate to, the input object (e.g., physical brushes augmenting a tabletop display)
  • Environmental: the output is "around" the user (typically audio); the output is non-graspable and only loosely tied to the input
  • Distant: the output is "over there," on another screen or in another room (e.g., a TV remote control)

Applications may span different levels of embodiment; for example, a toy called the Platypus Amoeba makes sounds (environmental) but also changes its surface color (full). Fishkin points out that as embodiment increases, the "cognitive distance" between the input mechanism and the output decreases. The embodiment level chosen when designing a TUI can thus reflect the nature of the task.

Metaphor: the analogy a system invokes between the user's action and its real-world equivalent. Fishkin distinguishes metaphors of noun and of verb, which combine into five levels:

  • None: no metaphor or real-world analogy (like command-line input); there is no link between the look or manipulation of the input and the output
  • Metaphor of Noun: an analogy is made to the physical shape, look, or sound of the object
  • Metaphor of Verb: an analogy is made to the act being performed
  • Noun and Verb: a related noun and verb pair with an appeal to analogy (e.g., a "drag-and-drop" interface: dropping a virtual file into a virtual wastebasket "is like" dropping a physical file into a physical bin)
  • Full: the virtual system IS the physical system, so no analogy needs to be made (e.g., a pen computer: the stylus on the document IS altering the document)
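
To make the two axes concrete, here is a small sketch (mine, not the paper's) that records a few example systems as points on the embodiment/metaphor grid; the placements reflect my own reading of the descriptions above:

    # Hypothetical sketch: Fishkin's two axes as enums, with example
    # placements that follow my own reading of the descriptions above.
    from enum import Enum, auto

    class Embodiment(Enum):
        DISTANT = auto()
        ENVIRONMENTAL = auto()
        NEARBY = auto()
        FULL = auto()

    class Metaphor(Enum):
        NONE = auto()
        NOUN = auto()
        VERB = auto()
        NOUN_AND_VERB = auto()
        FULL = auto()

    examples = {
        "TV remote control": (Embodiment.DISTANT, Metaphor.NONE),
        "drag-and-drop to a GUI wastebasket": (Embodiment.DISTANT, Metaphor.NOUN_AND_VERB),
        "sculpting clay": (Embodiment.FULL, Metaphor.FULL),
    }

    for system, (emb, met) in examples.items():
        print(f"{system}: embodiment={emb.name}, metaphor={met.name}")

By Fishkin's rule that higher levels on both axes mean a more tangible interface, the clay-sculpting entry is the most tangible of the three.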

There are many aspects to consider when designing metaphors, at cultural, psychological, industrial, cognitive, and even philosophical levels.
