Tangible User Interfaces: Past, Present, and Future Directions

By Orit Shaer and Eva Hornecker

Apurva

--

“We live in a complex world, filled with myriad objects, tools, toys, and people. Our lives are spent in diverse interaction with this environment. Yet, for the most part, our computing takes place sitting in front of, and staring at, a single glowing screen attached to an array of buttons and a mouse.”

For a long time, it seemed as if the human–computer interface was to be limited to working on a desktop computer, using a mouse and a keyboard to interact with windows, icons, menus, and pointers (WIMP). Tangible User Interfaces (TUIs) are an emerging post-WIMP interface type concerned with providing tangible representations of digital information and controls, allowing users to quite literally grasp data with their hands.

Origins of TUI

  1. The development of the notion of a “tangible interface” is closely tied to the initial motivation for Augmented Reality and Ubiquitous Computing.
  2. In 1995, Fitzmaurice et al. introduced the notion of a Graspable Interface, where graspable handles are used to manipulate digital objects. Their aim was to increase the directness and manipulability of graphical user interfaces. A block is anchored to a graphical object on the monitor by placing it on top of it; moving and rotating the block makes the graphical object move in synchrony. Placing two blocks on two corners of an object activates a zoom, as the two corners are dragged along with the blocks. This allowed for the kinds of two-handed or two-fingered interactions that we nowadays know from multi-touch surfaces.
  3. Ishii and his students presented the more comprehensive vision of Tangible Bits in 1997. The aim was to make bits directly accessible and manipulable, using the real world as both display and medium for manipulation – the entire world could become an interface. Ambient displays, on the other hand, would represent information through sound, light, air, or water movement.
  4. TUI = Graspable objects + Ambient Media
  5. The change of term from graspable to tangible seems deliberate. Whereas “graspable” emphasizes the ability to manually manipulate objects, the meaning of “tangible” encompasses “realness/sureness”, being able to be touched as well as the action of touching, which includes multisensory perception:

“GUIs fall short of embracing the richness of human senses and skills people have developed through a lifetime of interaction with the physical world. Our attempt is to change ‘painted bits’ into ‘tangible bits’ by taking advantage of multiple senses and the multimodality of human interactions with the real world. We believe the use of graspable objects and ambient media will lead us to a much richer multi-sensory experience of digital information.”
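The two-block manipulation described for Graspable Interfaces can be sketched in a few lines: given the old and new positions of two tracked blocks, a single similarity transform (translate, rotate, uniformly scale) is applied to the anchored graphic. This is an illustrative sketch using complex-number arithmetic, with made-up function names; it is not code from Fitzmaurice et al.

```python
def two_handle_transform(a0, b0, a1, b1):
    """Similarity transform (rotation + uniform scale + translation)
    mapping old block positions (a0, b0) to new positions (a1, b1).
    Points are complex numbers: x + y*1j."""
    z = (b1 - a1) / (b0 - a0)  # combined rotation and scale factor
    t = a1 - z * a0            # translation
    return z, t

def apply_transform(z, t, p):
    """Apply the transform to one point of the anchored graphic."""
    return z * p + t

# Blocks sit on two corners of a unit square; dragging the second
# block from (1, 0) to (2, 0) doubles the square (the "zoom").
z, t = two_handle_transform(0 + 0j, 1 + 0j, 0 + 0j, 2 + 0j)
print(apply_transform(z, t, 1 + 1j))  # the opposite corner follows: (2+2j)
```

By the same construction, moving both blocks together translates the graphic, and rotating one block around the other rotates it, which is why two physical handles suffice for translate, rotate, and zoom.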

Precursors of TUI

These early systems addressed issues in specific application domains such as architecture, product design, and educational technology.

  1. Slot Machine
  2. Urp
  3. Marbles answering machine
  4. Intelligent 3D modelling

Tangible Interfaces in a Broader Context

Related research areas:

  1. Tangible Augmented Reality:
    Tangible Augmented Reality (Tangible AR) interfaces combine tangible input with an augmented reality display or output.
    Ex. augmented books, tangible tiles
  2. Tangible Tabletop Interaction:
    Tangible tabletop interaction combines interaction techniques and technologies of interactive multi-touch surfaces and TUIs. Research in this field is starting to investigate the differences between pure touch-based interaction and tangible handles.
    Toolkit: reacTIVision; ex. the Reactable
  3. Ambient displays:
    Ambient displays were originally part of Ishii’s Tangible Bits vision but soon developed into a research area of their own. Blackwell et al. suggest that tangible objects can drift between the focus and the periphery of a user’s attention, and present an example of peripheral (and thus ambient) interaction with tangibles.
  4. Embodied User Interfaces:
    The idea of embodied user interfaces acknowledges that computation is becoming embedded and embodied in physical devices and appliances. Manual interaction with a device can thus become an integral part of using an integrated physical–virtual device, using its body as part of the interface:

“So, why can’t users manipulate devices in a variety of ways — squeeze, shake, flick, tilt — as an integral part of using them? (…) We want to take user interface design a step further by more tightly integrating the physical body of the device with the virtual contents inside and the graphical display of the content.”
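The kind of device-body interaction the quote describes can be sketched as a toy classifier over a window of accelerometer samples. All names and thresholds here are illustrative assumptions, not taken from the paper or from any real device API.

```python
import math

def classify(samples, shake_jitter=4.0, tilt_ratio=0.35):
    """Label a window of (x, y, z) accelerometer samples (m/s^2) as
    'shake', 'tilt-left'/'tilt-right', or 'rest'. Thresholds are
    made up for illustration."""
    mags = [math.sqrt(x*x + y*y + z*z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    if max(mags) - min(mags) > shake_jitter:   # large variation = shaking
        return "shake"
    x, _, _ = samples[-1]                      # sideways gravity = tilt
    if abs(x) / mean > tilt_ratio:
        return "tilt-right" if x > 0 else "tilt-left"
    return "rest"

resting = [(0.0, 0.0, 9.8)] * 8
tilted  = [(4.0, 0.0, 9.0)] * 8
shaken  = [(0.0, 0.0, 9.8), (5.0, 3.0, 15.0)] * 4
print(classify(resting), classify(tilted), classify(shaken))
```

The point of the sketch is that the device's physical body (its orientation and motion) becomes an input channel alongside, or instead of, on-screen controls.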

Unifying Perspectives

  1. Tangible Computing:
    It includes concepts of TUIs, Ubiquitous Computing, Augmented Reality, Reactive Rooms, and Context-Aware Devices.
    Tangible Computing covers three trends: distributing computation over many specialized and networked devices in the environment, augmenting the everyday world computationally so that it is able to react to the user, and enabling users to interact by manipulating physical objects.
    The concepts share three characteristics:
    • no single locus of control or interaction. Instead of just one input device, there is a coordinated interplay of different devices and objects;
    • no enforced sequentiality (order of actions) and no modal interaction; and
    • the design of interface objects makes intentional use of affordances which guide the user in how to interact.
    Embedding computation in the environment creates embodied interaction: it is socially and physically situated. The term tangible computing emphasizes the material manifestation of the interface (this is where tangible interfaces go the farthest) and the embedding of computing in the environment.
  2. Tangible Interaction:
    It focuses on the expressiveness and meaning of bodily movement, and less on the physical device employed in generating this movement or the “data” being manipulated.
    The tangible interface definition “using physical objects to represent and manipulate digital data” is identified as a data-centered view because this phrasing indicates that data is the starting point for design. The expressive-movement view, in contrast, focuses on bodily movement, rich expression and physical skill, and starts design by thinking about the interactions and actions involved. In the arts, a space-centered view is more dominant, emphasizing interactive and reactive spaces where computing and tangible elements are means to an end and the spectator’s body movement can become an integral part of an art installation.
    It focuses on the user experience and interaction with a system. As an encompassing perspective it emphasizes tangibility and materiality, physical embodiment of data, bodily interaction, and the embedding of systems in real spaces and contexts.
  3. Reality-Based Interaction:
    This notion encompasses a broad range of interaction styles, including virtual reality, augmented reality, ubiquitous and pervasive computing, handheld interaction, and tangible interaction.
    Jacob et al. identified four themes of interaction with the real world that are typically leveraged (see Figure 3.2):
    Naïve Physics: the common sense knowledge people have about the physical world.
    Body Awareness and Skills: the awareness people have of their own physical bodies and their skills of controlling and coordinating their bodies.
    Environment Awareness and Skills: the sense of surroundings people have for their environment and their skills of manipulating and navigating their environment.
    Social Awareness and Skills: the awareness people have that other people share their environment, their skills of interacting with each other verbally or nonverbally, and their ability to work together to accomplish a common goal.
    To date, most TUIs rely mainly on users’ understanding of naïve physics, simple body awareness and skills such as grasping and manipulating physical objects, and basic social skills such as sharing physical objects and keeping users’ actions visible. The RBI framework highlights new directions for TUI research, such as drawing on a much richer vocabulary of body awareness and skills and leveraging environment awareness and skills.

Application Domains:

  1. TUIs for Learning
  2. Problem Solving and Planning
  3. Information Visualization
  4. Tangible Programming
  5. Entertainment, Play, and Edutainment
  6. Music and Performance
  7. Social Communication
  8. Tangible Reminders and Tags

Framework and Taxonomies

A framework can be characterized as providing a conceptual structure for thinking through a problem or application. Thus, frameworks can inform and guide design and analysis.

Taxonomies are a specific type of framework that classify entities according to their properties, ideally unambiguously.

  1. Graspable user interfaces (By Fitzmaurice in 1996)
    Fitzmaurice defined a graspable user interface as providing a “physical handle to a virtual function where the physical handle serves as a dedicated functional manipulator”. Users have “concurrent access to multiple, specialized input devices which can serve as dedicated physical interface widgets” and afford physical manipulation and spatial arrangement.
    a) Properties:
    • space-multiplexing,
    • concurrent access and manipulation (often involving two handed interaction),
    • use of strong-specific devices (instead of weak-general, that is generic and non-iconic),
    • spatial awareness of the devices, and
    • spatial reconfigurability.
    b) Counter-arguments:
    - Does a system need to be spatially aware under all circumstances, or is it sufficient if the user keeps to certain rules?
    - Is an iconic or symbolic physical form a core requirement for a graspable interface?
    - What if the application area is intrinsically abstract and does not lend itself to iconic representations?
    - Should concurrent manipulation always be feasible?
    - How do we distinguish between the system concept and its technical implementation?
  2. The MCRpd/MCRit Interaction Model (initial conceptualization by Ullmer and Ishii in 2001):

• tangible objects are coupled via computerized functionality with digital data (computational coupling);
• the tangible objects represent the means of interactive control; moving and manipulating objects is the dominant form of control;
• the tangible objects are perceptually coupled with digitally produced representations (e.g., audio and visuals); and
• the state of the tangible objects embodies core aspects of the entire system’s state (representational significance); the system is thus at least partially legible even if power is cut.
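The four couplings can be made concrete in a small sketch. The class and method names (`Token`, `TangibleSystem`, etc.) are illustrative assumptions, not part of the MCRpd/MCRit model itself.

```python
from dataclasses import dataclass, field

@dataclass
class Token:
    """A tangible object coupled with digital data (computational
    coupling); its physical position is both control and state."""
    label: str
    position: tuple          # (x, y) on the interactive surface
    data: dict = field(default_factory=dict)

class TangibleSystem:
    def __init__(self):
        self.tokens = []

    def place(self, token):
        """A physical action (placing the object) is the interactive control."""
        self.tokens.append(token)
        self._render(token)

    def move(self, token, position):
        token.position = position
        self._render(token)

    def _render(self, token):
        # Perceptual coupling: digital feedback (e.g., projected
        # graphics or audio) co-located with the object.
        print(f"project feedback at {token.position} for {token.label}")

    def state(self):
        """Representational significance: the arrangement of tokens
        *is* the core system state, and stays legible if power is cut."""
        return {t.label: t.position for t in self.tokens}

surface = TangibleSystem()
building = Token("building-A", (2, 3))
surface.place(building)
surface.move(building, (5, 1))
```

Note that `state()` merely reads back the token arrangement: the physical layout, not a hidden internal model, carries the system's state.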

3. Types of TUI (Ullmer and Ishii, 2005)

Interactive Surfaces: Frequently, tangible objects are placed and manipulated on planar surfaces. The spatial arrangement of objects and/or their relations (e.g., the order of placement) can be interpreted by the system. Ex. Urp

Constructive Assembly: Modular, connectable elements are attached to each other, modeled on physical construction kits. Both the spatial organization and the order of actions might be interpreted by the system. Ex. the intelligent 3D modeling toolkits by Aish, BlockJam, and Topobo

Token+Constraint systems: combine two types of physical–digital objects. Constraints provide structure (stacks, slots, racks) that limits the positioning and movement of tokens mechanically and can assist the user by providing tactile guidance. The constraints can express and enforce the interaction syntax. Ex. the Marble Answering Machine and the Slot Machine.
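A token+constraint structure can be sketched as a rack whose slots mechanically restrict what can be placed where; the slot order then carries the interaction syntax. `Rack` and its methods are hypothetical names for illustration, not from any of the cited systems.

```python
class Rack:
    """A row of slots that only accepts one kind of token, so the
    physical structure itself enforces the interaction syntax."""
    def __init__(self, accepts, size):
        self.accepts = accepts
        self.slots = [None] * size

    def insert(self, index, kind, token):
        if kind != self.accepts:
            # A foreign token simply does not fit the constraint.
            raise ValueError(f"this rack only accepts {self.accepts} tokens")
        if self.slots[index] is not None:
            raise ValueError("slot already occupied")
        self.slots[index] = token

    def sequence(self):
        """Left-to-right slot order is the sequence the user composed,
        e.g. the playback order of marbles in an answering machine."""
        return [t for t in self.slots if t is not None]
```

The rejection in `insert` is the software analogue of a marble not fitting a square hole: the syntax check happens "for free" in the physical design.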

“Good design should thus employ successful spatial mappings, unify input and output spaces, and enable trial-and-error activity.” – Sharlin et al.

Spatial TUI — Sharlin et al
Non- Spatial TUI — Blackwell et al

Frameworks:
In 1999, Holmquist et al. categorized tangible objects into tokens, containers, and tools. Containers are generic objects that can be associated with any type of digital information and are typically used to move information between platforms. Tokens are physical objects that resemble the information they represent in some way, and thus are closely tied to it; they are typically used to access information. Finally, tools are used to actively manipulate digital information, usually by representing some kind of computational function.

Refining this classification, Ullmer and Ishii in 2001 classified containers and tools as subsets of tokens: “token” becomes a generic term for any tangible object coupled with digital information, with containers and tools as subtypes. This terminology has the advantage of allowing different semantic levels of meaning for one object.

They also categorized the digital information associated with physical objects into: static media (e.g., images); dynamic media (e.g., videos); digital attributes (e.g., color); computational operations and applications; simple and complex data structures; and remote people, places, and things.
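The refined taxonomy can be read as a small type hierarchy. The class names follow the terminology from the text, while the methods are illustrative assumptions.

```python
class Token:
    """Generic tangible object coupled with digital information;
    in the 2001 refinement, containers and tools are subtypes."""
    def __init__(self, info=None):
        self.info = info

class Container(Token):
    """Generic carrier: can be bound to any digital information,
    typically to move it between devices or platforms."""
    def fill(self, info):
        self.info = info

class Tool(Token):
    """Stands for a computational function that actively
    manipulates other digital information."""
    def __init__(self, fn):
        super().__init__()
        self.fn = fn

    def apply_to(self, token):
        return self.fn(token.info)

# A container carrying a media clip list, and a tool that counts its items.
clip = Container()
clip.fill([10, 20, 30])
print(Tool(len).apply_to(clip))  # 3
```

Making `Container` and `Tool` subclasses of `Token` mirrors the point about semantic levels: every container or tool is still a token, and one object can play either role depending on context.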

--