Critique of the cat

On the 29th of February we presented our final prototype in the screen-based part of the course.

My partner and I had developed a program intended to evoke peace of mind through interaction with it. This resulted in a simple one-pixel program modeled on the interaction with a purring cat. Caressing the device (a tablet) in a gentle way would lead to a pulsing expression.

The program can be tried here (only in Google Chrome):

http://csiv.dk/dia/PeaceOfMind/

Depending on screen size and mouse sensitivity, the reaction may be easier or harder to provoke, but that will be explained in depth later on.
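
As a rough illustration of why that is, consider the following sketch of gesture detection based on pointer speed. It is an assumed, simplified example rather than our actual code (the event handling and the threshold values are invented for the illustration), but it shows how a speed band measured in raw screen pixels makes the reaction sensitive to screen size and pointer settings.

// Hypothetical illustration (not our actual code) of stroke detection based on pointer
// speed. Speed is measured in raw screen pixels per millisecond, so the same physical
// gesture can land inside or outside the "gentle" band depending on screen size and
// mouse sensitivity.

const MIN_SPEED = 0.05; // px/ms: slower counts as resting, not stroking (assumed value)
const MAX_SPEED = 0.6;  // px/ms: faster counts as agitation, not caressing (assumed value)

let lastX = 0;
let lastY = 0;
let lastTime = 0;
let gentle = false; // true while the movement is judged to be a gentle stroke

window.addEventListener("pointermove", (e: PointerEvent) => {
  const now = e.timeStamp;
  const dt = now - lastTime;
  if (dt > 0 && lastTime > 0) {
    const distance = Math.hypot(e.clientX - lastX, e.clientY - lastY);
    const speed = distance / dt; // pixels per millisecond
    gentle = speed >= MIN_SPEED && speed <= MAX_SPEED;
  }
  lastX = e.clientX;
  lastY = e.clientY;
  lastTime = now;
});

On a small tablet screen the same physical caress covers fewer pixels per millisecond than on a large desktop monitor, which is one plausible reason why thresholds that felt right to us felt wrong to others.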

Zoomorphism

Choosing a cat as a model for the interaction introduces certain challenges that have to be addressed. Certainly a one-pixel expression is very different from a cat, but how can we then claim that the cat has been our model? According to Djajadiningrat, Matthews & Stienstra (2007, p. 669) a frequent assumption is that “[…] for products to have an emotional expression they must […] resemble humans or animals in their bodily configuration or appearance”. However, “it is the motion kinematics and not the featural properties that are responsible for perceptual animacy” (Djajadiningrat, Matthews & Stienstra, 2007, p. 669).

While the form, the warmth and the way you place your finger are all different, it is possible to mimic the interaction attributes of a real cat. We are thus able to create an expression that is appropriate to the context of our project, without the connotations that would come with implementing physical cat features in the project.

The presentation — how it went wrong and how that was right

The description of the prototype thus far has naturally been somewhat idealized. However, during the presentation not everything went as smoothly as planned.

Although we had checked that everything was in working order before the presentation, we were suddenly unable to get a reaction from the program during it. We started by letting the other groups stroke the prototype, but nothing happened, which caused a little frustration and a little panic in us. We noticed that the others were not stroking the way we did: they stroked faster or slower, with shorter strokes, or in other areas of the screen.

We took over, trying to show how it was supposed to be done. Still no effect. We ended up using a computer instead to showcase the program, which for some reason worked.

After the first presentation we sat down and discussed what was wrong. We looked through the code, uploaded it again, and suddenly it started working. At the next presentation we were able to get it to work, but nobody else could.

While creating the prototype we had continuously experimented with the interaction, building up a technique in tandem with the development of the program. By the time we were finished we had developed a sophisticated mental model (Norman, 1988) of the prototype that enabled us to engage with it in the correct way. But the lack of feedforward and feedback (Wensveen, Djajadiningrat & Overbeeke, 2004) made it difficult for other users to work out the correct way of engaging with it. It might have been the same problem that rendered us unable to get it to work during the first presentation.

While this is an unforeseen consequence of the constraints we put into the program, it is also part of creating a certain feeling in the interaction. Getting the program to react should not be totally effortless; a little difficulty also increases the rewarding feeling when the reaction does happen. It is a difficult balancing act, but requiring a certain amount of finesse in the interaction is part of creating the soothing experience we strive for.

Emergent behavior

After looking closely at our own interaction with the prototype, we also found an emergent behavior that we hadn’t planned for, but that arose from the particular constraints we had built into the program. We saw that those who interacted “skillfully” with the prototype were aligning their movement with the rhythm of the sine curve. This was not only an effective technique for getting the program to react, it also created a feeling of being in sync with the program. This synchronization is not required by the program: moving your hands out of time with the color changes has no effect on the technical side of the interaction, but we could see that it most certainly had an effect on the interaction gestalt (Lim, Stolterman, Jung & Donaldson, 2007).
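
To make the point concrete, here is an assumed sketch (not our actual implementation) of how the pulse could be driven by a free-running sine curve: the phase advances with the clock and ignores the user entirely, while stroking only raises or lowers the amplitude. Under that kind of construction, being in or out of time with the color changes makes no technical difference at all.

// Hypothetical sketch (assumed, not our actual implementation) of a free-running sine
// pulse. The phase depends only on wall-clock time; the stroke detection merely raises
// or lowers the amplitude of the pulse, so synchronizing with the rhythm has no
// technical effect.

const PULSE_PERIOD_MS = 2000; // assumed length of one "purr" cycle
let amplitude = 0;            // rises while stroking is judged gentle, decays otherwise
let gentle = false;           // would be set by the stroke detection sketched earlier

function render(now: number): void {
  const phase = (2 * Math.PI * now) / PULSE_PERIOD_MS; // independent of any input
  const brightness = 0.5 + 0.5 * Math.sin(phase);      // oscillates between 0 and 1

  amplitude = gentle ? Math.min(1, amplitude + 0.02) : Math.max(0, amplitude - 0.01);

  const value = Math.round(255 * brightness * amplitude);
  document.body.style.backgroundColor = `rgb(${value}, ${value}, ${value})`;

  requestAnimationFrame(render);
}

requestAnimationFrame(render);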

Despite its shortcomings in terms of learning how to interact with it, we did manage to shape the interaction gestalt of our prototype in a way that required a certain type of slow, rhythmical interaction within a certain context. Whether it instilled peace of mind in the user is hard to evaluate, but assuming that the user knew the rules of the prototype, it did in our experience lead to an interaction not unlike that of stroking a cat or caressing a loved one.

References

Djajadiningrat, T., Matthews, B., & Stienstra, M. (2007). Easy doesn’t do it: skill and expression in tangible aesthetics. Personal and Ubiquitous Computing, 11(8), 657–676.

Lim, Y. K., Stolterman, E., Jung, H., & Donaldson, J. (2007, August). Interaction gestalt and the design of aesthetic interactions. In Proceedings of the 2007 conference on Designing pleasurable products and interfaces (pp. 239–254). ACM.

Norman, D. A. (1988). The psychology of everyday things. Basic Books.

Wensveen, S. A., Djajadiningrat, J. P., & Overbeeke, C. J. (2004, August). Interaction frogger: a design framework to couple action and function through feedback and feedforward. In Proceedings of the 5th conference on Designing interactive systems: processes, practices, methods, and techniques (pp. 177–184). ACM.
