Telesthetic Prehension: Plant Pots

Maria S · Published in The UX Happenings · Feb 21, 2021

Design a digital and physical object that prehends its surroundings

Group members 🤝: Sanya Nayar, Rikkie Xiong, Kate Chernysheva and Maria Shuttleworth

<<< For my previous project on Telesthetic Prehension, click here.

The initial idea.

As I described in my previous post, we wanted to explore projection mapping to make the emotions of our plant pots visible.

The next step was for each of us to assign a face to our pots that could express the object’s feelings clearly to humans. We each drew one, along with a few different facial expressions to show different states.

Plant pots and faces (Rikkie: Left, Kate: Middle, Sanya: Right)

I animated a few of these and projected them onto my wall, as well as my plant pot (which I stacked on top of a suitcase and a box). Eventually, the plan was to project everyone's plant pots next to my own and have them interact somehow.

A quick sketch I did of how the project would look.

We thought it would be interesting to have the eye follow you around and change its emotion as you interact with it. We also thought about connecting it to a weather app so that it would change states depending on the weather, but this was too difficult to do in the time that we had.

In addition to time restrictions, I discovered that a similar product has already been created. Lua (Mu Design, 2021) is a plant pot that realises its plant's prehensions and communicates them to the user via a screen. It has five senses: moisture, sun exposure, temperature, movement and QR code reading, and it uses motion tracking to follow movement with its eyes. It is also networked via an app.

My plant pot blinking.

So, we decided to stray away from this path and focus more on the interactions of the pot with its neighbouring pots. We wanted to create a “plant pot server” where our pots joined up and interacted. The idea was that the things we did to the plant pots would cause them to change emotion.

Next steps.

We discussed a timeline for the video, with plant pots arriving at different times to join my pot. Each pot has different moods that are affected by what we do to it in real life. We planned for certain interactions to happen with each plant, e.g. Rikkie waters hers, and I shine a laser on mine.

These mood changes were based on an interview Kate held with her mum (an avid plant keeper). She talked about what the plant pots might feel, as opposed to just the plant:

  • Annoyed when the plant is watered, as the pot wants to stay dry.
  • Anxious and in pain when the plant grows.
  • Confident when being painted.
  • Fearful when feeling unwanted, thrown away, or kept in the dark.
  • Concentrating when the plant's roots are growing, as the pot needs to hold them together and stop them from bursting through.

Thursday feedback.

Our first prototype.

After presenting our ideas, we got some feedback. First of all, we were told that the faces were a bit too explicit, leaving less room for interpretation. Tyler said that we should make room for closure, i.e. the “space between panels”, referencing Understanding Comics (McCloud, 1998). This made us think about alternative ways to represent emotions and changes of state.

Tyler also mentioned that exploring ways of matching the prehensions between the pot and plant would really elevate our project, e.g. needing a specific prehending pot for a specific plant, matching ways of being in the world to new outcomes.

Finally, they liked our idea of networking the plant pots, but thought that exploring how the pots communicate with one another should be the next step.

So we went back to the drawing board. We wanted to express the emotions in a more abstract way: colours and patterns.

Kate animated some patterns to help further express the emotions:

Animated patterns (Credit: Kate).

Rikkie then looked into the colours we would use following Plutchik's colour wheel (Donaldson, 2017), and Sanya looked into music to represent each mood.

Research into colour (Credit: Rikkie)
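We never wrote any code for this, but as a rough illustration of the idea, the lookup from a pot's mood to a Plutchik emotion and an approximate wheel colour might look something like the sketch below. The pairings and hex values are my own approximations, not Rikkie's final palette:

```python
# Illustrative mapping from pot moods to Plutchik primary emotions and
# approximate wheel colours (Donaldson, 2017). Hex values are guesses,
# not the palette we actually projected.
MOOD_COLOURS = {
    "lonely":        ("sadness",      "#4575b4"),  # blue
    "happy":         ("joy",          "#ffd92f"),  # yellow
    "annoyed":       ("anger",        "#d73027"),  # red
    "anxious":       ("fear",         "#1a9850"),  # green
    "confident":     ("trust",        "#91cf60"),  # light green
    "fearful":       ("fear",         "#1a9850"),
    "concentrating": ("anticipation", "#fc8d59"),  # orange
}

def colour_for(mood: str) -> str:
    """Return the projection colour for a mood; fall back to white if unknown."""
    return MOOD_COLOURS.get(mood, ("unknown", "#ffffff"))[1]

print(colour_for("lonely"))   # -> "#4575b4"
```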

Then, following the timeline of events we had decided upon, I combined all of these elements into a one-minute video to be projected onto my plant pot and the wall around it.

The final version of our project that I put together in After Effects using the elements everyone had collectively created.

In the video you see my pot starting off alone, blue and lonely. Kate's pot joins and mine becomes happy, which is reflected in Kate's pot too. Different emotions are caused by actions we do in real life, e.g. Rikkie's pot was sad because she was watering it on screen. Each emotion was then picked up by the neighbouring pots based on proximity.

In the above video you can also see our group to the right of the frame, performing actions on our plants in real time to affect the plant pots.
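None of this behaviour was coded for the actual piece, which I animated by hand in After Effects, but as a thought experiment the "plant pot server" logic we acted out could be sketched roughly like this. Pot, ACTION_MOODS and spread_moods are invented names, and the 1D positions and influence radius are assumptions:

```python
# A hypothetical sketch of the "plant pot server": real-world actions change a
# pot's mood, and nearby pots pick up their neighbours' emotions by proximity.
from dataclasses import dataclass

# Loosely based on the mood triggers from Kate's interview; "laser" is a guess.
ACTION_MOODS = {
    "water": "annoyed",
    "paint": "confident",
    "keep_in_dark": "fearful",
    "laser": "surprised",
}

INFLUENCE_RADIUS = 1.0  # how close a pot must be (arbitrary units) to be influenced

@dataclass
class Pot:
    name: str
    position: float   # 1D position along the projection wall
    mood: str = "lonely"

    def apply_action(self, action: str) -> None:
        """A real-world action changes this pot's mood directly."""
        self.mood = ACTION_MOODS.get(action, self.mood)

def spread_moods(pots: list[Pot]) -> None:
    """Each pot adopts the mood of its nearest neighbour within the radius."""
    snapshot = [(p.position, p.mood) for p in pots]
    for pot in pots:
        neighbours = [
            (abs(pos - pot.position), mood)
            for pos, mood in snapshot
            if pos != pot.position and abs(pos - pot.position) <= INFLUENCE_RADIUS
        ]
        if neighbours:
            pot.mood = min(neighbours)[1]  # closest neighbour's mood wins

# Mirrors the opening of the video: my lonely pot picks up Kate's happiness
# once her pot arrives nearby.
mine = Pot("Maria", position=0.0)
kates = Pot("Kate", position=0.8, mood="happy")
spread_moods([mine, kates])
print(mine.mood)   # -> "happy"
```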

Feedback.

The group liked our decision to go with the mixture of colours and lines over the faces, as they felt it added an extra level of mystery. They also agreed that we focused well on the pot and didn't seem to have projected any extra emotion of our own onto it.

However, people agreed that we should have made it more detailed, for example by explaining why the mood of each pot influences the others. The human-pot relationship could also have been explored further, digging deeper into the changes in state that indicate prehension.

John noted that we ended up designing a system of things, and that if we had explored this further our project would have been more insightful, e.g. we designed a system for “flows of matter”: the flow of matter between the pot, soil and plant.

If I could continue this project, I would want to explore the flow-of-matter system idea further. I think the connections and prehensions within this system would be interesting to discover more about. Even after several revisions, our idea felt too vague, and this would be the perfect direction to take to streamline the experience.

For the next project on Macro UX, click here >>>

References.

  • Akiyama, K., 2017. Digital Animism: Through Looking at the Meaning within Noise | Interactive Architecture Lab. [online] Interactivearchitecture.org. Available at: <http://www.interactivearchitecture.org/digital-animism-through-looking-at-the-meaning-within-noise.html> [Accessed 6 February 2021].
  • Donaldson, M., 2017. Plutchik’s Wheel Of Emotions — 2017 Update • Six Seconds. [online] Six Seconds. Available at: <http://www.6seconds.org/2017/04/27/plutchiks-model-of-emotions/>.
  • Lindley, J., Akmal, H. and Coulton, P., 2019. Design Research and Object-Oriented Ontology. Open Philosophy, 3(1), pp.11–41.
  • McCloud, S., 1998. Understanding comics: The invisible art. IEEE Transactions on Professional Communications, 41(1), pp.64–71.
  • Mu Design. 2021. Lua — Mu Design. [online] Available at: <https://mu-design.lu/lua#in-more-detail> [Accessed 6 February 2021].
