Telesthetic Prehension (Part 2 of 2) — Collaborative Unit

Sanya Thapar
Sanya Nayar
Mar 24, 2021

In the first leg of the project, we found inanimate objects to study their responsiveness to external stimuli and observe how they prehended their surroundings through the lens of animistic faith. We then conducted artefact analysis for these objects and personified them with characteristic attributes based on the Myers–Briggs Type Indicator (MBTI).

Team: Huijie Xiong (Rikkie), Kate Chernysheva, Maria Shuttleworth and Sanya Nayar

From top-left to bottom-right: Maria, Sanya, Kate and Rikkie.

The big question that arose afterwards was: how to make members of this network of prehending objects commune telesthetically?

It needed to be addressed in an intelligent way given that all of us in the team were located remotely and could, at best, use video conferencing tools to stay connected. The challenge thus lay in finding a method to manipulate physical objects across each other’s screens. At this point, the virtual world would mingle with the physical world.

I came up with the idea of using projection mapping on the objects. That meant one person could send out light signals from their screen, which could be projected onto the object inside another person’s screen.

In the tutorial on Tuesday, we discussed our progress with John. He liked our concept of animism and the effort of analysing objects in detail. He also encouraged the direction of pursuing projection mapping for telesthetic communication, and suggested we look up madmapper.com, which offers software to project video and light onto objects easily using a projector.
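MadMapper handles the warping interactively, but the core idea behind this kind of projection mapping is simple: map a rectangular video frame onto a quad traced over the physical object, which reduces to estimating a planar homography. A minimal sketch using the direct linear transform (the corner coordinates below are hypothetical):

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # H is the null vector of this system: take the right singular vector
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def warp_point(H, x, y):
    """Apply H to a 2D point, with the homogeneous divide."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# A unit-square video frame warped onto a quad drawn over a plant pot
# in camera coordinates (made-up values).
frame = [(0, 0), (1, 0), (1, 1), (0, 1)]
quad = [(120, 80), (400, 95), (380, 310), (140, 290)]
H = homography(frame, quad)
```

Once H is known, every pixel of the animation can be warped through it so the image lands on the object's face rather than the wall behind it.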

We started experimenting with different patterns and alignments of projections. Maria tried a few using the software while I made a prism that produced a holographic image or video from a customisable base file of a specific format.

Trials by Maria using MadMapper.
Experimentation with holographic type of projection by me.

We redefined our concept by choosing a network of only one kind of object, and decided collectively that it would be a plant pot, since one was within everyone’s reach.

These pots behaved in different ways depending on how they were treated. We all found plant-pots in our homes (I bought one, as I had none before) and made faces for them expressing emotions similar to humans’ (for example, ☻ for happy; ☹ for sad). These were animated by Maria using Procreate and later converted into projection files using MadMapper.

Some faces illustrated by Kate and Rikkie.
I hand-drew these facial expressions and scanned them.
Animations rendered by Maria.

We then moved on from the idea of projecting light from one screen onto an object inside another, to having all the (virtual) pots displayed side by side in one place (on Maria’s room wall) while each of us performed activities at our end that would influence our pot. Due to technological limitations, we resorted to recording the animation script beforehand, which meant the resulting impact was fabricated rather than real-time.

In our interim presentation on Thursday, we received two extremely helpful suggestions:

1.) Alaistair remarked that the relationship of each individual pot with the plant it carried deserved more attention.

2.) John brought to our notice that the faces used on the pots were too representational, leaving little room for interpretation. We needed to mix abstraction into the expressions.

Acting on both pieces of advice, we made the following changes for our final presentation:

The relationship of each pot with the plant it carried became a stimulus evoking different emotions in the pot’s mind/heart. We developed storyboards illustrating these and came up with pot personas.

1.) Storyboard by Maria; 2.) Storyboard by Rikkie.
Storyboarding done for my Primula plant-pot.

Secondly, we focussed on an alternative representation for the plant-pots’ emotions. I suddenly remembered the movie Avatar, which showed organisms communicating with each other through electrochemical signal exchanges taking place between their roots. After a bit of research, I found that this fiction was close to the truth.

“All the trees are connected to each other through underground fungal networks. Trees share water and nutrients through the networks, and also use them to communicate. Scientists call these mycorrhizal networks.” — Peter Wohlleben, German forester

We extrapolated this knowledge to communication between the pots of these plants.

We created patterns depicting wave-frequency motion to represent these electrochemical exchanges. Since each pattern had a characteristic frequency, we found a corresponding sound for it; sounds, in turn, have colours associated with them.

All in all, an interlinked model of frequency-sound-colour was created to portray plant-pots’ emotions.
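The interlinked model can be sketched as a chain: each emotion picks a wave frequency, the frequency sounds a tone, and the same frequency is scaled into a hue to pick the aura colour. All frequencies, ranges, and emotion names below are hypothetical placeholders, not the values we used:

```python
import math

# Hypothetical emotion-to-frequency assignments (Hz).
EMOTION_FREQ = {"sadness": 220.0, "calm": 330.0, "anger": 660.0, "joy": 880.0}
FREQ_LO, FREQ_HI = 200.0, 900.0  # assumed range of the network's signals

def freq_to_hue(freq):
    """Map a frequency in [FREQ_LO, FREQ_HI] linearly onto a hue in [0, 360)."""
    t = (freq - FREQ_LO) / (FREQ_HI - FREQ_LO)
    return min(max(t, 0.0), 1.0) * 359.9

def tone(freq, duration=0.25, rate=8000):
    """Sample the sine wave that sounds the emotion's signal."""
    n = int(duration * rate)
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

# One emotion, three faces of the same signal: frequency, sound, colour.
joy_hue = freq_to_hue(EMOTION_FREQ["joy"])
joy_tone = tone(EMOTION_FREQ["joy"])
```

The point of chaining everything off one number is consistency: a pot's wave pattern, its tone, and its aura colour can never disagree about which emotion is being expressed.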

We also looked into Robert Plutchik’s wheel of emotions to learn about the interrelationship of colours and emotions.

Plutchik’s wheel of emotion and an example of how colours affect emotions. Images retrieved from https://www.wikiart.org/en/edvard-munch/death-in-the-sickroom-1893
Wave animations created by Kate.

We thought that the effect of proximity could be shown by plant-pots prehending each other’s auras. This added another layer of interactivity.
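One simple way to render this prehension, sketched below with made-up colours and a made-up closeness scale, is to let a neighbouring pot's aura bleed into a pot's own colour by linear RGB interpolation as the two draw nearer:

```python
def blend_aura(rgb_a, rgb_b, closeness):
    """Tint aura rgb_a toward rgb_b. closeness in [0, 1]: 0 leaves
    rgb_a untouched, 1 meets rgb_b halfway (a pot never fully loses
    its own hue)."""
    t = min(max(closeness, 0.0), 1.0) * 0.5
    return tuple(round(a + (b - a) * t) for a, b in zip(rgb_a, rgb_b))

# Hypothetical aura colours for two pots.
joy_yellow = (255, 215, 0)
sad_blue = (30, 60, 200)

far_apart = blend_aura(joy_yellow, sad_blue, 0.0)   # keeps its own colour
side_by_side = blend_aura(joy_yellow, sad_blue, 1.0)  # visibly tinted
```

Capping the interpolation at the halfway point is one possible design choice: neighbours influence each other without either identity being erased.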

Aura colour transitions in accordance with Plutchik’s model, created by Rikkie.
Spotify playlist of songs for plant-pot emotions created by me. The music was played in the animation background.
Alert-notification signages designed by Kate.

REELING IN REAL-TIME

For the final presentation on Friday, we recorded a 1-minute animation video showing the interaction interwoven between the four plant-pots. We watched closely so as to begin performing the moment our respective pots entered the screen.

Scenography:

First entrant, Henry — Maria’s plant-pot, shown to be lonely. 😔 She cast laser light in realtime.

Second entrant, Xia — Rikkie’s plant-pot, shown to be happy. 😁 She watered her plant in realtime.

Third entrant, Morgan’s Beauty — Kate’s plant-pot, shown to be angry. 😠 She pretended to cut the plant’s leaves in realtime.

Fourth entrant, Oorvi — Sanya’s plant-pot, shown to be calm. 😇 I cleaned my pot with a cloth and surface cleaner in realtime.

The auras kept spreading in between the transitions, ending in a big yellow circle to show that all’s well that ends well!

Video recording of the final presentation © Maria.

FEEDBACK

Our classmates responded really well to the changes we made regarding the abstraction of emotions.

John appreciated this semiotic generosity that brought a balance between abstraction and figuration.

Alaistair made a very critical point that had not crossed our minds earlier: we should also show the case where the relationship between plant and pot is neutral.

Ines commented that the pots prehending their relationship with the plants in addition to each others’ presence was a very positive development.

However, David said that he would have liked to see in detail how the plants had influenced the pots.

On the whole, mise-en-scène of the Zoom setup was much appreciated by everyone.

BIBLIOGRAPHY

Falik, O., Mordoch, Y., Ben-Natan, D., Vanunu, M., Goldstein, O. and Novoplansky, A. (2012) ‘Plant responsiveness to root-root communication of stress cues’, Annals of Botany, 110(2), pp. 271–280. doi:10.1093/aob/mcs045.

Harvey, G. (2013) The Handbook of Contemporary Animism. London: Routledge.

Plutchik, R. (1982) ‘A psychoevolutionary theory of emotions’, Social Science Information, 21(4–5), pp. 529–553. doi:10.1177/053901882021004003.

Sutton, T.M. and Altarriba, J. (2016) ‘Color associations to emotion and emotion-laden words: A collection of norms for stimulus construction and selection’, Behavior Research Methods, 48, pp. 686–728. doi:10.3758/s13428-015-0598-8.


MA User Experience Design at the University of the Arts London