Expanding our physical intelligence in digital human form
This summer I had a lovely week of fusing movement data at the 14th Choreographic Coding Lab (CCL), hosted by Motion Bank and AΦE Lab in Chatham, UK. During that week I started to turn my ideas about the human-digital form into my visual creation practice. There are many digital humans on the market that we already know, such as gamer streaming avatars, AI-generated online support agents, and animated social media influencers. I live with a generation whose emotions are triggered by these non-human things. As a Gen Z digital artist, I wonder: how close, and how deep, are we to replacing human connection with technology?
Physical intelligence is something that every human inherently possesses, for each of us constantly thinks with and through our bodies. We are all physical experts. I contemplated how my mind, body, and movement interact in the digital human using a choreographic thinking method. Rebecca Bassett-Graham from Studio Wayne McGregor gave us insights into the Choreographic Thinking Tools (CTT), Mind and Movement. This method helped me expand the frontiers of physical intelligence.
I started by assigning a particle dancer as my starting point, then relocated the particle's relationship from a standard object to a lighting source. I played with scale to make the scenes read more clearly when the visuals are projected on a real stage, and deconstructed the particles into fireworks with real-time keyboard triggering. I explored ways to articulate my visuals through different visual languages. Last but not least, I added some eye-candy animated elements to the landscape.
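The keyboard-triggered firework idea above can be sketched as a tiny particle system. This is a minimal illustration in plain Python, not the actual lab code: the function names (`burst`, `step`) and parameters are my own assumptions, and in practice a tool like TouchDesigner or Unity would handle rendering.

```python
import math
import random

def burst(origin, n=50, speed=2.0, rng=random):
    """Spawn n particles at origin with random outward velocities (the 'firework')."""
    particles = []
    for _ in range(n):
        theta = rng.uniform(0.0, 2.0 * math.pi)  # azimuth angle
        phi = rng.uniform(0.0, math.pi)          # inclination angle
        vx = speed * math.sin(phi) * math.cos(theta)
        vy = speed * math.sin(phi) * math.sin(theta)
        vz = speed * math.cos(phi)
        particles.append({"pos": list(origin), "vel": [vx, vy, vz]})
    return particles

def step(particles, dt=0.016, gravity=-9.8):
    """Advance one frame: apply gravity on the y axis, then integrate positions."""
    for p in particles:
        p["vel"][1] += gravity * dt
        for i in range(3):
            p["pos"][i] += p["vel"][i] * dt

# A key-press callback in the host application would call burst() like this:
fireworks = burst((0.0, 1.5, 0.0), n=100, rng=random.Random(42))
for _ in range(30):  # simulate roughly half a second at ~60 fps
    step(fireworks)
```

Each key press spawns a new burst at the dancer's position; gravity then pulls the glowing particles back down, which is what gives the deconstructed-firework look.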
Going through this process made me recall my 3D animator training and the Bauhaus stage workshop during my BA degree at Shih Chien University. Both methods deepen our intimacy with the extended body and technology. As digital creators, it is important that we are able to live and dance in the digital realm with our imagination.
Our facilitators Daniel Bisig and David Kern introduced us to Piecemaker, a database of dance sequences and notations. Inspired by the idea of the dancers' self-awareness, I imported the mocap data (FBX) of one of their participants, Amber Pansters, into my particle human form. Watching her talk about her motion capture data in Piecemaker was quite surreal. I want to highlight the weirdness and the latency in this project: how relationships and space become distorted along the data path.
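A rough sketch of how mocap joints can drive a particle human form, and how the latency I wanted to highlight can be made deliberate, might look like this. Everything here is my own illustrative assumption: the real FBX file would first be decoded with a tool such as Blender's Python API, so I simply assume each frame arrives as a dict of joint names mapped to 3D positions.

```python
import collections
import random

def emit_from_joints(frame, per_joint=5, jitter=0.02, rng=random):
    """Scatter glowing particles around every joint of one mocap frame.

    frame: dict mapping joint name -> (x, y, z), assumed already
    extracted from the FBX file by an external tool.
    """
    particles = []
    for joint, (x, y, z) in frame.items():
        for _ in range(per_joint):
            particles.append((
                x + rng.uniform(-jitter, jitter),
                y + rng.uniform(-jitter, jitter),
                z + rng.uniform(-jitter, jitter),
            ))
    return particles

class LatencyBuffer:
    """Hold frames back for a fixed count, making the data path's delay visible."""
    def __init__(self, delay_frames=12):
        self.queue = collections.deque()
        self.delay = delay_frames

    def push(self, frame):
        self.queue.append(frame)
        if len(self.queue) > self.delay:
            return self.queue.popleft()  # the frame finally 'arrives'
        return None                      # still in transit along the data path

# Usage: feed live frames in; render only what the buffer releases.
buf = LatencyBuffer(delay_frames=2)
frame = {"head": (0.0, 1.7, 0.0), "hand_l": (-0.4, 1.2, 0.1)}
delayed = buf.push(frame)  # None until enough frames have queued up
```

Rendering the buffer's delayed output instead of the live stream turns the latency from a flaw into a visible choreographic material, which is the distortion of space and relationship I wanted to foreground.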
This personal work is made up of glowing particles, embedded with opportunities to activate self-directed dance. Here is a picture of how we tested it on stage with the talented dancers Lenataa Goka and Raianna Brown in real-time mocap suits.
Whether we can expand our physical intelligence in digital human form is still to be ascertained. But this summer AΦE Lab gathered many amazing people who believe the potential is exhilarating and enormous!
Credits:
CCL Facilitators:
- Daniel Bisig and Scott deLahunta (Centre for Dance Research and Motion Bank)
- Mark Coniglio and Eni Brandner (Troika Ranch)
- Nick Rothwell

Hosts:
AΦE Lab
Aoi Nakamura & Esteban Lecoq

Participant Artists:
Alison Costa
Chelsi Cocking, Lenataa Goka & Raianna Brown
Chelly Jin
Clemence Debaig
Dhanush Giridhar
Georgica Pettus
Irini Kalaitzidi & Stathis Doganis
Isabel Sun
John Lucy
Max Dovey
Nirav Beni
Rebecca Evans & Simon East (& Flowfal)
Sofia Kovalenko
Tim Murray-Browne

Observer/Participants:
Tom Scarborough
Tove Grimstad Bang