AI & Culture Festival: Droid Dress Installation

Collaborative work by Cathryn Ploehn & Christopher Costes

COVID and the safety measures taken to flatten the curve have changed how we occupy and share space. Places like galleries, festivals, and classrooms that once depended on a person’s physical presence are learning to adapt to purely digital spaces.

This shift brings hardship, but also an opportunity to reimagine how people interact with these spaces: digital space can become the primary site of interaction. The tools for building and navigating such spaces are still being written, and the possibilities are wide open, with organizations such as LikeLike (http://likelike.org/shows/) offering superb examples of reimagined digital spaces. Now, when physically meeting is no longer possible, these digital spaces and the tools used to make them are more vital than ever. To explore this space further, we’ve created a project called “Droid Dress,” which looks at how machine learning can enrich digital spaces.

In Droid Dress, we are experimenting with machine learning (ML) tools that position the body as a controller in digital spaces. We’ve used ML in the browser (via the PoseNet model from TensorFlow.js) to create digital costumes for everyone attending the AI and Culture Festival hosted by Molly Steenson’s AI and Culture class. At the digital costume party, people will enter and put on a “costume” of their favorite AI, fictional or real; the costumes range from Marvin the Paranoid Android to Brainiac and Siri. The AI costumes were drawn from a wide breadth of historic and current AIs, as well as from requests by those attending the AI and Culture Festival.

To accomplish this, we are using the PoseNet model to detect the body via webcam, mapping an image of an AI costume to the movement of the head. As more people enter the space, participants will be aligned next to one another and see their ‘costumes’ react to their bodies’ movements.
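The head-tracking mapping described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: it assumes the keypoint format returned by the TensorFlow.js PoseNet model, and the helper name `costumeTransform` and the drawing step are ours for illustration.

```javascript
// PoseNet (from @tensorflow-models/posenet) returns poses shaped like:
//   { keypoints: [{ part: 'nose', position: { x, y }, score }, ...] }
// The helper below is a hypothetical sketch of the costume mapping.

// Compute where and how large to draw the costume image, anchored
// to the nose and scaled by the distance between the eyes.
function costumeTransform(pose, minScore = 0.5) {
  const byPart = {};
  for (const kp of pose.keypoints) byPart[kp.part] = kp;
  const nose = byPart['nose'];
  const leftEye = byPart['leftEye'];
  const rightEye = byPart['rightEye'];
  if (!nose || !leftEye || !rightEye) return null;
  if (nose.score < minScore || leftEye.score < minScore || rightEye.score < minScore) {
    return null; // head not confidently detected
  }
  const eyeDist = Math.hypot(
    rightEye.position.x - leftEye.position.x,
    rightEye.position.y - leftEye.position.y
  );
  return {
    x: nose.position.x,  // centre the costume on the nose
    y: nose.position.y,
    scale: eyeDist / 60, // assume ~60px between eyes = natural costume size
  };
}

// In the browser, this would be driven each frame by something like:
//   const net = await posenet.load();
//   const pose = await net.estimateSinglePose(videoElement);
//   const t = costumeTransform(pose);
//   if (t) drawCostume(ctx, costumeImage, t);  // drawCostume is hypothetical
```

Anchoring to the nose and scaling by eye distance keeps the costume glued to the head as a participant moves toward or away from the webcam; the confidence threshold simply hides the costume when the model loses track of the face.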

Droid Dress has already shown us ways to augment how we present ourselves, to obscure the actual body for privacy, and to create opportunities for play. Throughout the experience, we will continue to learn how machine learning can let us “embody” characters and enrich digital spaces.

--


Christopher Costes
The AI and Culture Festival 2020 | Curated by Carnegie Mellon students

Designer and writer; currently a Master’s candidate at CMU; formerly a service designer and product manager.