Kids teach AI a little humanity with Cognimates

By Stefania Druga, Personal Robots group

Intelligent toys and conversational agents are present in children’s homes, with more than 47.5 million adults in the US alone already using smart assistants such as Amazon’s Alexa. This raises questions about the impact of AI on children’s behavior. During my time in the Personal Robots group at the MIT Media Lab, my research focused on better understanding this generation of children growing up with AI, so as to protect and encourage their positive development. I created Cognimates, an open source AI literacy platform for children 7–14 years old. While schools and parents are starting to recognize coding as one of the required literacies for children, I believe it’s important to also introduce young people to the concepts of AI and machine learning through hands-on projects so they can make more informed and critical use of these technologies.

The Cognimates platform aims to achieve that by allowing children to program and customize embodied intelligent devices, such as Alexa and the smart robot Cozmo. Children can also use the platform to train their own AI models, learn how to build a game that gets better at playing Rock Paper Scissors with them over time, or create an installation where an entire room reacts to the way they describe their dreams. Cognimates builds on multiple parts (including the visual programming language) of the Scratch open-source platform, created by the Lifelong Kindergarten group at the MIT Media Lab. The main goal of the Cognimates platform is to extend coding to AI education and literacy.
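The Rock Paper Scissors idea can be illustrated with a minimal sketch (this is not the Cognimates implementation, which uses Scratch-style blocks and cloud services): an agent that counts the opponent’s past moves and plays the counter to their most frequent choice, so it gets better the longer it plays.

```python
from collections import Counter

# Which move beats which
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

class RPSLearner:
    """Learns the opponent's move distribution and counters their favorite move."""

    def __init__(self):
        self.history = Counter()

    def observe(self, opponent_move):
        # Record one opponent move
        self.history[opponent_move] += 1

    def next_move(self):
        if not self.history:
            return "rock"  # no data yet; any default works
        favorite, _ = self.history.most_common(1)[0]
        return BEATS[favorite]  # play whatever beats their most frequent move

agent = RPSLearner()
for move in ["rock", "rock", "scissors", "rock"]:
    agent.observe(move)
print(agent.next_move())  # opponent favors rock, so the agent plays paper
```

The same "observe, then counter the pattern" loop is what makes the game feel like it is learning from the child over time.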

First loves: tinkering, learning, and teaching
Like many Media Lab students my path has been far from linear, but the unifying thread has always been my love for learning driven by a profound curiosity. I was born in Maneciu-Ungureni, a small town in Transylvania, Romania. My mom is a teacher and my dad is an electrical engineer. Growing up I got to discover and share their passions. Together with my dad, I learned how to design and build anything, from the furniture in my room to stuff we repurposed from old car parts we bought at fairs. With my mom, I discovered how much a good teacher could touch the lives of her students. I got to see how my mom was the soul of the community in the small village where she was teaching. She would not only help students with their academic problems, but also listen to their personal struggles and assist them and their families at a moment’s notice. From an early age, I understood how much positive impact good teachers can have in their communities, and why it is important to connect and work with people at a personal level.

Hackidemia STEAM workshop for families in Singapore, 2014. Credit: Hackidemia

I would eventually go on to combine my mom’s love of teaching with my dad’s passion for hands-on tinkering in my life’s work. I started Hackidemia, a non-profit STEM education organization, in 2012. I’d earned my first master’s degree in Media Engineering for Education and worked on Google’s Search Quality Team in Dublin for a year before deciding I wanted to further pursue my passion for education and have a more immediate impact. I left Google and went to Cambodia to volunteer for four months in an orphanage outside the capital, Phnom Penh. There, I worked with kids of all ages, teaching them everything I could: math, English, literature, how to use computers and the Internet, photography, fixing things. The older children would then pass on their knowledge by teaching the younger children.

It was during this experience that I discovered how powerful it is to enable children to learn by teaching and to work on hands-on projects that are relevant in their local communities. After joining CRI, an interdisciplinary life-science research group at Paris Descartes University, I decided to start running similar workshops in French schools. While I was in the process of applying for a PhD at CRI, one of my friends told me about a summer program for social impact at NASA, called Singularity University (SU). At SU, I also met one of the people who inspired me the most in pursuing my dreams, roboticist and former astronaut Dan Barry. Dan tried to become an astronaut 10 times before he finally succeeded. When I met him, he was in charge of the hardware lab at SU and was still as joyful and as excited as a child every time he got the opportunity to hack on hardware.

From left to right: Stefania Druga, Dan Barry, Libby Falk at GSP12 SU Program. Credit: TJ Rak 2012

Dan encouraged me to start running hardware prototyping workshops, and taught me how to solder and program microcontrollers for the first time. At the end of the program, I got offered a very tempting job, which I ended up refusing in order to build my own organization. I still remember Dan’s advice: “Think where you want to be 10 years from now and make sure you dedicate every single day, every single minute to get there and not get distracted along the way.”

During the SU program, while we were learning about robotics, nano-fabrication, or synthetic biology, I couldn’t help but think how much I would have liked to learn about all of these technologies when I was younger. I decided to develop Hackidemia to introduce children to the most exciting technologies and research questions in an applied and fun way. With the help of the SU network of international students and mentors, we started running workshops and organizing STEM events all over the world. Four years later, we had 40 international Hackidemia chapters and several long-term projects under our belt, like Afrimakers and MakerCamp, and we had trained more than 400 mentors, 2,000 students, and 10,000 children. At this point, it became obvious to me that we were starting to max out our impact as a grassroots organization.

Hackidemia Workshop for children and teachers in Budapest 2015. Credit: Hackidemia 2015

Back to school: Same passions, new skills at the Media Lab
This was when I started to think about what I should do next in order to take my mission of changing the way children learn to the next level. I wanted to get guidance, learn new skills, and work with people who shared my values and drive. I knew about the Media Lab because I had used many Lab projects, like Makey-Makey and Scratch, in my workshops with kids, and I had even had the opportunity to visit and present Hackidemia to the Lifelong Kindergarten group before. I decided I wanted to work in a place where the research gets deployed in the real world and where people value and encourage interdisciplinary approaches.

I didn’t have a specific project in mind when I was accepted into the Media Lab’s master’s program, but I knew I wanted to keep working on designing new creative learning experiences for children and building tools to support that. During my first semester I took a wide range of classes, from “How to make almost anything” to “Human Machine Symbiosis.” I started building all sorts of weird and funky projects: a giant arcade, programmable body extensions, and a 5-axis cardboard foam-cutting machine, to name a few.

Example project for Poppy Ergo Jr robot that can be programmed by demonstration to draw with the Scratch Extension developed by the author. Credit: Stefania Druga

My first AI coding project
While looking for actuators for my body extensions, I came across an open source 3D printable robot, Poppy Ergo Jr., developed by the Flowers Group at Inria Bordeaux in France (http://www.poppy-project.org). I particularly liked how this robot had encoded actuators that could record and replay any movement. I immediately started imagining how children could teach such a robot by demonstration (e.g., teach it how to draw or move like a dog).

I decided to build a Scratch Extension for this robot. This was the first Scratch extension I built, in the beginning of 2017, together with my undergrad intern, Eesh Likhith. After building my first robotic extension, I thought it would be great if children could combine it with computer vision. The learning scenario I had in mind was that children would show an object to the robot’s webcam, and the robot would try to draw it based on the objects it already knows how to draw. To prototype this interaction, Eesh and I started work on a new Scratch extension for computer vision that used the public Clarifai API for image recognition. We documented and published these two extensions on the ScratchX platform.

Opening the door to AI education
My first robotic and object recognition coding extensions opened the door to AI education. I started to collaborate with Randi Williams, a graduate student from the Personal Robots group who was interested in preschool AI education. Professor Cynthia Breazeal has a long history of developing educational technologies for kids as well as social robot toolkits to help children learn about coding by teaching robots. Randi and I tried to find out who else was working in this field. After we discovered there were no other relevant studies, we decided to run a series of workshops and observe how children and parents interact with and perceive AI devices and toys. We analyzed and shared our findings in a series of papers and blog posts.

Example Cognimates Teach AI platform where children can train their own classifiers with images and text. Credit: Stefania Druga 2018

During this process I realized how important it is to demystify how AI technology works and enable children to position themselves as creators of AI, not just consumers.
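The “creators, not just consumers” idea is embodied in the Cognimates training tool, where children teach a model by giving it labeled examples. A tiny bag-of-words text classifier can sketch the principle (a simplified stand-in for the cloud services Cognimates actually wraps):

```python
from collections import Counter, defaultdict

class TinyTextClassifier:
    """Bag-of-words classifier: each label is scored by how often its training
    words appear in the new text. A toy stand-in for a real cloud classifier."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)

    def train(self, label, text):
        # A child "teaches" the model by giving it a labeled example
        self.word_counts[label].update(text.lower().split())

    def predict(self, text):
        words = text.lower().split()
        scores = {label: sum(counts[w] for w in words)
                  for label, counts in self.word_counts.items()}
        return max(scores, key=scores.get)

clf = TinyTextClassifier()
clf.train("happy", "i love this great sunny day")
clf.train("sad", "this rainy day is awful and gloomy")
print(clf.predict("what a great day"))  # predicts "happy"
```

Seeing that the prediction depends entirely on the examples they provided is what lets children position themselves as the teachers of the machine.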

Custom Vision Classifier created by children with Cognimates for playing Rock Paper Scissors. Credit: Stefania Druga

Creating Cognimates was the result of this preliminary research. I was joined in this project by an incredibly talented team of undergrad students: Sarah T. Vu, Tammy Qiu, Clemente Ocejo, Eesh Likhith, and Lauren Oh. They contributed both to the technical aspects of the platform and to the research in action we did in schools and community centers.

I named the platform as a tribute to Edith Ackermann’s research and work on “Animates,” or “Play things that do things.” Edith, together with Sherry Turkle, was among the first to explore how children engage and interact with smart toys in the 1980s and 90s. Edith’s framework for establishing what allows a toy to be considered an “AniMate” served as both a guideline and inspiration for how I designed the Cognimates platform. The goal of this project and my research is to build on their wisdom and identify the new guiding principles for designing learning tools and devices for this generation growing up with AI.

What is a Cognimate?
Initially, for me, a Cognimate was the embodied intelligent agent that children could program and teach. The agent would be both a friendly companion (playmate) and an object to “think with” and learn with (cognimate). Cynthia Breazeal and her students in the Personal Robots group had already been exploring this paradigm of “programming as teaching” with SoRo (the Social Robot Toolkit), as well as developing social robots as peer-like learning companions, for several years. So Cognimates was a natural extension of this tradition in her research group.

The idea was also to design a platform that would allow children to connect multiple computational objects and make them interact with each other. During my thesis studies, we realized that children would also refer to the digital characters (codelab sprites) as Cognimates. When talking about the projects they did and the concepts they learned, they would refer to the “make Nary happy” project rather than the “feelings detection” or “sentiment analysis” project. While many of the concepts and inner workings of AI systems and smart agents were too abstract for children initially, they could quickly understand how a machine learns if that action was embodied by a character, a story, or a game.

Examples of physical Cognimates characters: an Ogre and a frog prince developed with the kids for a creative storytelling with code project. Credit: Stefania Druga

This encouraged us to create many more characters and starter projects that could embody and manifest what the different AI services do. Some characters, like Nary, could express different emotions if the computer detected a happy or sad message. Other characters would change color to show the computer recognized a specific color; we also made a giant eye for the vision extension to show what the computer is recognizing or if it gets confused. When children are programming with a cognitive service, the digital Cognimate manifests and demonstrates how this service works (e.g., learning how to see or speak). These characters aim to create powerful analogies and conceptual bridges while allowing children to use them in relatable stories.
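The Nary-style “feelings” reaction can be sketched as a sentiment score mapped to a character expression. (The actual extension calls a cloud sentiment service; the word lists below are purely illustrative.)

```python
# Illustrative word lists; a real system would use a trained sentiment model
POSITIVE = {"happy", "love", "great", "fun", "awesome"}
NEGATIVE = {"sad", "hate", "awful", "angry", "scary"}

def character_mood(message):
    """Map a message's sentiment score to a character expression."""
    words = message.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "smile"
    if score < 0:
        return "frown"
    return "neutral"

print(character_mood("I love this awesome game"))  # smile
print(character_mood("that movie was scary and awful"))  # frown
```

Mapping an abstract number (a sentiment score) onto a visible expression is exactly the kind of conceptual bridge these characters are meant to build.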

Example Cognimates starter project where Nary is reacting to the feelings of the messages you send it. Credit: Stefania Druga
Cognimates character Nary is reacting to the feelings of the messages you send it. Credit: Designer Mircea Dragoi, Lateral 2018

The “otherness,” artificiality, believability, friendliness, and programmability of a Cognimate can prompt rich psychological reflections on agency and identity, and on issues of control and communication, beyond helping children understand how programming and AI work.

Cognimates workshop in Elizabeth Peabody Community Center in Somerville with UROP team (from left to right) Lauren Oh, Sarah T. Vu, Tammy Qiu. Credit: Stefania Druga.

By the end of my thesis studies, I’d discovered how 107 children (7–14 years old) from four countries developed a better understanding of AI concepts and changed their perception of smart agents by programming and teaching them with the Cognimates platform. After one month of coding and training AI agents with our platform, children developed a strong understanding of AI technologies and became fluent in using them. Collaboration and communication skills played a significant role in how fast children were able to understand different machine learning concepts like computer vision, sentiment analysis, and supervised learning.

Making it up
As a junior researcher, it is both exciting and somewhat intimidating to embark on a field of research that hasn’t been established yet. While Sherry Turkle, Edith Ackermann, Michael Scaife, and Mike van Duuren did several studies on smart toys for kids in previous decades, the technology at the time wasn’t as advanced, and smart devices weren’t as widely present in children’s homes and schools as they are today. In this context, I found myself having to design and adapt research methodologies from different fields like human-robot interaction, cognitive science, pedagogy, and psychology, while trying to understand through empirical observations how children are growing up with AI in this day and age.

In the design of the Cognimates platform, I tried to combine what I considered to be the most important features of previous coding apps: intuitive blocks, access on mobile devices, and connection to the physical world and hardware devices. I also added modular coding plugins for cognitive services, along with intuitive AI training capabilities designed specifically for AI education. Children were involved as design partners in all the steps of the process.

Iterations of digital Cognimates characters based on children’s feedback. Credit: Designer: Mircea Dragoi, Lateral 2018

As Allison Druin points out, “children have so few experiences in their lives where they can contribute their opinions and see that they are taken seriously by adults.” She argues that such experiences can build confidence in children both academically and socially and produce “design-centered learning.” Experiencing the power of this kind of learning, and the joy of co-designing and building for and with children, was one of my favorite parts of this project. I consider it a critical process in designing all new and unexplored technologies that support children’s development.

The grad student balancing act
I will be very honest and say I don’t think I always balanced my workload and my life in the best possible way, which resulted in me working very long hours and weekends most of the time. It’s a common problem at the Media Lab to try to do too much, because the Lab provides so many opportunities and it is very hard to learn how to say no. Over time, as I became more and more focused on the development of this platform, I learned how to channel all of my other responsibilities, like taking classes, doing demos, and collaborating with Lab member companies, such that they would always contribute to and advance the Cognimates project in some way.

Learning how to develop and train a team of interns (UROPs) that could reliably support the project by running studies in schools or doing demos was also crucial. I tried to write blog posts and talk to the press every time we published a new study. For me it was important to share my research and engage all the different communities of parents, teachers, technology designers, and policy-makers that could help advance this field of AI literacy.

Map of Cognimates users around the world: 816 unique users, 2,063 sessions, average session time 30 minutes. Credit: Stefania Druga 2018

I am genuinely always amazed and inspired by the things children come up with when given the freedom, tools, and space for expression and discovery. When I was running studies in schools and it was hard because of administrative hurdles or because of the sheer volume of work, I would always draw the energy to continue from seeing how much the children were challenging themselves and learning. When writing my thesis and getting stuck in quantitative data analysis or in researching related work, I would go back and read the transcripts from children’s discussions and interviews. That would put a smile on my face and remind me why I push myself so much to do this work and keep going.

The future of Cognimates, kids, and AI
I think we are in an arms race in education with the advancement of technology, and we need to start thinking about AI literacy before patterns of behaviors for children and their families settle in place. I incorporated Hackidemia as a nonprofit organization, and as part of it I will continue the work I did with Cognimates to develop a full open-source curriculum for AI education (kidsteach.ai) while collaborating with academic and industry partners. The goal is to distribute access to our AI literacy platform and learning resources in various school districts, museums, and libraries in the US and internationally while also working on teacher and family training and outreach.

Cognimates demo for Canadian Prime Minister Justin Trudeau during his visit to MIT. From left to right: Cynthia Breazeal, Justin Trudeau, Stefania Druga. Credit: AP Press.

I want my work to continue to be informed by research and to present opportunities for students to get involved, so I’ve started to collaborate with various research groups at MIT, the Harvard Graduate School of Education, NYU ITP, and the University of Pennsylvania. The most important goal for the Kids Teach AI initiative is to continue to foster opportunities for constructive conversations around AI, bringing together various communities while enabling children to teach us and inspire us on how best to make use of AI in the 21st century.

I think one of the biggest systemic challenges I faced was convincing adults to listen to and learn from children. I strived to provide ways for parents and teachers to guide and be part of the conversation with children without dominating it. I think the first step in this long-term process is to actually show how much children can do with AI and why it is important for them to understand it. The next step is to make it easier for parents and teachers to partake in a collaborative learning process around this technology. While young people, as digital natives, are very fast at picking up new technical skills, they don’t always have the maturity to make the right decisions, and that’s where it is important for families and educators to step in.


Further resources

  • Growing up with AI. Cognimates: from coding to teaching machines, Stefania Druga. MIT Master Thesis 2018 (pdf)
  • Hey Google is it OK if I eat you?: Initial Explorations in Child-Agent Interaction. Stefania Druga, Randi Williams, Cynthia Breazeal, and Mitchel Resnick. IDC 2017 (pdf)
  • How Smart Are the Smart Toys? Children’s and Parents’ Attributions of Intelligence to Computational Objects, Stefania Druga, Randi Williams, Hae Won Park, Cynthia Breazeal. IDC 2018 (pdf)
  • My Doll Says It’s OK: Voice-Enabled Toy Influences Children’s Moral Decisions, Randi Williams, Christian Vazquez, Stefania Druga, Pattie Maes, Cynthia Breazeal. IDC 2018 (pdf)

This post was originally published on the Media Lab website.