I opened my talk by saying that we’re ignoring bigger problems. Credit: Petri Ihantola.

21st Century Grand Challenges for Computing Education

Amy J. Ko
Bits and Behavior

--

I’m at a wonderful time in my career where conference organizers ask me to come talk for 45 minutes about whatever I want—otherwise known as giving a keynote. As a professor, I’m always eager to share my work and my opinions, and so whenever I’m asked to give an invited talk, I say yes if I can make the travel work.

So when I was asked to give the keynote at Koli Calling 2019, an annual conference in Koli National Park in Finland, I couldn’t resist. I’d never been to the conference, but I’d always respected the work it published. I’m always excited to see a new part of our planet’s natural beauty, and Koli is supposed to be spectacular. And as someone passionate about computing education research, it was an excellent opportunity to reflect on the field’s future and where I think it should go.

I started by talking to my wonderful Ph.D. students about the opportunity. They had fascinating perspectives on what a good keynote can do, and what they would talk about if asked. One of the emerging themes of our conversation was just how many important conversations aren’t happening in the field, about diversity, about justice, about democracy, and about climate change. I decided that the most impactful thing I might do was reflect on these missing conversations and encourage our community to start them.

I generated a wish list of everything on my mind that I might talk about. And the list was long: automation, disinformation, ethics, responsibility, racism, sexism, marginalization, change, agency, self-regulation, capitalism, poverty, and much, much more. The list, as fascinating as it was, risked being more of a hot take than a carefully reasoned argument. I spent a few more months organizing, synthesizing, and refining my position, trying to link these issues together and connect them to the concerns of computing education researchers.

Here’s a rehearsal of what I ended up with, recorded just before I traveled to Finland to give the keynote live:

The title slide of my 2019 Koli Calling keynote.

And here’s a recording I made while giving the talk on my phone (not great quality, and I was having trouble projecting to the back of the room, but authentic!):

A recording of the talk on my iPhone. Apologies for any audio and video quality issues!

You can also see the slides (80 MB; I like pretty pictures).

For those who don’t have 40 minutes to listen to my rehearsal, or an hour to hear the recorded talk and Q&A, here’s the gist of my argument:

  • Computing education research is a great community that has tackled some big problems around teaching programming in primary, secondary, and post-secondary settings to diverse audiences.
  • But the field is ignoring some of the bigger problems in the world that it’s uniquely positioned to address.
  • In my opinion, these include the global economic disruption being caused by automation, the impending chaos from climate change, the scourge of disinformation amplified by social media, and the cost of these problems falling on our most marginalized populations.
  • Computing is behind most of these problems, in that the people driving automation, promising technological solutions to climate change, amplifying disinformation in social media, and ignoring diversity, are mostly software developers, who value efficiency, convenience, neutrality, and competition over humanity, sustainability, truth, and justice.
  • Computing education researchers are in a position to change these values, by discovering how to teach everyone about the role of computing in these problems.
  • To do this, I propose that we study, and eventually teach, four new areas of knowledge: the limits of computing, social responsibility, data literacy, and diversity literacy.
  • Why the limits of computing? Computing is powerful, but everyone should know that it cannot solve every problem, that it is often wrong, that it is not neutral, and that it is not free.
  • Why social responsibility? Everyone should understand that what they choose to do with computing has consequences; working for Twitter instead of the New York Times erodes sustainable, ethical journalism, and amplifies disinformation.
  • Why data literacy? Everyone should know that data is imperfect, that data is about the past, not the future, that data is biased, and that data is what makes computing useful, valuable, powerful, and harmful.
  • Why diversity literacy? Computing is a medium that is inherently optimized for the routine, normal, average cases, not the edge cases, the exceptions, and the anomalies that necessarily come with diversity. Everyone needs to know this limitation, so that they can envision uses of computing that do address diversity, or choose to avoid using computing altogether when they can’t.
  • Teaching programming teaches none of these things; in fact, learning to code often results in people believing that computing can do anything, that computing is only technical and not social, that it’s algorithms that are powerful, not data, and that diversity is a matter of error handling rather than justice.
  • To change any of this, computing education researchers must study how to teach all of these things in a way that transfers to everyday decisions in life and at work, and must prepare teachers to teach these ideas effectively. We’re the only community in a position to do this work.

This isn’t just an argument. I actually believe it. And my students and I are starting to do real scholarship in this space, investigating how to teach inclusive design to developers, how to shape their career choices, and how to teach data literacy in the context of machine learning. It’s a marathon, not a sprint, but I don’t think we can start soon enough.

The response to the keynote was fascinating. Some attendees asked whether to add required courses or to integrate these ideas into existing courses; I encouraged everyone to do even more, reconceptualizing how we define things like data and algorithms in more social terms. Others asked whether algorithms are simply more trustworthy than humans, and whether it’s just a matter of making them better; I pointed out that notions of correctness, efficiency, and errors are reductive, narrow conceptions of quality, and that we should focus more on questions of justice. Some asked how to make change in computing education and what role culture plays; I admitted that change would almost certainly have to be local, culturally aware, and individual, especially in academia. After the talk, dozens of attendees came up and thanked me, many of them admitting that it made them seriously reconsider what they are doing in their research and teaching.

What do you think of my argument? Too political? Not political enough? What am I missing? How should we move forward? I’m eager to hear your opinions, and debate them!

--
Amy J. Ko
Bits and Behavior

Professor, University of Washington iSchool (she/her). Code, learning, design, justice. Trans, queer, parent, and lover of learning.