Don’t teach computational thinking, teach to think about computers

David Young
6 min read · Jun 13, 2016


This spring I kept seeing the expression “computational thinking” on the web. Stephen Wolfram had used it on his blog. The Champaign-Urbana Community Fab Lab website mentioned it. I started to think about what computational thinking is: is it a meaningful category? Does it encompass good problem-solving skills? Should it be taught in schools?

I started hunting for origins and definitions. Here and there I found the term ascribed to Seymour Papert, the inventor of Logo. That piqued my interest. Then I found this definition, provided by The Center for Computational Thinking at Carnegie Mellon University, which made me raise an eyebrow:

Computational Thinking is the thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information-processing agent.

Information-processing agents, huh? That was not very helpful. What was it really about? I looked for answers in the bullets under the definition:

Computational thinking is a way of solving problems, designing systems, and understanding human behavior that draws on concepts fundamental to computer science. To flourish in today’s world, computational thinking has to be a fundamental part of the way people think and understand the world.

A practice that draws on fundamental computer science concepts to understand human behavior? I grew suspicious: computer science is notorious for neglecting human factors. The Center went on:

Computational thinking means creating and making use of different levels of abstraction, to understand and solve problems more effectively.

Computational thinking means thinking algorithmically and with the ability to apply mathematical concepts such as induction to develop more efficient, fair, and secure solutions.

Computational thinking means understanding the consequences of scale, not only for reasons of efficiency but also for economic and social reasons.

That sounded like mathematics and science to me. The Center had not done a great job at drawing a distinction between “computational thinking” and other ways of thinking. I risked taking “computational thinking” more seriously than it deserved, and read further:

Users of the Internet, when empowered with computational thinking, can demystify privacy technologies and surf the web safely.

That did not sound empowering at all. Web safety is computer science’s burden to bear; in the Center’s example, computational thinking seemed to shift that burden onto users of the Internet. Read together with the web-safety claim, this earlier quote:

To flourish in today’s world, computational thinking has to be a fundamental part of the way people think and understand the world.

made computational thinking advocacy sound vaguely threatening, as if the advocates were saying: industry is using digital technology to rewrite the operating system of the world, and if you want to remain compatible with the world, you had better start thinking in new ways.

I was ready to reach a tentative conclusion: computational thinking is backwards. Its advocates ascribe to computer scientists all of the good habits of scientists and mathematicians, ignore the computer scientist’s bad habits, and tell us that we should all think like computer scientists. Never mind that applied computer science frequently yields the least transparent, scrutable, concrete, simple, safe, quick, and durable solutions to many problems, from television to thermostats, telephony to automobiles. Forget that computers and software have adapted slowly to human nature, and in some ways their new designs are retrograde. It is the nature of computational thinking that you ignore, forget, and never mind.

Speaking of forgetting, consider the expression “computer literacy.” When that expression was new, it was the “computational thinking” of its day. Both expressions are concerned with the changes brought by ubiquitous computers, and with the adaptations those changes require. But if you examine the idea of computer literacy seriously, you see that it is no more a new form of literacy than computational thinking is a new form of thinking. Ordinary literacy, the (usually) permanent condition of knowing how to read and write, is access to knowledge and culture, forever: access to computer manuals and other foreign-language instruction, street signs, stone tablets, and books about computer literacy. Computer literacy is not nearly as permanent or as expansive as ordinary literacy, and if we are honest about what constitutes computer literacy in practice, we must admit that it is a mixture of barebones technical proficiency, dumb beliefs, and ugly attitudes: scorn for people, including oneself, who do not respect the arbitrary laws of the computer (you didn’t “eject” the disk before removing it, did you? Neither did I); ritualistic and superstitious responses to technology (turning it off and on again; clicking experimentally all over the screen); and chronically low expectations about computer usability and usefulness. In other words, computer literacy prepared people to cope with computers without criticism. Computational thinking is the next phase.

I want to offer, as an alternative to computational thinking and computer literacy, some ideas for improving the computing curriculum: things that students of all ages could be taught about computing, but that schools (including computer science programs at colleges and universities) don’t teach very often or well. With some experience learning and teaching these subjects, students and teachers may agree that “computational thinking” is not such a singular or important topic, after all.

What are computers for? When computers and people cooperate, what should the division of labor be? Students should be taught that these are valid questions to ask at home, in the workplace, and at the ballot box. Students should be exposed to a variety of options and opinions and shown how choices are always made in one way or another by people who may or may not have foreseen or engaged positively with moral questions and consequences.

The history of computers. Computing suffers from a myth of linear progress. Computing history is seldom searched for ideas and inspiration. The historical contingencies, competing visions, and decision-making processes that shaped computing are not widely known. To understand computing’s promise and pitfalls through the years, and to avoid duplication of effort and missteps in the future, students need to be taught the field’s history.

Alternative embodiments of computing. Today, computing is frequently embodied in multi-purpose silicon microprocessors, but it has not always been so, and it is naive to believe that it always will be so. The Antikythera mechanism, tide-predicting machines, and the Norden bombsight; long division and nomograms; and the classic Honeywell thermostat were computation embodied, respectively, in clockwork; pen (or straightedge) and paper and person; and a bimetal strip. Future computing may be quantum, optical, or biological. Exposing students to many embodiments of computing gives each student more opportunities to find a personally appealing math or science topic than a course in computational thinking does, and it provides more fodder for the imagination.
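The tide-predicting machines mentioned above make a nice concrete case: Thomson-style predictors computed a sum of cosine “constituents” in brass, with gears turning at each constituent’s speed and a wire over pulleys adding up their motions. A minimal sketch of the same computation in Python (the amplitudes, speeds, and phases below are illustrative placeholders, not real harmonic constants for any port):

```python
import math

# Each tidal constituent is a cosine term:
#   height(t) = sum of A * cos(speed * t + phase)
# Illustrative values only, loosely patterned on common constituents.
CONSTITUENTS = [
    # (amplitude in meters, speed in degrees/hour, phase in degrees)
    (1.00, 28.984, 0.0),    # M2-like: principal lunar semidiurnal
    (0.46, 30.000, 25.0),   # S2-like: principal solar semidiurnal
    (0.20, 15.041, 155.0),  # K1-like: lunisolar diurnal
]

def tide_height(t_hours):
    """Predicted height above mean sea level at time t (in hours)."""
    return sum(a * math.cos(math.radians(speed * t_hours + phase))
               for a, speed, phase in CONSTITUENTS)

# The machine drew this curve on a paper drum; here a loop samples it.
curve = [tide_height(t) for t in range(25)]
```

The point is not the code but the contrast: the identical computation was once embodied in clockwork, and a student can hold both versions in mind at once.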

Human ability and difference. Computers and software are worse than they should be because human psychology does not figure much in their production or evaluation. Ordinarily a programmer can complete their formal education without learning a whit about psychology. Consequently, for many engineers, the model end-user resembles a computer: infinitely patient, having an unrealistically large and reliable memory, producing regular “outputs,” and perceiving small input changes with utmost speed and perfection. Mainstream programming systems are made by and for people who enjoy (or tolerate) thinking in terms of symbols tenuously connected to numbers, texts, and behaviors. Programmers are often expected to simulate in their heads the computer’s fast, perfect interpretation and manipulation of symbols, hence the incredible concentration that writing and debugging computer programs requires. Teachers need to challenge students to think critically about the differences in personality and intelligence among computer users, to think about ways of knowing and understanding besides the abstract-symbolic, and to scientifically evaluate claims about the minds and behavior of users.

Even if 21st-century life is more (or less) bewildering, taxing, and physically or economically precarious than in earlier times, and even if computing has something to do with the change, computational thinking does not rate as a “fundamental” school subject. It’s nothing new: little distinguishes computational thinking from mathematics and science. And it is based on a field, computer science, whose outlook is too narrow for us to consider it fundamental. Psychology, philosophy, and history of computing, computing in different materials and mechanisms, not to mention design: these subjects, which are key to understanding when and how to make computers work for people, are the poor relations of the traditional computer science curriculum. They need to be taught with urgency and vigor, to show students how history shaped the computing present, to show them that countless computing futures are within reach, and to prepare them to guide us to a productive and fair future of computing.


David Young

Engineer in Urbana, Illinois, interested in resilient, usable software systems.