Computational thinking and the next wave of data journalism
In this second extract from a forthcoming book chapter I look at the role that computational thinking is likely to play in the next wave of data journalism — and the need to problematise that. You can read the first part of this series here.
Computational thinking is the process of logical problem solving that allows us to break down challenges into manageable chunks. It is ‘computational’ not only because it is logical in the same way that a computer is, but also because this allows us to turn to computing power to solve them.
As Jeannette M. Wing puts it:
“To reading, writing, and arithmetic, we should add computational thinking to every child’s analytical ability. Just as the printing press facilitated the spread of the three Rs, what is appropriately incestuous about this vision is that computing and computers facilitate the spread of computational thinking.”
This process is at the heart of a data journalist’s work: it is what allows the data journalist to solve the problems that make up so much of modern journalism, and to be able to do so with the speed and accuracy that news processes demand.
It is, in Wing’s words, “conceptualizing, not programming” and “a way that humans, not computers, think.”
“Computers are dull and boring; humans are clever and imaginative. We humans make computers exciting. Equipped with computing devices, we use our cleverness to tackle problems we would not dare take on before the age of computing and build systems with functionality limited only by our imaginations” (Wing 2006)
And it is this — not coding, or spreadsheets, or visualisation — that I believe distinguishes the next wave of journalists: the skills of decomposition (breaking a problem down into parts), pattern recognition, abstraction and algorithm building that schoolchildren are being taught right now. Imagine what mass computational literacy will do to the news industry.
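To make those four skills concrete, here is a toy sketch of how they might apply to a routine newsroom question ("which area is the outlier in spending?"). The place names and figures are invented for illustration; nothing here comes from a real dataset.

```python
# A toy illustration of computational thinking in a newsroom task (invented data).
# Decomposition: split "which area is the outlier?" into load -> normalise -> compare.
# Abstraction: per_head() captures the general idea of a rate, not one dataset.
# Pattern recognition / algorithm: rank areas by that rate and take the extreme.

spending = {"Northton": 1_200_000, "Eastvale": 950_000, "Westford": 400_000}
population = {"Northton": 60_000, "Eastvale": 50_000, "Westford": 10_000}

def per_head(totals, people):
    """Abstraction: turn raw totals into comparable per-person rates."""
    return {area: totals[area] / people[area] for area in totals}

rates = per_head(spending, population)
outlier = max(rates, key=rates.get)   # algorithm: pick the extreme value
print(outlier, round(rates[outlier], 2))
```

The raw totals alone would point at Northton; normalising per head (the abstraction step) surfaces Westford instead, which is exactly the kind of shift in framing that computational thinking is meant to produce.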
Nicholas Diakopoulos’s work on the investigation of algorithms is just one example of computational thinking in practice. In his Tow Center report on algorithmic accountability he outlines an approach to reverse-engineer the ‘black boxes’ that shape how we experience an increasingly digitised world:
“Algorithms must always have an input and output; the black box actually has two little openings. We can take advantage of those inputs and outputs to reverse engineer what’s going on inside. If you vary the inputs in enough ways and pay close attention to the outputs, you can start piecing together a theory, or at least a story, of how the algorithm works, including how it transforms each input into an output, and what kinds of inputs it’s using. We don’t necessarily need to understand the code of the algorithm to start surmising something about how the algorithm works in practice.”
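The input/output approach Diakopoulos describes can be sketched in a few lines of code. In this hypothetical example, `black_box` stands in for an opaque scoring system that a reporter can query but not inspect; the function, field names and values are all invented for illustration, not drawn from the report.

```python
# Sketch of input/output auditing: vary one input at a time, watch the output.
# `black_box` is a stand-in for an opaque system we can query but not read.

def black_box(age, postcode_band):
    """Hypothetical opaque scoring function we pretend we cannot inspect."""
    score = 50
    if age > 60:
        score -= 10          # hidden penalty for older applicants
    if postcode_band == "C":
        score -= 20          # hidden penalty for one postcode band
    return score

def probe(baseline, variations):
    """Change one field at a time and record how the output shifts."""
    base_out = black_box(**baseline)
    findings = []
    for field, value in variations:
        trial = dict(baseline, **{field: value})
        delta = black_box(**trial) - base_out
        if delta != 0:
            findings.append((field, value, delta))
    return findings

findings = probe(
    {"age": 35, "postcode_band": "A"},
    [("age", 65), ("postcode_band", "B"), ("postcode_band", "C")],
)
for field, value, delta in findings:
    print(f"Changing {field} to {value} shifts the score by {delta}")
```

Even without seeing the code inside `black_box`, the probe surfaces which inputs move the output and by how much, which is the "theory, or at least a story, of how the algorithm works" that Diakopoulos describes.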
But the next wave of data journalism cannot just solve the new technical problems that the industry faces: it must also “problematise computationality”, to use the words of David M. Berry:
“So that we are able to think critically about how knowledge in the 21st century is transformed into information through computational techniques, particularly within software.”
His argument relates to the role of the modern university in a digital society, but the same arguments can be made about journalism’s role too:
“The digital assemblages that are now being built … provide destabilising amounts of knowledge and information that lack the regulating force of philosophy — which, Kant argued, ensures that institutions remain rational.
“… There no longer seems to be the professor who tells you what you should be looking up and the ‘three arguments in favour of it’ and the ‘three arguments against it’.”
This is not to argue for the reintroduction of gatekeepers, but to highlight instead that information is not neutral, and it is the role of the journalist — just as it is the role of the educator — to put that information into context.
Crime mapping is one particularly good example of this. What could be more straightforward than placing crimes on a map? As Theo Kindynis writes of crime mapping, however:
“It is increasingly the case that it simply does not make sense to think about certain types of crime in terms of our conventional notions of space. Cybercrime, white-collar financial crime, transnational terrorism, fraud and identity theft all have very real local (and global) consequences, yet ‘take place’ within, through or across the ‘space of flows’ (Castells 1996). Such a-spatial or inter-spatial crime is invariably omitted from conventional crime maps.” (Kindynis 2014)
All this serves to provide some shape to the landscape that we are approaching. To navigate it we perhaps need some more specific principles of our own to help.
In the third and final part of this series, then, I want to attempt to build on Kovach and Rosenstiel’s work with principles which might form a basis for data journalism as it enters its second and third decades.