What Ever Became of the Information Age?
You may remember having heard the term “information age” — but it is entirely possible you have only a vague notion of what it meant. This may be because it has been a long time since you heard it last, but also because the term is slippery, having many usages.
Like the terms “atomic age,” “jet age,” and “space age,” “information age” can mean an era in which a revolutionary technology has arrived on the scene — and while “information technology” is not really new (writing, and even spoken language, can be described as information technologies), there is no question that the electronic computer and its associated communications systems, in their various forms, represented something different from what came before. And indeed the information age came to pass in this sense.
Like the term “Industrial Age,” “Information Age” can also denote a shift in the fundamental conditions of work and consumption. The industrial age saw the decline of the rural, agrarian, peasant way of life as the norm, as a revolutionary form of mass manufacturing, based on machines powered by inanimate energy, became the predominant condition of our existence (employing a quarter of the American labor force at mid-century, while overwhelmingly accounting for the rise in material output and living standards). Likewise the information age held out the prospect of a great increase in the work effort devoted, in one way or another, to producing, processing and communicating information — as the volume of information being produced, processed and communicated exploded. And this, too, came to pass.
However, the term had other meanings. Of these the most exciting — because it was the one that could really make it matter in a way that would merit speaking of A New Age — was the idea that information would become radically substitutable for everything else. Information has always been substitutable, to a degree, for other economic inputs like land, capital and labor (this was how the Industrial Age happened, after all: technical know-how made it possible to exploit those energy sources and build all the other machines, eventually enabling massive labor substitution), but now, it was promised, the substitution would go so far that we would in this respect altogether transcend the smokestack, raw material-processing, secondary sector-centered Industrial Age. Thus, if the supply of some good ran short, information-age INNOVATION! would promptly turn scarcity into abundance, with what was promised for nanotechnology exemplary (the radical new materials like carbon nanotubes that would be stronger and lighter and better than so much else, the molecular-scale assemblers that, working atom by atom, would waste not and leave us wanting not). Increasingly suspending the bad old laws of “the dismal science,” this would explode growth, even as it liberated growth from reliance on natural resources and the “limits to growth” they imposed, solving the problem of both material scarcity and our impact on the natural environment — socially uplifting and ecological at once. Indeed, thinkers came to speak of literally everything in terms of “information,” of our living in a world not of matter and energy but of information that we could manipulate as we do lines of computer code if only we knew how — as they were confident we soon would, down to our own minds and bodies (most notoriously in the mind-uploading visions of Ray Kurzweil and other Singularitarians).
In the process the word “information” itself came to seem fetishistic, magical — not only in the ruminations of pundits mouthing the fashionable notions of the time, but at the level of popular culture. Thus in an episode of Seinfeld Jerry’s neighbor, the postal worker Newman, wanting to remind Jerry that he was a man not to be trifled with, told him in a rather menacing tone that “when you control the mail, you control information.”
The line (which has become an Internet meme) seemed exceedingly contemporary to me at the time — and has since come to seem as distinctly ’90s as any line can get, precisely because, as I should hope is obvious to you, the information age in this grander sense never came to pass. Far from our seeing magical feats of productivity-raising, abundance-creating INNOVATION!, productivity growth collapsed — proving a fraction of what it had been in the heyday of the “Old Economy” at which those lionizers of the information age sneered. Meanwhile we were painfully reminded time and again that at our actually existing technological level economic growth remains a slave to the availability and throughput of natural resources, with the cheap commodities of the ’90s giving way to exploding commodity prices in the ’00s that precipitated a riot-causing food-and-fuel crisis all over the world. If it is indeed the case that the world is all “just information,” then to go by where we are in 2022 (in which year we face another painful reminder of our reliance on natural resources, as the war in Ukraine precipitates yet another food-and-fuel crisis) the day when we can manipulate matter like microcode remains far off.
Unsurprisingly the buzzwords of more recent years have been more modest. The term one is more likely to hear now is the “Fourth Industrial Revolution” — the expectation that the advances in automation widely projected will be as transformative as the actually existing information age may plausibly be said to have been — but not some transcendent leap beyond material reality.
I do not know for a fact that a Fourth Industrial Revolution is really at hand — but I do know that, being a rather less radical vision than those nano-assembler-based notions of the ’90s, the thought that it may be so bespeaks how even our techno-hype has fallen into line with an era of lowered expectations.
Originally published at https://naderelhefnawy.blogspot.com.