‘Dumbening’ and ‘inagency’

N. R. Staff
Published in Novorerum
Jun 25, 2024

Photo by Nathan Anderson on Unsplash

Ezra Klein’s interview with Anthropic head Dario Amodei, which I wrote about last time, made me think about “dumbening” and “inagency” — two terms that appear in my book.

In my last article, I quoted Amodei, who was explaining to Ezra Klein his theory of how people come to "adopt" AI and begin liking it, wanting to use it. I ended that piece with a story about my 80-year-old friend who loves using Siri, seeing Siri as a friend and helper. Watching my friend, I feel I am seeing evidence of how AI's adoption is creating dumbening and inagency.

"Adopting AI" means we no longer need to exert effort to learn the answer to — really anything — from a Big Life Question to what to eat for dinner, where to get it, and how to prepare it, if any preparation is involved. We can satisfy our curiosity over stupid little diversions: What actor was in that cute commercial? What was the name of the baseball player who hit 3 home runs in a row in 1976? What did our neighbor's new pool likely cost? How much rain is typical in Tripoli this time of year? We no longer have questions that puzzle us: we have only answers.

Whether the answers are correct or not is hardly the point. They’re so easy to get! And if you don’t like one answer, another, opposite answer is also at hand for you.

The above is a bit jokey, but it's trying to make a serious point: when we need not exert any effort to find out how to do anything, or how to figure anything out, we soon lose the ability to do any of it ourselves.

This is the essence of “dumbening.” It’s the “use it or lose it” principle in action.

The example of my friend’s love of Siri is a good way into an understanding of both “inagency” and “dumbening.”

The inability to do anything to stop AI — or climate collapse — is what I mean by "inagency." My friend, of course, has no interest in stopping Siri — they like Siri. I believe they have no idea that seemingly small advances like Siri — I doubt they're even familiar with the concept of AI (or even the term itself) — are part of a bigger move to AI. But people who do know, who are creating AI, like Amodei, seem just as powerless to stop it. It's sort of like an addiction: the alcoholic knows they can stop anytime they want to. But… can they? It seems unlikely, without an intervention.

In my book I defined inagency with a quote from James Bridle, who in his 2019 book The New Dark Age wrote, “The individual is aware of the [problem] but loses all power to act in their own interest.”

He is talking about inagency. Inagency is a hallmark of our time. We like to think we are in control of almost everything, including the outer space we will go off to live in once we have destroyed this planet, but in fact we have arguably less control over things than at any time in our long history as a species. Or, to be precise, a few have control. Mostly it's the very rich and the techies, and not infrequently these are the same people.

I want a wooden bench for my garden. Expanding my garden with native pollinator plants and shrubs is about the only thing I can figure out to do anymore; something I can do that will help; something that seems under my control. So I am doing it. And it does take away the pain of helplessness for a while.

So here I am, thinking about a small bench to sit amid the plants. Actually I want it for the chipmunks. They have a big open hole, and I want to give them a little shelter. These kinds of things seem good to me; no downside.

So what do I do in thinking about a bench? I turn to amazon.com. I have "points" and can get some money off. I look for "wood" and would prefer cedar, but when I find one vastly cheaper in some other sort of wood — not engineered, it seems, but natural — I decide that's the one I'll get.

It's not that I don't know I'm doing something — tiny, but something — that feeds into the very consumerism/capitalism matrix that is causing all the problems. But I rationalize, because everybody rationalizes, and not to rationalize doesn't make sense. What is one bench, after all? I could perhaps fashion some sort of little wooden structure over the hole myself, but it would look awful. And why do I worry about the chipmunks? Why not leave it alone?

Because I want it, and it is so easy to get!

And one bench won't make a difference. One more stop on amazon's route won't make a difference to anything. I know trucks and planes add to global warming, but my part is so little… my not getting the bench will make no difference in reducing global warming, I tell myself, and nothing I can do will change capitalism or the politicians who allow it to continue unabated. This is the essence of rationalization.

But it's also something else: it's inagency. I believe — I would say "I know" — that nothing I can do — not joining groups, or going to rallies, or writing letters to elected officials, or standing on a corner ranting, or posting on social media, or writing a Medium column — will cause anything at all to change. That knowledge of the truth is what causes inagency. And the forces driving inagency, like the ones driving climate collapse, can no longer be stopped. It's a self-reinforcing loop.

That’s inagency in action.


N. R. Staff
Novorerum

Retired. Writing since 1958. After a career writing and editing for others, I'm now doing my own thing. Worried about the destruction of the natural world.