What’s with “Intellectual Fashions” Anyway?

Bringing them the plague
Rebooting the trivium
Jun 18, 2021


As scholars, we are imbued with the ethos of seeking out the truth. Either for itself, or for the benefits which “facing up to reality in the cold hard light of day” generally confers.

This search, on most views, is considered to involve a fearlessness and independence from the passing fashions that come and go in the world outside of the academy’s hallowed walls.

So, it can seem strange that intellectual fashions are “a thing”. Yet a thing they are.

Twenty years ago, people in the humanities were all reading Jacques Derrida. “Deconstructing” texts was touted as the finally-uncovered road to enlightenment (let me ironize). But today, as far as I can see, Derrida mostly languishes on bookshelves, granted at most one week in courses on literary or cultural theory, or on the history of twentieth-century European ideas.

Again, for a while in the 1990s, “postmodernism” was “vibing”, as the kids say. We were all aflutter about infinite signifiers floating, well, infinitely, an irony that had become blank, metanarratives that had become defunct, and how everything doesn’t quite work as anyone intended. But that was cool.

Today, however, postmodernism enjoys an afterlife only in the enraged denunciations of Right-eous culture warriors … It is all very strange.

Fittin’ in?

Any explanation of this phenomenon must surely begin with the observation that intellectuals are, like everyone else, social animals. It is natural to like to fit in, and not rock the boat.

Celebrations of bold individual innovators aside (and endless corporate promises of innovations all the time), it is also not easy to be a Plato, a Francis Bacon, an Isaac Newton, or an Albert Einstein. It is far easier, both socially and intellectually, to contribute one’s tiny brick (or mortar) to the walls of commentaries that shore up the bases of such giants’ statues.

The philosopher Friedrich Nietzsche was surely right to rail against disciples. Yet, today commentaries on his work form an enthusiastic subindustry, attracting hundreds of publications by “free spirits” a year.

Perhaps I am being old-fashioned, but it seems to me to behoove seekers of truth to try to become aware of their own biases, as well as all of the social, psychological, political and institutional factors that might be shaping their research.

After all, if we scholars are social animals, many of us also study how not all social arrangements are conducive to group members thinking independently, or to reshaping their views when faced with uncomfortable evidence or the possible disapproval of authorities or peers.

It isn’t hard to recognize that seeking out what is true is not the same thing as accepting what “we” desire, or want others to think about us, and our actions. Very often, these things can be in conflict.

So, if there are intellectual fashions, like there are fashions in haircuts, clothing, and music styles, isn’t this worth thinking about?

Not fittin’ in?

“Yes, but to even ask about this is to betray one’s intellectual calling, or to side ‘objectively’ with our myriad media attackers in the age of the culture wars”, someone will say. I’m not sure this is a helpful response.

For it seems to me that to be an intellectual ought not to just be part of what sociologist Max Weber called a “status group” — although it surely is that, as myriad adjuncts around the world, aching to get a start, can certify. And if we intellectuals do not abide by the Delphic-Socratic know thyself, who can?

From Plato and Aristotle onwards, the search for philosophical truth was aligned to a charting of the sources of our errors, and the “refutation of sophisms”. Our best-laid knowledge-seeking plans, these philosophers saw, are always set to go astray.

To prevent this, we need to understand how and why we so often (perhaps always) incline to believe things we find appealing, pleasing, flattering, convenient, easy …, rather than things that might be true in any larger or independent sense.

We didn’t need the aforementioned Nietzsche to know that the truth can also be uncomfortable, challenging, even upsetting, at least in the beginning. The Roman Epicurean Lucretius tells us his poetry is honey to hide the initial harshness of the doctrines he has to convey.

But it is the intellectual’s task to seek the truth nevertheless. Otherwise, the differences between an intellectual and an artist, creating different visions of the world, would be an illusion.

An early modern thing?

It is intriguing that one of the most fruitful periods for a sophisticated literature (treatises, theses, even satires) dedicated to analyzing “scholars behaving badly” was the early modern period (16th through 18th centuries). This is the period in which the modern sciences were born, and European understandings of the natural world were almost completely overhauled.

Reading these texts, and commentaries on them, like Sari Kivistö’s The Vices of Learning: Morality and Knowledge at Early Modern Universities, is a bit uncanny, if not frankly unsettling. For it suggests that, for all of the developments that have reshaped the world and the universities in the last centuries, a great deal has stayed the same.

One would like to think that we scholars would have done with disputatiousness and the pettifogging mean-spiritedness of specialists, each of us jealously staking out and defending our tiny patch of the intellectual garden.

One would like to think that intellectuals would have worked out at some point recently that obscure language and jargon are usually less a sign of profundity than a prestige play to exclude outsiders.

One would like to think that intellectuals have overcome the misanthropic tendency to take our intellects to be testimony to our moral or larger superiority over others.

One would hope that intellectuals, wedded to truth, had “put aside all others”, including the envy, insecurity, and willingness to engage in slander, misrepresentation, and straw-manning to best our rivals that the early modern texts suggest were common in that period.

But would listening to too many conversations between intellectuals bear this out?

Capital climbs?

The early modern texts on “vices of the mind” are very clear about an unavoidable tension that is set up wherever the pursuit of knowledge is institutionalized. What is this tension?

Knowledge in itself is not a scarce thing. In economic terms, it’s more like a public good. Your having it doesn’t diminish the amount available for me. You might teach me, for example, and take joy in sharing. As might I.

Yet, the expressions of knowledge can be objectified, and as such, subject to different forms of institutional commodification and control. Knowledge is written down in books which are not available, or legible, to everybody. It is taught in university courses which are not open to all comers.

A new discovery can be patented. Its discoverers may receive financial capital, through grants or royalties. Then there is what sociologists call symbolic or cultural capital: that is, forms of prestige.

Last but not least, there is the prestige conferred by holding a position and bearing a title, whereby one is able to teach and be taken seriously by other scholars. This also is clearly subject to competition and, today as much as ever, widespread scarcity.

Not what you know?

Once knowledge becomes a means to the achievement of scarce vocational, financial and symbolic capital, it becomes subject to forms of competition.

Incentives can then emerge which push people to ideas that may not be of the most lasting value, in themselves or for society — but which promise a “raise” in the prestige scale, and perhaps also on the wage scale.

Incentives may emerge, in short, which push people towards winning in the prestige game, being seen to be knowledgeable or prestigious, rather than necessarily being learned or wise.

Followin’ the fashions?

And at this point, following an intellectual fashion, rather than thinking for oneself, can seem like a smart play.

As a longtime commentator myself (je m’accuse), I can certify that from an intellectual perspective it is easier to commentate on others’ work, than to strike out on your own. Would it were not so!

There is also a place for exacting commentaries, especially written by younger scholars, still learning their craft.

But now think about all of this from a symbolic capital perspective.

Criticizing a presently-prestigious thinker or doctrine in the professional world is a high-risk move. One might get catapulted rapidly up the prestige scale. You might present yourself, and be accepted, as an enfant terrible. But this is rare, and that path is littered with the remains of many a broken pride …

For a start, you need to be able to guarantee a hearing for your criticisms. But if you are lower on the present prestige scale, and the thinker you are criticizing a good deal higher, nothing guarantees this. (They have already earned their way, after all!)

Moreover, if the peer reviewers of your critical bombshell are commentators on the prestigious thinker or doctrine, they may be unlikely to let the article through to publication — unless, as we all ought, they separate the intellectual from the prestige dimension of the exercise.

Or consider that presently-prestigious thinkers and their ideas (what people refer to as “intellectual fashions”) will have become at some point in the last two decades established fixtures in many syllabi, taught to new generations of undergrads. Established academics, perhaps even your teachers, will have made careers from being commentators on their work.

Again, then, all sorts of incentives exist for established commentators to perceive any independent attempt to criticize “their” thinker as a potential challenge to the basis of their material and symbolic capital. No one wants to have been shown to have bet on a bad horse:

Can anyone expect that he should be made to confess, that what he taught his scholars thirty years ago was all error and mistake; and that he sold them hard words and ignorance at a very dear rate. What probabilities, I say, are sufficient to prevail in such a case? … All the arguments that can be used will be as little able to prevail, as the wind did with the traveler to part with his cloak, which he held only the faster. (John Locke, An Essay Concerning Human Understanding, IV, xx, 11)

So, risks exist here too, especially for young scholars, in breaking the molds, all rhetoric about scholarly independence aside.

Finally, consider how this can play out in our present age of citation counting (the global system whereby scholars are ranked by how many citations by other scholars their work has garnered).

This system actually provides disincentives for striking out into some new area that others haven’t even glimpsed. New content = no citations, unless one has already established one’s name.

It is a much more savvy move in the prestige game to contribute one more commentary on established figures, perhaps with some politely deferential criticisms.

And yet?

Once you place these more concrete, institutional incentives in the mix, and add them to our shared social longing to “fit in”, then intellectual fashions, schools, and endless sectarianism don’t seem sociologically remarkable at all. It would be more remarkable if intellectual fashions did not exist, or found themselves on the decline.

And yet …

