Bertrand Russell said that if he were to be remembered, it wouldn’t be for his many books and articles, nor his ardent anti-war campaigning. Rather it would be for a funny little mathematical paradox he discovered over 100 years ago: what happens when you try to construct the set of all sets that don’t contain themselves.
Similarly, if Richard Dawkins is remembered in 100 years I don’t think it will be for his ardent atheistic campaigning, nor for his championing of the contemporary neo-Darwinian synthesis. I think that it will be for a funny little section of his first book, The Selfish Gene, in which he defines “memes” and gives them their distinctive name. It’s a playful section that proposes something deeply serious and interesting: that the contents of culture can, at least on the face of it, be analyzed using evolutionary theory. Here’s Dawkins giving the original proposal:
We need a name for the new replicator, a noun which conveys the idea of a unit of cultural transmission, or a unit of imitation. ‘Mimeme’ comes from a suitable Greek root, but I want a monosyllable that sounds a bit like ‘gene’. I hope my classicist friends will forgive me if I abbreviate mimeme to meme. If it is any consolation, it could alternatively be thought of as being related to ‘memory’, or to the French word même. It should be pronounced to rhyme with ‘cream’.
Examples of memes are tunes, ideas, catch-phrases, clothes fashions, ways of making pots or of building arches. Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs, so memes propagate themselves in the meme pool by leaping from brain to brain…
Since the original proposal by Dawkins, there have been immense difficulties in turning “memetics” into a science. No one has come forward with a solid proposal about how culture actually works via the principles of variation, heredity, and differential reproduction.
As an undergraduate I did an independent study with a professor to try to resolve this issue. Together we worked to evolve an idea across classrooms, presenting a text definition and then allowing people to contribute to it. We watched it change, as on Wikipedia, as people added and took away based on which sections they thought best answered a particular question. In the final analysis it didn’t work. Elements of the text had changed, absolutely, but had they evolved? What was the unit of selection in a text?
Despite these scientific difficulties, on the cultural and colloquial side of the spectrum, Dawkins’ idea has been embraced. “Meme” is now in the Merriam-Webster dictionary, used in common language, and everywhere on the internet. Surely, the 2014 viral ice-bucket challenge is the most explicitly meme-like thing imaginable: There was a method of heredity (the challenge to others), clear mutational variation (how different people did it), and differential reproduction (which challenges were turned down).
Yet while Dawkins will be remembered in 100 years for his contribution, it’s not quite true that he himself invented memes. The Selfish Gene was published in 1976. Even just pulling from my own bookshelf I find a book originally published in 1972 by one Gregory Bateson, an intellectual cult hit called Steps to an Ecology of Mind. The back cover asks, in bold yellow lettering: “Is there some sort of natural selection which allows one idea to live, and another to die?”
I don’t mean to imply Dawkins stole anything; his formulation of the idea of cultural evolution is by far the best, and also the most falsifiable. Ultimately I think Dawkins won out because the name he coined, “meme,” works so well, with its purposeful aural suggestion of “gene” and its classical playfulness. And because The Selfish Gene sold so well. Which would mean that whoever “invented” the notion of the meme is the one who created the most successful meme out of it.
When David Foster Wallace was asked whether he suffered from influences creeping into his fiction he responded by counter-asking, “Well, what’s wrong with that?” Artists as a whole are more flexible about these kinds of things, probably because in the arts the doing of the thing is the thing itself. There's no separation between form and content. Memetically, art is totally analog, perhaps like the first biological replicators were. There’s no digital unit of selection like in DNA, which means there's no memetic genotype/phenotype distinction in art, no code lying at the heart of the product that is being read out poorly or precisely. Art is all phenotype. So whether or not something counts as plagiarism in such a sphere is very much a question of whether or not the final artistic product succeeds.
In contrast, I think a lot of people assume there are strict rules governing intellectual property in the realms outside of the arts, like science, philosophy, and engineering. A patent is a pretty clear example of a digital genotype for a meme. From the outside it probably looks like academic disciplines adhere to a similarly strict genotype/phenotype distinction — that one could have a great idea and express it poorly, or a poor idea and express it convincingly, and either way still get credit. But that’s not really the case — the genotype of an idea isn’t easily extractable, or well-acknowledged, even in science.
There’s also the problem of cryptomnesia, the false attribution of originality to oneself, which happens all the time to academics. During my PhD I worked in a lab devoted to theory, developing mathematics, formalisms, and new ideas. Eventually it became almost impossible to keep track of who had originally said what. Which isn’t a problem, except that science operates by authorship credits, and the stew of free-form ideas had to be transformed into an orderly byline.
Once during my PhD I was emailed a prepublication draft of a scientific paper that was up for review. The paper proposed a new measure of causation using information theory. The problem was that my co-authors and I had published a paper back in 2013 using an identical measure. We were using it to propose a definition of emergence, to show that the higher levels of a system (biology, psychology, etc.) can be more causally influential than the lower levels (microscopic physics). But that 2013 paper wasn’t even the origin of that particular measure of causation: We were grounding it and expanding it, not proposing it.
Originally the measure came from a paper in 2003 by one of my co-authors, but hadn’t been dwelt on very much there. Unbeknownst to us at the time of the 2013 publication, a similar metric had also been rediscovered in a 2008 paper, which didn’t cite the original 2003 version. And then another 2012 paper did the same thing. After receiving the email I asked what we should do about this latest paper: who we should contact, how we should sort all this out. One of my co-authors, totally unfazed, said, “Well… what’s the impact factor of the journal it’s being published in?”
Most scientific and philosophical papers are never cited, or cited only once or twice, and then often just by their original authors in further papers. The vast majority fade back into the sea of noise that makes up any field. It’s estimated now that the number of scientific papers published is doubling every nine years. It’s a synecdoche for the culture at large, where essays, articles, news, opinions, and stray typed thoughts all join to form one big endless sound. And this low howl, as long and deep as an ocean, is something that must be reckoned with if you want to do what you feel is original intellectual work.
For instance, consider Douglas Hofstadter’s latest nonfiction work (published back in 2013), Surfaces and Essences: Analogy as the Fuel and Fire of Thinking. It proposes that analogies — in the broadest sense, the use of one idea to understand another — are the core of human cognition. A significant focus of the book is on how huge swaths of our language turn out to be analogical without our even noticing it. This is precisely the hypothesis, and line of reasoning, of George Lakoff’s 1987 tome Women, Fire, and Dangerous Things: What Categories Reveal About the Mind. Indeed, the two books cover the same ground and have very similar messages. Yet Lakoff is cited only once by Hofstadter, and for a minor empirical psychological study. (Don’t worry though, Lakoff’s book itself has about 27,000 other citations.)
In Surfaces and Essences, Hofstadter also argues that science is a succession of analogies (a tower of metaphorical mappings), but doesn’t bother citing Thomas Kuhn’s The Structure of Scientific Revolutions — which is famous for arguing this point. By comparison, Lakoff, in his section on the central role of metaphor and analogy in scientific thought, thoroughly cites Kuhn.
How much of that is just Hofstadter ignoring previous research, and how much of it is an unavoidable consequence of trying to do original work atop a mountain of others that you will inevitably miss? William James, in The Principles of Psychology from 1890, directly prefigures the work and signature ideas of the three most famous contemporary philosophers of mind: Thomas Nagel, David Chalmers, and Ned Block. Every single one of the big ideas these authors are currently known for was, if one thinks in terms of a memetic genotype, directly debated in James’s time. They have been updated, given new phenotypes for the modern age, but the root substance is the same. And even in James’s time, the problems were rephrasings from Gottfried Wilhelm Leibniz’s time.
Ideas just become part of the cultural atmosphere and then are picked up and championed. Even the more wild ones. The central thesis of physicist Max Tegmark’s best-selling 2014 book, Our Mathematical Universe, is that:
we all live in a gigantic mathematical object — one that’s more elaborate than a dodecahedron, and probably also more complex than objects with intimidating names such as Calabi-Yau manifolds, tensor bundles and Hilbert spaces, which appear in today’s most advanced physics theories. Everything in our world is purely mathematical — including you.
Yet you can find the same wild idea in a Stephen Baxter science fiction short story, published in a 1997 collection called Vacuum Diagrams:
Perhaps we too are creatures of mathematics, self-conscious observers within a greater Platonic formalism, islands of awareness in a sea of logic…
And of course that small germ of an idea can presumably be traced back further and further, perhaps to Plato himself.
These are just a small sample of the connections I’ve found within a larger set contained merely in my own library, my own tiny sliver of all that has been thought and said. What of the literature and movies in our culture that constantly mirror one another, nesting inside each other endlessly, like Russian dolls tracing themselves back? Consider the 2018 blockbuster movie Annihilation, which was adapted from the Southern Reach Trilogy books published in 2014, which are a clear retelling of Tarkovsky’s 1979 film Stalker, which in turn is based on the 1972 novel Roadside Picnic. How many more such transitions from book to movie to book before the idea is bled dry? It’s enough to make one cry out unhappily that only repetition remains, and bemoan:
Would that I had words that are unknown, utterances and sayings in a new language, that hath not yet passed away, and without that which hath been said repeatedly — not an utterance that hath grown stale, what the ancestors have already said.
Those words on the wish to be able to say something new and original are 4,000 years old. They are from Khekheperre-Sonbu, during the reign of Senusret II. If an ancient Egyptian author can despair at his late arrival to culture, what hope have we?
At the same time, this points to the perennial nature of the complaint, and to the obvious counterargument: if every generation has had this concern, the world cannot be coming to an end, as original scholarship, intellectual work, and artistic production continue to be done. But there’s something different about our era, the information age. Quantitatively, the volume of information produced at every second of human civilization has increased exponentially, yes, but that’s just a change in volume.
The real change is qualitative: Information has become immortal. It has slipped the surly bonds of time. It will never pass into nothingness. The culture used to forget, remember? Books went out of print, people died and left no trace, entire civilizations were scuffed out. Information, held in physical forms, was degradable and irrecoverable. And even if retained, it wasn’t immediately searchable. But no longer.
Most of what is written or filmed or recorded will be archived in a platonic and eternal digital realm, all of it now forever accessible. So with 7 billion commentators and thinkers and uploaders and texters, humanity is constructing an ever-expanding Borgesian library of recorded thoughts, ideas, opinions, sayings, beliefs. If the distinction between genotype and phenotype is held strictly in intellectual life, eventually the minimal genotypes of everything will be said, and a record of their being said will be kept permanently available.
Maybe the slow cumulative effect on our culture, its fundamental weariness and skepticism at originality, will be huge. Or maybe at the societal level this will never matter much, as we, in the summer of each generation, still manage to reissue unknowing reboots of the classics.
The alternative is that we accept a breakdown of the genotype/phenotype distinction for ideas, maybe even for science itself: let the phenotypes speak for themselves and not worry about attribution. Yet even if this acceptance grows widespread, for the conscious individual this new digital landscape can still be a paralyzing anxiety of influence. This essay probably already exists somewhere, parceled out among all the other essays elbowing for space in an infinite ocean of pages. And lo! During the writing of it I found a blog post that also tried to find the earliest mention of memes (turns out it’s Pliny), and provided a litany of sources earlier than Dawkins.
All this makes original intellectual work, and the feelings that go along with it, difficult to find space for. I wish to walk ahead into the pristine snow and forge a fresh path, but it fell days ago and already it is tracked over and patted down and dirtied by the boots of those who came before me.
I do not know why trampled snow troubles me so.