You Know What Sucks? Or: a rant about research, immortality, and the Ham Sandwich theorem.
You know what sucks? Even if by some miracle, all this Ph.D. research stuff pays off, if by some miracle I discover or invent something so important people call it “King’s Theory” or “King’s Conjecture” or something like that, it will almost certainly not be taught to undergrads. All the easy stuff is taken. How I long to have a L’Hôpital’s rule, a Krebs cycle, a Maslow’s hierarchy: something that most educated people are at least familiar with. Something people have heard of and would think, “yeah, I feel like they talked about that in Freshman chemistry”. Something like that.
Any reasonable contribution anyone can make now is so dense and abstruse that even professors of the subject would have to read your thesis closely to understand it. And maybe that’s just a result of my field. I study neural networks, which are already not discussed in freshman computer science courses. Even worse, I study their application to graph data structures, something I only learned was a concept in my sophomore year of college. Granted, these are concepts a high school student could generally understand, but I long for my work to torture future middle schoolers. Is that so much to ask?
I feel like we’ve reached a point where any reasonable contribution to science takes 4 years of study just to grasp, and maybe an additional year or two to understand why it’s important. For example, one of the most seminal papers I can think of in my field from the last ten or so years is “Attention is All You Need” by Vaswani et al., 2017. This paper lays the groundwork for the language models people online are abuzz about right now. You know, stuff like ChatGPT, Google Translate, even Replika. These are all products that build on this work, but understanding Vaswani’s contributions to AI would take years of study. I still don’t fully grasp it (at least the why of how it works; I get the math and stuff). Even then, I don’t think there will be a “Vaswani’s Principle” talked about in hushed tones among frantic freshmen fretting over finals.
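(If you’re curious what “the math and stuff” looks like, here is a minimal NumPy sketch of the scaled dot-product attention at the heart of that paper. This is my own toy rendering, stripped of the learned projections, masking, and multiple heads that make the real thing work; the function name and the dimensions are mine.)

```python
import numpy as np

def attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the core operation of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # how much each query "attends" to each key
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax numerically
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # weighted average of the values

# Toy self-attention over 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
print(attention(X, X, X).shape)  # (3, 4): one context-mixed vector per token
```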
Similarly, in pure mathematics, we have the AKS primality test. This is an algorithm discovered (invented?) by Agrawal, Kayal, and Saxena to determine if a number is prime in polynomial time. I know what you’re thinking, “What the f**k does that mean?” and that’s my point! It’s super useful and important, but knowing why requires at least a cursory understanding of computational complexity, analysis, and number theory. And that’s just to know why it’s important. Forget applying it! Even older seminal works, like Abel’s proof that there is no general formula in radicals for solving quintic polynomials or Galois’ proof of exactly when a polynomial of any degree can be solved by radicals¹, are understandable in what they’re solving, but understanding how these ideas were reached is impossible without years of pure math study.
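(To make the complexity point concrete, here is the schoolbook primality test in Python. This is emphatically not AKS itself, just my own toy illustration of the yardstick involved: “polynomial time” is measured in the number of digits of the input, not its value, and trial division fails that bar badly.)

```python
import math

def is_prime_trial_division(n: int) -> bool:
    """Correct, but NOT polynomial time: it does ~sqrt(n) divisions,
    and sqrt(n) is about 2^(b/2) for a b-bit input. AKS's achievement
    was a deterministic test running in time polynomial in b."""
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

print(is_prime_trial_division(97))     # True, instantly
# is_prime_trial_division(2**127 - 1)  # a 39-digit prime; don't wait up
```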
To further illustrate this, let’s look at a sample of the most important research of the last year. Now, it’s not easy to quantify “the most important paper” from 2022; it’s highly variable between different disciplines, sub-disciplines, who you ask, etc. It’s also very difficult to know what’s important because the important work just hasn’t been cited yet. The review cycle for any decent journal is about 6 months to get soft-accepted, another month for a rebuttal and a rewrite, and finally, a few months later, it’s actually published (hopefully in a conference with a venue near the beach). To then cite that work requires hundreds of other people to also write a paper (at least 6 months if it’s any good) and sit through their own review cycles and so on. So any paper from 2022, no matter how seminal, would be lucky to have any serious citations by 2023. Additionally, some of the most important work from 2022 may not have even been published yet! Suppose, for example, that last December your lab achieved fusion ignition in a breakthrough for clean energy². Sure, you can write a short paper, or a letter to a journal in anticipation of a larger paper³, but it’s far better to just do a press briefing to get the news out, and wait a year to get published in one of the top journals.
So maybe 2022 will be a little tricky to pin down, at least for right now. No problem. Let’s check out the most cited papers of the past few years. Maybe we can see what the folks at Nature have been up to the past few- oh wait⁴. Well. It’s a fluke that we can understand that one. The next most cited papers (omitting COVID-related ones) from the ’20s are about: the NumPy Python library⁵; a neural net that predicts protein structures; something called “the mutational constraint spectrum”; and another paper about neural nets predicting protein structures. All cool papers, but all completely opaque in their meaning without deep prior knowledge of biology (and, surprisingly, a lot of AI).
But maybe COVID took the wind out of a lot of other discoveries’ sails. Maybe looking at a predominantly biology journal was a poor sample of the understandability of modern science. I barely passed high school bio, so maybe I’m not the best person to ask about how important or understandable these works are. So let’s instead turn to a metric I think most people can agree is a valid proxy for scientific importance: prizes. Let’s go down the list, shall we?
- Nobel prize in Physics: “for experiments with entangled photons, establishing the violation of Bell inequalities”.
- Nobel prize in Chemistry: “for the development of click chemistry and bioorthogonal chemistry”.
- Nobel prize in Physiology or Medicine: “for…discoveries concerning the genomes of extinct hominins and human evolution”.⁶
- Turing Award (Computing): “for pioneering contributions to numerical algorithms and libraries that enabled high-performance computational software to keep pace with exponential hardware improvements for over four decades”.
- Fields Medal (Math): “for bringing the ideas of Hodge theory to combinatorics, the proof of the Dowling–Wilson conjecture for geometric lattices, the proof of the Heron–Rota–Welsh conjecture for matroids, the development of the theory of Lorentzian polynomials, and the proof of the strong Mason conjecture”.⁷
And, uh, no offense, but I’m going to skip the humanities. As an avid reader of trash literature and science fiction, I don’t need to tell you that I wouldn’t understand the work of the Nobel laureate for Literature.
But honestly, did you understand any of those? Did any grouping of words in those descriptions really make sense? Perhaps I have especially educated readers who can school me on whatever “orthogonal chemistry” is, but, with all due respect, I doubt it.⁸ So what is one to do? How is a researcher to attain immortality?
One stumbling block to this end is the decrease in the viability of what’s called “small” or “table-top” science. That is to say, experiments that an individual can do at home, without billions of dollars of R&D investment. Over the years, this kind of science has been dwindling. To be sure, it’s still quite possible to do individual, essentially free research in fields like math, literature, or philosophy, where the majority of grant money goes to pay for overworked grad students’ coffee bills. But you’d be hard-pressed to make a discovery in a more physically oriented science without the aid of, say, a supercollider. This reliance on a large team of researchers limits one’s ability to make individually attributed discoveries almost by definition, but I think it speaks to a larger stumbling block for scientists chasing immortality: the obvious stuff is already taken. The days of beautiful math proofs that make perfect sense as soon as you see the answer may be gone. The days of taking careful notes on your pea pods to discover heredity, of tossing cannonballs from tall buildings to learn about gravity, the days when you could do something ordinary and simply write down what happened: those days seem long gone.
I feel most jealous of the tabletop scientists of earlier centuries. Take Robert Goddard for example. The father of rocketry, a name known by even the most dabbling of nerds. The Robert H. Goddard papers state that “he had in his spare time…developed the mathematics which allowed him to calculate the position and velocity of a rocket in vertical flight” [emphasis added]. Speaking of developing important ideas in your spare time, let’s talk Newton. In A Short Account of the History of Mathematics, Ball states that he “distinctly advance[d] every branch of mathematics then studied”. Newton invented the field of calculus at the age of 23 in his free time, in his “private studies at home”. Now, look. I’m not saying Newton was some sort of dummy. Far from it. What I’m getting at is that before the 20th century, we lived in an age of individual discovery. There was a time when a single individual could put their head down, crank out some numbers, and stumble upon something so seminal that it created an entire field of mathematics.
Can any modern grad student hope to do the same?
Perhaps I’m being too pessimistic. I’m the guy in the apocryphal story, standing in the modern art museum, looking at a Pollock, or a Mondrian, or a Rothko, and saying, “I could have made this piece of crap”. His sage friend looks at him and says, “But, you didn’t”. Maybe so. But is there anything left that I could have made? Is there anything simple yet undiscovered? Perhaps when some new physics comes along with the aid of those fancy supercolliders, it will break open a whole new field of mathematics that no one had thought to explore. Maybe there will be a new gold rush for discoveries, allowing me to find my “King’s Inequality” that will make a generation of grad students quake in their boots. Maybe.
But if not, I worry that by making the sciences inaccessible without massive funding, we will lose out even on discoveries that would never be named for an individual. If advancing a field costs billions, then its results must be valued similarly. So we’ll lose out on new iterations of the Ham Sandwich Theorem⁹, or follow-up works to Sendler, 2017 (p. 69–73)¹⁰, or determinations of the minimum number of sudoku clues required for a valid puzzle.¹¹ All papers that are interesting in a theoretical sense, but probably not worth billions of dollars of funding. And sure, these papers were all inexpensive, but what about the ones that aren’t? What if there’s some solution to an open question that’s only accessible with billions of dollars of funding, but the results just aren’t that valuable outside of theory? It will just remain unsolved.
It seems to me that as individual discovery is curtailed and experimental results become harder to interpret, we also incentivize more of the same. More papers will be impossible to reproduce without the money to rerun the experiments, fewer scientists will be household names, and science will be harder to understand. I don’t claim that this is a bad thing, necessarily, but it’s certainly disheartening.
So, I’ll just coin it now. King’s Theory: as time passes, the namesakes of the theories, theorems, proofs, and principles that high schoolers are forced to learn will have been dead for longer and longer. This will be the final named theory taught outside of grad school, as everything else simple enough to grasp has already been discovered.
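If you’ll permit a tongue-in-cheek formalization, in notation I am inventing on the spot:

```latex
% King's Theory, in made-up notation. Let N(t) be the set of named
% results in the high-school curriculum in year t, and d(x) the year
% result x's namesake died. Define the "freshness gap"
\[
  G(t) \;=\; \min_{x \in N(t)} \bigl( t - d(x) \bigr).
\]
% The claim: even the most recently deceased namesake recedes
% without bound.
\[
  \lim_{t \to \infty} G(t) = \infty
\]
```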
- Only degrees 1 through 4 have general solutions in radicals, in case you’re curious (and I guess x⁰ if you’re being really pedantic)
- You know, like the Lawrence Livermore National Lab did in December of 2022
- See, for example, Abu-Shawareb, H., et al. “Lawson criterion for ignition exceeded in an inertial fusion experiment.” Physical Review Letters 129.7 (2022): 075001, the precursor to the actual fusion breakthrough, which as of this writing is still unpublished. This letter has received only around 50 citations so far, but then, it was published only about 6 months ago.
- Zhou, Peng, et al. “A pneumonia outbreak associated with a new coronavirus of probable bat origin.” Nature 579.7798 (2020): 270–273. Currently sitting at a cool 20,000 citations in 2 years.
- Which (counting its predecessor, Numeric) has been around since 1995, so I’m not sure why there’s a paper about it from 2020, and in Nature no less?
- Notably, this was won by one guy: Svante Pääbo. Kind of strange considering most of these prizes are only attainable if you have a huge team of researchers behind you. In fact, a while ago they had to implement a rule stating that only 3 people could be co-awarded the prize because the teams were growing so large. I’m really not sure how they determine who gets it. If it were me, I wouldn’t bow out with grace and deference to the superiors on my team. I’d be fighting tooth and nail for it even if I were a deep middle-author. But that’s just me.
- There were actually 4 Fields Medals awarded in 2022; this was just the most jargony one. The others were for “solving longstanding problems in the probabilistic theory of phase transitions in statistical physics”, “advances in the understanding of the structure of prime numbers and in Diophantine approximation”, and — my personal favorite — “proof that the E8 lattice provides the densest packing of identical spheres in 8 dimensions”
- To the good readers of Medium, this was originally posted on a personal blog. Perhaps one of you can indeed tell me all about orthogonal chemistry. I excitedly await your comments.
- For any n objects in n-dimensional space, it is always possible to divide all of them exactly in half with a single (n-1)-dimensional cut. Not unlike a single slice through a sandwich halving the ham and both pieces of bread in the 3-dimensional case. Of course, in the 2-dimensional case, this is known as the “pancake theorem”. (A formal statement, for the curious, follows these notes.)
- “Similar Mechanisms of Traumatic Rectal Injuries in Patients who had Anal Sex with Animals to Those Who Were Butt-fisted by Human Sexual Partners”, Journal of Forensic and Legal Medicine.
- McGuire, Gary, Bastian Tugemann, and Gilles Civario. “There is no 16-clue Sudoku: Solving the Sudoku minimum number of clues problem via hitting set enumeration.” Experimental Mathematics 23.2 (2014): 190–217.
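And since note 9 promised it, here is a formal statement of the Ham Sandwich Theorem in standard measure-theoretic dress; the phrasing is mine:

```latex
% Ham Sandwich Theorem: given any n bounded measurable sets
% A_1, ..., A_n in R^n, some hyperplane H splits every one of
% them into two halves of equal volume.
\[
  \exists\, H \subset \mathbb{R}^n :\quad
  \operatorname{vol}\!\left(A_i \cap H^{+}\right)
  \;=\;
  \operatorname{vol}\!\left(A_i \cap H^{-}\right),
  \qquad i = 1, \dots, n
\]
```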