Sugar (2008), dirs. Anna Boden and Ryan Fleck. DP: Andrij Parekh. HBO Films

The Autumn of Ideas

An education in math, data science, and the American ethos prompts a hard realization: gradually understanding your place in a global distribution

I read A Mathematician’s Apology before I understood what real mathematics was. G.H. Hardy’s 1940 treatise is in part an explanation of the beauty of pure math: not the computational, grade-school drudgery that springs to mind for many, but a creative discipline centered on devising proofs, uncovering patterns, generalizing known objects to more complex conceptions — and not necessarily in the service of any immediate application. To demonstrate, Hardy cites two important results, usually attributed to the ancient Greeks: that the sequence of prime numbers is infinite, and that the square root of 2 cannot be written as the quotient of two integers, what we today call a rational number.

The second statement is especially interesting. Imagine a unit square, with all sides of length 1. By the Pythagorean theorem, its diagonal has a length of √2, or about 1.414. Without any of our contemporary knowledge, it would have seemed very reasonable to surmise that all partial quantities between the integers could be attained through the operation of division; after all, you can see a length that is some “fraction” between 1 and 2. The idea that the rational numbers are somehow not complete — that you can choose a subset of rationals that gets closer and closer to √2 without ever reaching a least upper bound in those rationals — is quite a modern notion, only formalized in the nineteenth century!
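
For readers who want to see the argument itself, here is a compressed sketch of the classical reductio, essentially the one Hardy reproduces in the Apology (the notation below is mine):

\[
\text{Suppose } \sqrt{2} = \frac{p}{q} \text{ for integers } p, q \text{ with no common factor. Then } p^2 = 2q^2,
\]
\[
\text{so } p^2 \text{ is even, hence } p \text{ is even: write } p = 2k. \text{ Then } 4k^2 = 2q^2, \text{ so } q^2 = 2k^2 \text{ and } q \text{ is even too,}
\]
\[
\text{contradicting the assumption that } p \text{ and } q \text{ share no common factor. No such fraction can exist.}
\]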

But the construction of the real number field is certainly not an exciting narrative for everyone, and indeed, Hardy offers his two example theorems as a kind of litmus test: “a reader who cannot appreciate them,” he pointedly writes, “is unlikely to appreciate anything in mathematics.” It is a rather harsh statement from a man given to such unmitigated declarations, but to Hardy’s credit, it is a bluntness he consistently applies throughout his essay: to the idea of natural talent, to vocational choices, and ultimately, turned inward to himself. For besides being a spirited justification of math as art, A Mathematician’s Apology is also, strikingly, a melancholy acceptance at old age of the loss of one’s creative faculties.

That permanent loss haunts Hardy’s essay. In declaring that math is a “young man’s game” and that youth’s “first duty” is to be highly ambitious, he makes it clear that his calling, the subject he exalts so passionately, is no longer something he can even engage in; his prime years of creativity passed long before the sixties from which he was writing.

“It is plain now,” he confides, “that my life, for what it is worth, is finished, and that nothing I can do can perceptibly increase or diminish its value.”

For all of Hardy’s attempts to convince a lay audience of math’s aesthetic pleasure, any reader is more likely to finish Apology with a greater empathy for Hardy himself and his naked reckoning with decline. The essay transforms from a math disquisition into a sobering memento mori.

The issue of whether Hardy’s capitulation was premature or sensible immediately arises, but it is a difficult one to address. These are questions we constantly ask ourselves — “Do I have the competence and energy to pursue something further? What can I do to make my life meaningful?” — but the stakes when posing them in later years are nothing less than the final testament of how you’ve conducted your life. Hardy is a logical thinker by trade, but maybe his surrender is born of an idiosyncratic austerity. To claim so generally that math is a “young man’s game” seems too acquiescent, as if there might be true boundaries constraining the fertility of the mind.

I would more readily think of sports as laying claim to a title of “young person’s game.” (Is it not the flesh that’s weak, and the spirit that’s willing?) The physical demands on the professional athlete’s body are tremendous and relentless: muscle fibers stretch under exertion; break, rest, rebuild; emerge stronger, until eventually they don’t — not in the same way, with the same ease. And biologically, ineluctable things are happening anyway, regardless of athletic strain. Cells age. Something about dwindling telomeres at the ends of chromosomes. Le Temps mange la vie. So there is a fear even more troubling, more piercingly existential, in any proposition intimating that activities of the mind might also be the young person’s exclusive dominion.

The counterexamples will undoubtedly abound — the long list of men and women who have hewn their masterpieces out of the block of their latter years — but it behooves us not to dismiss Hardy’s severity so easily, and instead to seriously contend with his life-spanning perspective. Are only certain careers a young person’s game, or does much of life fall into that category? And is it ever prudent to concede defeat in the lifelong quest to matter?

I have returned to A Mathematician’s Apology more than a decade after my first experience with it. And that elapsed interval coincided with the revealing decade that was my twenties, when notions of invincibility dissolved, when certain personal limitations were made manifest — when the finitude of the mortal coil started to make a little bit of sense. There is a universality in Hardy’s resignation, a grieving over lost abilities, and he approaches it with a straightforward stoicism. I, on the other hand, ruminate in a much more sensitive manner, and I grieve not over grand abilities lost, but over abilities that perhaps never even matched ambitions.


I currently make a living as a data scientist. Different people have their own definition of what that is exactly, and in the age of dev bootcamps and massive open online courses and an autodidact’s Stack Overflow search results, there is no standard certification of such an unusual title. Many data scientists hold doctorates in computer science; some have master’s degrees in statistics or electrical engineering; some could be self-aggrandizing business analysts (I could be an imposter, too). So while academic departments at universities like NYU, Columbia, Stanford, and Berkeley have begun to codify, at least at the terminal master’s level, what constitutes this interdisciplinary field, I will say simply that data science is a rich combination of computer science, statistics, and applied math. It is the thing that powers Siri’s natural language understanding, Google Photos’ object recognition, Tesla’s self-driving cars; its rapid advancement presages an epochal shift in the very nature of human labor, akin to another Industrial Revolution. Economist Hal Varian of Google famously declared in 2009 that statistician would be the sexy job of the coming decade. Perhaps “data scientist” was just shy of becoming a mainstream term at the time, but in the realm of big data, the statistician he called out was most likely the data scientist we know today.

I had zero plans — and definitely no prescience about where the digital world was headed — to position myself so conveniently amid job sexiness; I came to the career in a pretty indirect way. When I graduated from high school, my original plan was to become a high school English teacher. I loved literature, the way that words flow and soar and come together symphonically, and I thought that teaching English would be the best way to discuss literature for a living.

But in that last year of high school, a microeconomics class also got tangled in the unpruned overgrowth of my attentions. The formal modeling of concepts that I recognized from my everyday behavior (Opportunity cost! Diminishing marginal utility!) was my first exposure to how the inexhaustible toolbox of applied math could uncover the regularity of systems in the world. And when I finally got to college, I admittedly got swept up by a trend: countless undergrads prior to the Lehman Brothers collapse wanted to become investment bankers and management consultants, because that’s where the money was. You could get a six-figure salary right out of college; Goldman Sachs and the Boston Consulting Group shone with the patina of prestige in the way that Harvard did for universities. So economics was our most popular major, a direct feeder into those careers; the on-campus recruitment closely matched the blind demand; and my fellow econ majors and I fulfilled the role, to use William Deresiewicz’s term, of excellent sheep.

To separate myself from the banking novitiates in the same department, I made it a point to always opt for the heavily quantitative courses. For what is economics, analysis at the margin, without calculus? But I went all Icarus about it, overzealously enrolling in graduate courses — and, ultimately, in what I now know to be real mathematics. Grown-ass math. “If you don’t get this, you are unlikely to appreciate anything in math” math.

I was extraordinarily unprepared. I’d been so good at that computational, grade-school drudgery, calculating convergent series, integrating functions by hand — trigonometric substitution here, integration by parts there. Freshman linear algebra was a turning point, because it required a level of abstraction to which I’d had no exposure. So instead of appreciating the structure of a vector space — an object that would later turn out to be absolutely central to the data work I do today — I was trying to gather a mass of theorems in my head with no mental framework of actual comprehension binding them together.

This naïve approach to mathematics continued to unravel rapidly for me. To keep afloat in my analysis class (“analysis,” in this case, being a very generic name for the mathematical study of functions), I had to shallowly memorize definitions like

A topological space is compact if each of its open covers has a finite subcover —

only to realize that I didn’t know what open and cover were. Of course I’d then have to go down the branching trail of definitions and look up limit point, and neighborhood, and ball, and eventually drown in theory that my brain just could not unify. The start of the trail, the “compactness” of a space, was just a heterogeneous tangle of words to me, signifying some elegant property beyond my comprehension.
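
To give a sense of how that trail branches, here is one standard chain of definitions from the metric-space setting of a first analysis course; any real analysis text states these, though the compressed notation below is mine:

\[
B(x, r) = \{\, y : d(x, y) < r \,\} \ \text{ (a ball)}, \qquad
U \text{ is open} \iff \text{every } x \in U \text{ has some } B(x, r) \subseteq U,
\]
\[
\{U_\alpha\} \text{ is an open cover of } X \iff \text{each } U_\alpha \text{ is open and } X \subseteq \bigcup_\alpha U_\alpha,
\]
\[
X \text{ is compact} \iff \text{every open cover of } X \text{ has a finite subcover } U_{\alpha_1}, \ldots, U_{\alpha_n}.
\]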

The hubris still didn’t end there. I also tried out abstract algebra, the branch of math that takes the fundamental objects of the drudgery — numbers and operations themselves — and generalizes even those to higher structures like groups, rings, and fields. The class was filled with mathematicians cozily at home with abstract perception, fluent in the art of the proof. My tangle of words devolved further into random phonemes. I dropped the course after failing the first midterm. I had reached the least upper bound on my abilities.


It’s a strange thing to wander and come up against your own mediocrity. Joseph Heller parodied a famous line from Shakespeare to quip, “Some men are born mediocre, some men achieve mediocrity, and some men have mediocrity thrust upon them.” The idea of working hard to achieve your own personal level of mediocrity sounds in line with the absurdity of Catch-22, but that second case is a sharp description of the bucket I fall squarely into.

Before a reader dismisses this as some kind of coy self-deprecation, I warn you that I profit little from false modesty. This economy and this culture reward the brash, the overconfident; I do myself a disservice to minimize my stature on the job market. My self-evaluation, rather, is very similar to the one logically deduced by Hardy, the man who affirmed that “most people can do nothing at all well” — the key difference, again, being that I have confronted my intrinsic limitations at half his age, and without the fecundity of his prime years. I want to do so many things but am not good at most of them. I have intellectual interests that I will never have time to cultivate. I cannot be a world-class expert in all of them, and in fact, will probably not be a world-class expert in any of them. As the curmudgeon further explains:

It is a tiny minority who can do anything really well, and the number of men who can do two things well is negligible. If a man has any genuine talent, he should be ready to make almost any sacrifice in order to cultivate it to the full.

The point ends with one of the more positive comments coming out of Hardy, and it’s I who am now compelled to cast the imposing shadow of reality: it is a stroke of luck if that genuine talent, already improbable, also lines up with the market.

I did turn all this into a career, I suppose. I was clearly never going to be a pure mathematician, but I picked up databases and data infrastructure on the job, and I molded those skills and my working knowledge of statistics into the trappings of a data scientist.

I’m proud of what I have done in this role, and I think I’m good at it, but my original training has also been superseded twice now: first by “traditional” machine learning, which is still not part of the standard methodologies of all economists today, and then by the deep learning techniques currently in vogue. It is incumbent on anyone in the tech industry to continue his or her education to stay competitive, and I dutifully comply; still, I often feel like I’m practically switching careers when things change so seismically. I can see how my skills transfer, but I don’t think everyone in the industry necessarily does. If data science is a combination of computer science, statistics, and applied math, then the constant innovations make me feel my shortcomings three times over — shortcomings as a software engineer (which I’m not), as a theoretical statistician (Remember college?), and as a savvy practitioner of quantitative methods. This is having your own mediocrity thrust upon you.

Rather than enjoying the nourishment of continuing education, I instead feel the ponderous weight of time to escape insignificance. A Ph.D. could close my gap in formal training and compensate for the mistakes of my youthful impetuosity, but it would also bring me close to my forties, and in all likelihood it would be a misguided gambit for legitimacy. Furthermore, there are much younger people — the ghost of Hardy returns! — who can play this game with absolute ease while I struggle with the rudiments. The almost inevitable conclusion is that I’m just not in Hardy’s “tiny minority” who can do this “really well” (emphasis his); I muse about spending several arduous years of catch-up when my current state is roughly representative of what I’m capable of. Everyone has some homeostatic condition, some self-sustaining equilibrium where certain constants are maintained and life moves forward; and whether I like it or not, for a while I’ve been converging to my stationary point.


So then why do these conclusions, Hardy’s and mine, sound so fatalistic, so acquiescent? Why do Hardy’s barbs about innate talent grate on the ears?

I think, specifically, that they grate on American ears. It seems antithetical to the American spirit (the old-fashioned, Mr. Smith Goes to Washington kind) to think that the greatest success is reserved for the naturally talented, and that the mediocre must submit to cold realities — that there could be boundaries on anyone, really. That American spirit might also shun the proposition that youth presents the one fleeting opportunity to capitalize on talent, flaring intensely, then woefully decaying — because we seem to think that hard work, by anyone, in any circumstance, is enough to get a person anywhere. Hard work is a sufficient condition for unbounded mobility. If some area of knowledge excites me, I should be able to make a life out of it. I’ll have to work hard, but the goal should be attainable in a lifetime.

In giving his jocular take on the dire problem of income inequality, British comedian John Oliver proclaimed his genuine admiration for this brand of unbridled American idealism: we acknowledge systemic problems that put the vast majority of citizens at an economic disadvantage, but we still think it could all work out for us. He perfectly summarizes the stance: “I can clearly see this game is rigged, which is what’s gonna make it so sweet when I win this thing!”

Leo McCarey’s 1935 film Ruggles of Red Gap further draws out the American view of mobility in contrast to its British alternative when Ruggles, an English butler played by Charles Laughton, moves to the United States under new masters, Egbert and Effie. He initially adheres to the hidebound class system he knows, refusing to sit at the same table when Egbert invites him to a drink because they belong to different castes. But as Ruggles grows accustomed to his new surroundings, he embraces his independence and eventually decides to start his own enterprise, breaking a lineage of service roles. When organizing his new restaurant, he triumphantly reflects that he has become a “person of importance, however small. A man whose decisions and whose future are in his own hands.”

This ethos, whether or not it is steeped in credulity, is inextricably embedded in the American fabric, in the majesty of Whitman, the self-reliance of the Transcendentalists; in our myths of the Old West; in the “Formation” of Beyoncé and the “American Oxygen” of Rihanna. I cite Rihanna with sincerity, as a listener who found himself unexpectedly and deeply moved by her commanding lyric, “We sweat for a nickel and a dime / Turn it into an empire,” and by the simple tableau inscribed by “Young girl, hustling / On the other side of the ocean.” By the time she returns to this construction,

Young boy, hustling
Trying to get the wheels in motion
But he can be anything at all
In America, America

I am honestly trying to see myself, and I wonder whether to submit to a legitimate cynicism and resolve that these American narratives are treacherously misleading, or whether to obstinately maintain the indomitable optimism that no other country can wield.

It is a magnificent worldview, but it is at odds with the plight of everyone trying to get the wheels in motion today, mired in shameful inequity. It somehow becomes necessary to compose the also-ran’s and the dilettante’s apology, and the apology of those with modest beginnings and faith in the lottery — an attempt to justify the average person’s manifest destiny to find happiness in a fair system and achieve a level of independence.


The experience of gradually realizing your personal limitations is so affecting and so humbling that I had to give it a name. I formalize it as a reusable structure of high generality and complexity: I’ve personally been calling it the Sugar Effect, after the 2008 film Sugar by Anna Boden and Ryan Fleck.

Sugar tells the story of Miguel “Sugar” Santos, a promising baseball player in the Dominican Republic who gets recruited to play ball in the United States. Because of the language barrier, social setbacks, socioeconomic differences, loneliness, and most tellingly, a level of performance that faces its first hard limitations on the field amid a global pool of talent, with its never-ending pipeline of upstarts — Sugar cannot really make it even in the minor leagues. In the end, he abandons the professional arena that was his ticket to America, still continuing to play baseball at a local, less competitive level that fits him, but starting some life not predicated on the talent that ostensibly distinguished him.

I don’t know if other people read Sugar the same way I do. But its narrative has stuck with me for a long time. I still remember the film’s final shot, pulling focus on the baseball diamond’s chain-link fence, and I recall Antoine Doinel in The 400 Blows, clawing to freedom but then instantaneously facing the double-edged sword of autonomy — a future wholly unmoored and unguided, the terrifying white of the tabula rasa.

This motivates the following definition.

Definition 1.1. A personal transformation is called the Sugar Effect if you severely readjust your understanding of yourself after having been placed in a larger sample of people, with a distribution closer to global reality.

I hide behind the formality of a definition because the plain words are too frank. The Sugar Effect is realizing you don’t measure up, and you have to rethink your life. Maybe that’s surrendering, or maybe that’s just adulthood.
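
And in the same mock-formal spirit, a back-of-the-envelope example makes the “larger sample” concrete; the figures below are round numbers chosen purely for illustration, not anything taken from the film.

Example 1.2. Being the single best ballplayer in a town of 10,000 sounds like a one-in-ten-thousand distinction. But if that level of play were even roughly as common elsewhere, a world of about seven billion people would contain on the order of

\[
7{,}000{,}000{,}000 \times \frac{1}{10{,}000} = 700{,}000
\]

players at or above it, all funneling toward roughly 750 active major-league roster spots. The local standout is, globally, one face in a very large crowd.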

Mike Birbiglia’s new film Don’t Think Twice depicts the Sugar Effect superbly in the artistic world, where a troupe of improv actors all long for writing and acting gigs on a genericized Saturday Night Live. The hallowed variety show is one of the dreams of the aspiring comedian, but the difficult reality of competition with people just as ambitious and talented (or worse, not as ambitious and much more talented) dashes the hopes of most aspirants in these highly subjective endeavors, too. Moreover, there are strict gatekeepers in these entrenched institutions, and they are not necessarily meritocratic. One actor in the ensemble concludes, “Your thirties are about realizing how dumb it was to hope,” while another eventually discovers genuine fulfillment at the troupe level, just as Sugar perceives his fit in community baseball.

The best outcome after going through the Sugar Effect is to grow in self-knowledge from the reevaluation it unpleasantly forces. Break, rest, rebuild, emerge stronger. I feel this momentous transition very poignantly in Baudelaire, the originator of “Le Temps mange la vie” — Time eats life. Also in his thirties when “The Enemy” was published in The Flowers of Evil, he extends a metaphor describing the storm that was his youth, laying his garden to waste. “Voilà que j’ai touché l’automne des idées”:

Now I have reached the autumn of ideas
And I must use the shovel and the rake
In order to regather the inundated earth
Where water digs holes as big as graves.

“The autumn of ideas” is a forlorn phrase, but radiantly beautiful, too, like the marigolds and crimsons of fallen leaves. The regathering is there, as is the memento mori, ever present.