Task Generality, or, Sex with Robots

A Reverie for 2018–2019

grothendieckprime
Hardy-Littlewood
8 min read · May 5, 2022


“Ah jeez, am I going to have to have sex with a robot?” So rang one of the most striking questions I’ve ever been asked. It was 2018, and I’d just caught up with a friend and revealed that I was working at an artificial intelligence company. To the inquiry, I responded: “This is the first question you have about AI?” Sure, this was the era when articles were circulating in the New York Times about the ethics of android sex, and we all had that friend who was trying to force us to watch Ex Machina and Westworld, but why was this at the front of people’s (and directors’) minds? I can’t believe it took me years to reach an answer: this panic happened because we were already, metaphorically, having sex with robots.

While there’s a sexy topic here, I also want to talk broadly about the contemporaneous fear (one to which I personally subscribed) of looming mass unemployment. They are related, of course — we know that the sublimity of mass production seduces us. This most recent anxiety that the machines would become sentient was spurred by a run of breakthroughs at Stanford, OpenAI, and Berkeley that convinced a lot of people that we were going to see what’s called task generality. In short, AI was going to evolve from poorly tagging your Facebook photos, overcome Wittgenstein’s fable about how a sign cannot tell you how to read a sign, and become able to infer from the world what it ought to be doing.

The answer to the sexual anxiety comes, I think, a fortiori from the answer to the economic anxiety — in a poetic process of enantiodromia, we were turning ourselves into the robots. The economic-anxiety answer is a bit obvious in retrospect, but it led me nicely to the sexual-anxiety case.

Jesus, those couple years were a craze. I got into weird philosophical arguments the point of which now totally eludes me. The pandemic abruptly took over our attention, and I think during the last two years AI has begun to see serious (read: boring) adoption. Many companies have dropped the .ai domain name and have moved on to new key phrases (Industry 4.0, etc.). It is now, hopefully, widely known that most artificial intelligence systems are very task-specific and error-prone: we know it when we see uncanny-valley kinds of performance classifying emails as spam or telling us that we have to listen to curiously popular videos. It’s not clear philosophically how it would become anything more.

In 2018, though, people were terrified that automation was going to fire everyone or do, well, something horrible. I think Jacobin ran a piece about how programmers were going to unwittingly obviate their own jobs. Andrew Yang ran for President on a UBI platform and I gave him money. Software engineers in San Francisco started social clubs to discuss how we might be living in a simulation or how to stop AI from becoming sentient. Everyone was already scared about a bubble of some sort — we talked about how the Uber and Lyft IPOs were going to “eat San Francisco alive” — and there was this Nietzschean sense of chasing some tragedy that would redeem our dizzy professional lives. Remember Roko’s Basilisk?

To take seriously the very unpopular worldview of psychoanalysis, I interpret the constant fear of something terrible as an unconscious and concerted yearning for a tragedy. If you remember how exactly we dizzy professionals lived, it’s not hard to imagine what exactly we hoped to end. It’s been discussed since.

Tedious as they are, there are some comments to be made about the education system, and about the structure of romance (if you dare to call it that) facilitated by dating apps, and about the blurring of all professional boundaries to create a transactional haze. Fortunately, a recent piece has put it better than I ever could. There’s a sense in which one could feel like literally every facet of life was being measured and priced by a market. I cannot tell you how many times I heard the phrase “social capital” used to refer to friend groups. All my friends watched that episode of “Black Mirror.” And this — the ability to measure and optimize one’s performance in every facet of life — was the task generality we feared the robots would crush us at.

When I say “we,” of course, I do mean white-collar professionals in economic hubs. You, dear reader, are probably one of them. The origin and maladies of our professional managerial class are well-theorized. There are known portraits of the oversocialized, hypercompetitive, and risk-averse nervous professionals we churn out of Michigan and Penn and Berkeley every year — most of these kids have been on a track since elementary school and most of their friendships come from common pre-professional cohorts or the Internet. They become consultants and bankers and product managers, and spend very long days trying to polish slide decks for internal presentations and criticizing their peers’ choice of language. So do I. Ross Douthat complains routinely in the New York Times about the “overproduction of elites” hypothesis. American Affairs and Compact both run pieces about the outsize impact of this class of ours despite the constant anxiety of staying ahead.

To make at least one interpretation of the insufficiencies of the class lifestyle, allow me a personal musing. I’ve had a couple of great moral blessings in life: among other things, I keep a close circle of friends from my high school who can speak directly to each other with constructive criticism and genuine curiosity, and I attended a serious liberal arts university where I was brought into discourse with peers who hold fundamentally different values from my own. There is very little space for this kind of socialization anymore, and most of the nudging of professional-managerial life takes one further away from these kinds of blessings.

(Grad school, by the way, doesn’t seem to offer a way out of this. Allan Bloom described the world of liberal democracy into which the graduate emerges as an “intellectual wasteland” back in the sixties, but it’s not clear that the university is so different now, either, with its hyperspecialization of the disciplines and vocationalization of scholarship. So many PhDs end up just building or selling software.)

All this to say, there are clearly some civic virtues that are on the wane within the white-collar world and the academy.

Structurally, there’s no faulting the corporation: firms are trying to manage people rather than discuss or resolve any concerns deeper than the logistical ones. Technology firms like Google of course have a particularly bad reputation. Marc Andreessen seems to think that Google’s new thought-police feature is going to spread across the technology stack. He may not be wrong. Tech workers themselves probably get real value out of features that make them look more generally acceptable. If you believe the class analysis, the professional managerial class is simply manufacturing weapons to extend its own interests. Not, however, to dwell on the Orwellian elements — radical outlets have said more, and better, than I need to.

As I mentioned, I think it’s broadly clear by now how AI (or, if you prefer, “The Algorithm”) task-specifically shapes most of the experiences curated for us. Instagram and TikTok, of course, are easy examples. We know that our favorite brunch place is designed to get us to post content. Go out to a gallery viewing in New York City (and God knows I hope that some software engineer who reads this actually does) and you’re likely to hear the curator make some remark about which piece was the most shared object in the collection. Of course the curator knows this. And anyone taking the job seriously should! (What, you want galleries that aren’t responsive to popular interest?) There are about a million social media marketing jobs in Manhattan and it’s not hard to hire a marketing professional to run social media analytics for your organization. So of course, you’re already being curated by a whole industry staffed by humans acting like robots.

To get even more specific about the algorithmic curation of experience, return to the dating apps for a few more tedious comments. Here we arrive at the sexual-anxiety case: of course we know the literal sense in which bots appear while you’re swiping (and how this causes women to write “I’m a bot so dhmu” as the entirety of their bios), and less widely known, I hope, is the profusion of pick-up artists who hit YouTube trying to explain how to optimize your Tinder game to be “frame-perfect” (read: task-general). It’s terrible, let me tell you, to swipe through an app wondering if one’s profile is “good” only to hear advice one heard at work ring in one’s head: “you’re not going to get feedback from clients, and the only definition of ‘good’ is whether or not you consistently get the job done, so get past your ego and focus on the output.” Christ’s sake, one thinks, no one just hangs out in groups anymore because everyone’s time and social capital is so valuable, so I guess we’re all just clients.

So when one of those guys from Citibank who dresses like Patrick Bateman at Halloween parties matches with a blonde social media marketing girl on Tinder?

There you have it. Each party has to have sex with a robot.

So unravels the riddle of a few years. It’s no huge revelation in retrospect, but looking back on a Leviathan like the fear of mass automation and of incels abusing androids, I can’t help but feel foolish and manipulable. These are not popular or marketable feelings, of course, and there’s still a part of me that lives in those days and winces to think how poorly received such sentiment would have been. Now I’m ready to become one of those guys who lived through the dot-com boom and who only have shallow musings about “not getting your hopes up” to share with youngsters.

Ultimately, I do think the anxiety of 2018–2019 was just an understandable psychic backlash against too much of our own success. The fact is that American industry is such a juggernaut because we’ve invented managerial sciences and the discipline to ruthlessly measure and optimize outcomes. Frederick Taylor would be very proud. It is a triumph we shouldn’t gloss over that we can decouple the branding and marketing of an operation almost entirely from its productive or deliverable components. Despite the retrogression of our civic virtues, the professional managerial class is doing better than ever at tightening the labor market and demanding high pay and flexibility. Analyses like the above often beg for some grand answer about how we could live better or what was learned, but unfortunately I can only refer to the boring Aristotelian answer: seek moderation, even as you chase fame and fortune.

Seems like we found the tragedy we sought, though. We have run ourselves into a new spate of real global problems, and perhaps something like protracted antagonism with Russia can provide something to focus on. Maybe Samuel Huntington was right — perhaps the broader economic order could stop expecting so much task generality? Who knows? At least living in the world is more interesting now.
