Like most lawyers, I’m on Twitter a lot and, like most lawyers, I spend most of my time tweeting in an effort to seem funny. (To say the responses are “mixed” would be to assume that I ever get responses.) But occasionally, something will crop up that demands a response more nuanced than 280 characters and goes into a little more depth than a gif thread. This morning, I saw one, and I want to discuss the ethics of it with you. Last day of the year in the worst year on record? Perfect day to do some philosophy!
Today’s issue is about ethics largely in the sense that we’re going to take a measured look at something controversial and try to work out how to approach it. Specifically, I want to talk about this:
So our friends from Boston Dynamics are back, and not only does the now-infamous yellow dog make an appearance, so do two humanoid robots and some kind of…wheely…giraffe…thing. It’s self-referential, it gets that people are scared of the yellow dogs, it tries to defuse some of those concerns, and sort of says “we get that 2020 was rough, but this might make you smile.” Not bad marketing, that. But as I said, I want to take a look at this from an ethical perspective and give a measured analysis. Are you ready? Okay, here goes:
I hate this.
Just A Couple of Bro-Bots Dancing
That’s not quite what I mean to say — more precisely, I want to express that this is deeply unethical and I hate it. Why is it unethical? Let’s think about this a few ways. The first, and easiest, way is to think cynically, which would be to say that this is unethical because it is an effort (and a transparent one at that) to convince people that BD’s products are harmless and pose no threat to life or limb because they can dance. “Robots are fun! They’re here to do work for us and entertain us, so let’s not be concerned and let’s please everyone stop talking about Black Mirror.”
Yes, the cynical view is probably right (at least in part), but that’s not what makes this video so problematic, in my view. The real issue is that what you’re seeing is a visual lie. The robots are not dancing, even though it looks like they are. And that’s a big problem.
Humans dance for all kinds of reasons. We dance because we’re happy or angry, we dance to be part of a community or we do it by ourselves, we dance as part of elaborate rituals or because Bruce Springsteen held out a hand to us at a concert. Dancing, in fact, is one of the things that humans have in common across cultures, geographies, and time — we love to dance, and whenever we do it, it’s because we are taking part in an activity we understand to have some kind of meaning, even if we don’t know what it is. Perhaps that’s the point: how can we even explain dancing? As Isadora Duncan once said, “If I could tell you what it meant there would be no point in dancing it.”
Robots, though? Robots don’t dance. That’s not some sort of critique of a robot or shade-throwing. I don’t criticize my hammer for not being able to recite Seamus Heaney. Tools serve functions and move in the ways designed or concocted for them — but they have no inner life that swells and expresses itself in dancing. We might like to anthropomorphize them, imbue them with humanness, largely because we do that to everything. We talk to our toasters and cut deals with our cars (“Just make it ten more miles!”) because we relate to a world filled with things made by humans as though that world was filled with humans, or at least things with a little humanity. And so when we watch the video, we see robots moving in a way that we sometimes do or wish we could, we experience the music, the rhythmic motion, the human-like gestures, and they all combine to give us an impression of joyfulness, exuberance, and the idea that we should love them, now that they can dance.
But they can’t.
The Man Behind the Curtain
No, robots don’t dance: they carry out the very precise movements that their — exceedingly clever — programmers design to move in a way that humans will perceive as dancing. It is a simulacrum, a trompe l’oeil, a conjurer’s trick. And it works not because of something inherent in the machinery, but because of something inherent in us: our ever-present capacity for finding the familiar. It looks like human dancing, except it’s an utterly meaningless act, stripped of any social, cultural, historical, or religious context, and carried out as a humblebrag show of technological might. Also: the robots are terrible at doing the Mashed Potato.
Consider, though, the minutiae these robots execute to make the illusion more convincing, fans whirring and telemetry readings off the charts. These robots were commanded to move in exactly the way the programmer wanted to get exactly the outcome the programmer wanted. This is the technical equivalent of Yosemite Sam screaming “Dance, varmint!” and shooting at Bugs’s feet. And in a real way, the dog and the giraffey-thing are less problematic — they’re very clearly not-human, and so the programming there might have fewer nasty connotations. But those two humanoid robots dancing around? Let’s unpack it a little.
Just Like Us?
Why do people design humanoid robots? It’s not like humans have the most efficient structure to get work done; that’s literally the reason we invented robots. Why, then? If we’re being generous, it could be that we want something with that sense of familiarity, something that isn’t so nerve-wracking to interact with. If we’re being realists, it’s because we want something that looks like a human to boss around. I guess the way to tell which view is right is to check the primary use cases for human-like robots; that will make clear whether we’re being exploitative or not. I’ll let you go look up how humanoid robots are primarily used right now.
The moment we get high-functioning, human-like robots, we sexualize them or force them to move in ways that we think are entertaining, or both. And this is where the ethics become so crucial. We don’t owe robots human rights; they aren’t human, and we should really be spending our time figuring out how to make sure that humans have human rights. But when we allow, celebrate, and laugh at things like this Boston Dynamics video, we’re tacitly approving a view of the world where domination and control over pseudo-humans becomes increasingly hard to distinguish from the same desire for domination and control over actual humans.
Any ethical framework would tell you this is troubling. You don’t need to know your consequentialism from your deontology to understand that cultivating and promoting a view of the world where “things that are human-like but less human than I am get to be used however I want” will be a problem. History, not philosophy, gives us all of the examples we need to understand why. This is the conversation we need to be having — and some really excellent scholars, writers, engineers, and programmers are.
But we need to reach a consensus about this kind of problem, because it is not going away. Boston Dynamics and other firms will continue to produce this kind of content designed to evoke the “right” kind of response, all the while embedding and normalizing a view of humanity’s relationship to technological devices that undermines the things that make us human. I’m not saying they’re necessarily doing it intentionally; it doesn’t matter. It’s going to happen, just like a programmed high kick from a robot. If we don’t bring thoughtfulness and care to this discussion instead of glib jokes about Skynet, it will be too late to infuse ethics into how we create and treat things that will look increasingly like us. And if that happens, when that particular mirror is held up to us, we may not like what we see.