Stephen Casper, email@example.com
Euconoclastic blog series
Could any finite number of people experiencing something slightly unpleasant — such as getting a speck of dust in their eye — ever outweigh one person being tortured for 50 years? Most people would say no.
But Eliezer Yudkowsky says yes, and gives what I call a continuous commensurability argument: he asks whether some large number X (you could even suppose it’s 10) of people being tortured for 49.999 years would be worse than a single person being tortured for 50 years. Almost all of us would say yes. Then he asks whether X^2 people being tortured for 49.998 years would be worse than X people being tortured for 49.999 years. Again, almost everyone would say yes. And we can go on inductively, multiplying the number of people by X and decreasing the pain they experience by a small amount at each step, until we are left concluding that X raised to some very large power of people experiencing a slight discomfort is the worst possibility we’ve considered yet. The intensity of pain is on a continuous scale, and X^n is unbounded. Unless we start inventing arbitrary principles about the incomparability of pains above and below certain thresholds, Yudkowsky’s conclusion is clear.
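The inductive chain can be made concrete with a toy model. Assume, purely for illustration (this is my simplification, not Yudkowsky’s own formalism), that aggregate badness is just (number of sufferers) × (years of pain). Then each step multiplies the running total by nearly X:

```python
# Toy model of the continuous commensurability chain.
# Assumption (mine): aggregate badness = people x pain-years.
# Each step multiplies the population by X and shaves epsilon off the pain.

X = 10          # population multiplier per step
epsilon = 0.001 # pain reduction per step, in years

people, pain = 1, 50.0
for step in range(5):
    print(f"step {step}: {people} people x {pain:.3f} years -> total {people * pain:.1f}")
    people *= X
    pain -= epsilon
```

With epsilon = 0.001, the chain can run for tens of thousands of steps before the pain dwindles to a mere discomfort, by which point the population factor is X raised to a power in the tens of thousands.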
But very few people agree with him, and many hear his argument and still reject the conclusion.
Torture for 50 years is awful — unimaginably awful. But do you know what else is unimaginable? The number 10^50000. The human brain can’t even intuitively grasp the difference between hundreds and thousands, and we don’t come close to comprehending the enormity of the number of particles in the universe — around 10^86. Forget about using intuition here.
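To put those two magnitudes side by side, here is a quick check using Python’s arbitrary-precision integers:

```python
# Compare 10**50000 against the post's ~10**86 estimate for the
# number of particles in the universe. Python integers are exact
# at any size, so no approximation is involved.

big = 10 ** 50000
particles = 10 ** 86

print(len(str(big)))        # digits in 10**50000 -> 50001
print(len(str(particles)))  # digits in 10**86   -> 87
print(big // particles == 10 ** 49914)  # True: dividing out every particle barely dents it
```

Even after dividing by the particle count of the entire universe, you are left with a number of 49,915 digits. Intuition has no foothold at this scale.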
When people reject Yudkowsky’s argument, they might A) mention thresholds for incommensurability in the pain scale (sometimes ad hoc ones), B) invoke egalitarian theories, C) shrug their shoulders, or D) bring up their theology. I don’t think any of these hold water, but I won’t get into them here (I discuss all of them in other posts).
Occasionally, though, people respond by granting that the continuous commensurability argument makes sense but asking: who cares if we have contradictory moral preferences? And why should we care so much about hypotheticals so unlike any situation that would arise in the real world?
I also think this response is problematic.
Sure, we don’t have to care. But it’s only reasonable to say that more consistent, less contradictory belief systems are better than inconsistent ones by any substantial standard. Imagine if this type of argument were made in math or science — and those fields are just about knowing things, not about doing what is actually right. A scientist doesn’t have to care if a testable hypothesis of her theory is shown to be false. A mathematician doesn’t have to care if a lemma for his pet theory is disproven. But how ridiculous would it be for them to claim that those theories are good ones?
Why should morality be exempt from lawfulness? If we stop caring about rules and consistency in a belief system, it ceases to be meaningful. The “Who cares about consistency?” approach throws out objectivity. If you let me suppose that 2+2=3, then with enough expansions and substitutions I could prove literally any equation I wanted in arithmetic. Nothing would have any semblance of right or wrong, and that undercuts why math is useful in the first place — because it makes sense and gives us sound, objective answers.
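The “prove anything from 2+2=3” point is the standard observation that a single false equation collapses all of arithmetic; sketched out:

```latex
2 + 2 = 3 \;\text{(supposed)}, \qquad 2 + 2 = 4 \;\text{(true)}
\;\;\Rightarrow\;\; 4 = 3 \;\;\Rightarrow\;\; 1 = 0.

\text{Then for any numbers } a, b:\quad
a - b = (a - b)\cdot 1 = (a - b)\cdot 0 = 0
\;\;\Rightarrow\;\; a = b.
```

One contradiction, and every number equals every other. That is what tolerating inconsistency buys you.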
Why have any moral ideas at all if we don’t care about them making sense? We don’t have to care. But at that point, we shouldn’t pontificate about right and wrong, empathy, the value of life, the horrors of torture, or anything of the sort while saying Yudkowsky is wrong.
We need to take thoughtful conclusions seriously. This isn’t about how we feel. This isn’t about the whims of our sloppily-evolved brains. The world is real, and there are plenty of one-versus-many moral dilemmas that play out in it every day which demand more thought than hip-shot intuition.
As Yudkowsky says, “Shut up and multiply.”