I encounter lots of people who don’t or can’t do what they said they would do.
By this I don’t mean they mess up once in a while. I’m talking more like people whose predictions and commitments get broken on the regular, 20–80% of the time.
My response to these people used to be something like “Gain some self-control, please.” If that didn’t happen, then after getting burned a couple more times counting on them, I’d write them off and distance myself.
More recently, I’ve become a bit less of an asshole, and started making a different sort of request. Now it’s more like “Learn to model yourself, please.” I vastly prefer making plans with a friend who says “Just so you know, I’m pretty unreliable and I’ve got like a 60% chance of bailing on you” than with a friend who says “I’ll be there!” and then cancels (though I prefer an Actually Reliable person to either).
I posit that there’s a weird sort of symmetry between these two different strategies. They remind me of prophecies — one kind is accurate because you’re a prophet, and you actually understand the mysteries of the universe. The other kind is accurate because you’re a king, and you can make it come true.
What’s interesting is that they complement each other. You can be a completely reliable person with high enough stats in either: prophecy (you accurately predict what you’ll actually do) or kingship (you make what you said come true).
The main insight I draw from this is that you can patch holes in one with the other. If you’re trying to level up in reliability generally, you may have been considering only one strategy viable, like people who try really hard (with sheer willpower) to make themselves get out of bed and go to the things they’ve committed to, even though it’s eroding their souls.
Turns out, you can just switch to the task of improving your self-predictive models. And with all the energy you save by not pushing yourself around, you can probably afford to invest a significant chunk of attention into just noticing what sort of algorithm you are — what are the reliable triggers of a bad day, the reliable heralds of attention shifts. Some people are more complex and harder to pin down than others (for instance, you might have a health problem that can flare up unexpectedly), but in general the data is there, and I have yet to see someone who couldn’t marginally improve at predicting themselves. And for the die-hard lifelong rationalists who know that they’re 60% unreliable because they’ve actually checked, there’s always room to practice the skill of discipline.
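One concrete way to practice this (a minimal sketch of my own, not anything from the post itself; the log entries and function name are made up for illustration) is to write down a probability every time you commit to something, record what actually happened, and score yourself. The Brier score is a standard way to do this: lower means your self-model is better calibrated.

```python
def brier_score(predictions):
    """predictions: list of (stated_probability, actually_happened) pairs.

    Returns the mean squared error between your stated probability
    and the 0/1 outcome. 0.0 is perfect; 0.25 is what you'd get by
    always saying 50% and never learning anything about yourself.
    """
    return sum((p - (1.0 if outcome else 0.0)) ** 2
               for p, outcome in predictions) / len(predictions)

# Hypothetical log: "60% chance I make it" vs. whether you actually showed up.
log = [(0.6, True), (0.6, False), (0.9, True), (0.9, True), (0.3, False)]
print(round(brier_score(log), 3))  # → 0.126
```

The point isn’t the arithmetic; it’s that a month of entries like this turns “I’m pretty unreliable” from a vibe into a number you can actually improve.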
As a vaguely tangential side note of diachronic propaganda, I want to end with the following modification of the old Mark Twain quote:
If you self-model well enough, you don’t have to remember anything.