Closing Disclaimer

mindlevelup · Sep 5, 2017

After reading through all of this rationality stuff, you might be feeling excited to go out and try it for yourself. You might even be thinking about applying it to many areas of your life.

And for that, I have a major cautionary warning.

Rationality can be dangerous.

Rationality can be dangerous because it affects your ontology, the way in which you see the world.

Here’s an analogy:

Owen is a person who wants to get work done, but often finds himself playing video games. He also feels bad about doing so because it doesn’t fit his self-image. Maybe there’s also something here about how society has shaped his values, but the actual root cause isn’t that important. The main point is that some part of his gut endorses playing video games.

So he’s following his intuitive feelings, but also there’s guilt somewhere in the system.

Now let’s say he bumps into rationality — planning, habits, motivation — the whole package.

Rationality is a system. It’s a way of looking at things.

It’s kind of like a set of special glasses.

Once Owen puts on these glasses, he starts to see new opportunities to use his shiny new techniques, like TAPs to try and remove his video-game-playing habit. But note that the very idea of techniques, of using concepts, is only something he sees when he puts on his Rationality Glasses.

My worry is that people will use rationality as the lens through which they view the world for so long that they forget there’s something important hidden underneath the Rationality Glasses. Owen might end up thinking that the Rationality Glasses show how the world is, rather than merely offering a useful way of looking at it.

And rationality, or at least the way that I’ve presented it here, will have its own ideological biases. This isn’t necessarily bad; it’s a necessary consequence of any way of looking at the world. There’ll always be implicit values for any system you choose to use.

For rationality, these values center on striving for things like Optimization and Self-Improvement. I worry that this implicit valuation can be taken in a very wrong way. I see a failure mode where everything that doesn’t directly contribute to Optimization is seen as a “bad” thing that needs to be removed.

Owen might then see his video-game habit as something foreign and “bad” rather than a poorly understood part of himself. So when Owen tries to use rationality to forcibly remove those “disobedient” parts of the system, I think something quite terrible happens, because he’s smothering vital parts of himself.

Just because those parts of yourself conflict with your “stated” values doesn’t mean they’re wrong. It’s important to recognize that many apparently “bad” parts of yourself also have good intentions.

After all, these were parts of himself that Owen had listened to prior to encountering rationality. They might be hidden underneath the Rationality Glasses, but they’re still important.

It’s a little like if I just handed you an instruction manual for the human mind, but with a bunch of the pages missing.

You’d now know a lot about how the mind works, but if you just followed those directions, you wouldn’t get the whole story. There would be functions and knobs that also matter which you wouldn’t know about. If you only follow the manual and don’t trust your own sense of what else is critical, then you’re in trouble.

Sooner or later, something will break, and troubleshooting will be very, very difficult.

This is why I think it’s important to respect those “useless” (from the Rationality Glasses POV) parts of yourself.

Because they’re still a part of you.

The best solution, as far as I can tell, is something like being able to take off the Rationality Glasses to get in touch with those gut, instinctual, and quiet parts of yourself. You need to be able to step away from the rationality virtues of Optimization and just accept all the parts of yourself.

When you stop trying to cut off or suppress different parts of yourself, something very different happens.

You get a shift where you…Just Do Things.

Motivation and willpower, for example, end up just seeming like largely incoherent words. You’ll have different tastes, but suddenly the question of “How can I force myself to do/not-do X?” just becomes irrelevant.

When you start integrating those quiet parts of yourself, you’re somehow more in control, even though you’re incorporating more dissent. I know it sounds stupid and Buddhist. But I think something very good is happening here, when you allow all parts of yourself to be fulfilled.

I think that a more sophisticated theory of rationality might look a lot more spiritual from the outside, focusing on ways to integrate and hold dialogue between the explicit parts of yourself (which the Rationality Glasses endorse) and the implicit parts of yourself (which might not be endorsed, but are important nonetheless).

So I want to stress that rationality as presented here is incomplete. And when taken to extremes, it can be downright harmful. Be wary of forgetting the stuff that’s beneath the glasses.

Rationality is just a tool. Take from it what you need, but don’t let it control what you take in the first place.

A grave warning, I know.

There is a silver lining, though.

The plot twist goes something like this:

When you do take off those Rationality Glasses, it turns out that you can see even more clearly without them.
