In a related sense, deontologies are essential because we can only apply consequentialism to a limited degree.
Deontologies are just approximations of consequentialism — attempts at formalising heuristics. They’re useful, but sticking to deontologies when they go against pragmatism doesn’t seem to make sense (especially when you take this perspective).
If we can keep the reasons it was put into place top of mind, we can re-assess those reasons in modern times. And I suspect we’d find those reasons compelling.
Most people are too busy with their lives to worry about the reasons for heuristics. Having them in place as superstitions makes sense so that everyone can use them without having to reason out why they’re there. The people who want to reason about them always have the option to dig deeper and see why they were put in place.
Suggesting that superstitions are rational because they may improve outcomes strikes me as the “argumentum ad consequentiam” fallacy. Yes, superstitions may sometimes produce good outcomes — but that doesn’t mean they are rational.
This misses Taleb’s point, which is that “It may be rational to believe in untrue things, or things with scant evidence, if the result is what you want it to be.” Since no claim about the truth value of the belief is made, it’s not really argumentum ad consequentiam.
If people know why they should do something, they’ll be more judicious about it — when to keep doing it, when to stop doing it, when to adjust the rule/method, how to advocate for it, how to adjust it based on reasonable critiques.
This seems to be where you disagree with Taleb (and me). His claim (and it has considerable backing from fields like “bounded rationality”) is that people have finite mental resources to be rational, and are faced with intractable optimisation problems in life. If they had to think through the reasons for every action, they’d be virtually paralysed when making decisions. Thus most take short-cuts whenever they do not see a considerable possible downside to their action. If these short-cuts are drawn from long-surviving traditions, then the survival of those traditions is a certificate of the short-cut being viable (in the ancestral environment, at least). If your short-cut is based on more recent arguments that have looked only one or two steps forward, then you run the risk of unforeseen consequences. (That’s essentially a much less eloquent summary of Taleb.)
A concrete (and hopefully uncontroversial) example here would be the crippling effect of debt. It was taboo (and still is in many cultures) for people, especially the young, to have debt. Understanding the reasons for this requires a fair amount of maturity that the young usually don’t have. When this taboo went away, young people in places like the USA went into debt early in life and essentially made their lives much harder.
If a superstition is a scalar, a rational position is a vector. The latter contains essential information for moving through time and space. While the former may provide utility in the short-term, it lacks the critical information necessary to handle changing times and changing circumstances.
Let me add to your analogy. All people have only limited memory. Thus they store vectors only for the things they are most likely to have to reason about and make decisions on in varying circumstances. For everything else, they just store the scalar. If people are interested in the full information (and a small fraction of people are), they can read up about it.