Mental Models are Tools not Glasses

Rafael Velásquez
Decisions and Perceptions
6 min read · May 24, 2017

Or why you may want to LBO St. Peter’s Cathedral; or not.

Introduction

Mental models are how we simplify and think about our infinitely complex world, or as Borges wrote in Funes the Memorious:

“To think is to forget differences.”

There are two general ways that people use mental models, the glasses approach and the tools approach. In the glasses approach, people have one big idea (similar to Isaiah Berlin’s hedgehog) which then tints their view of the whole world, a sort of Grand Unified Theory that explains everything. The tools approach rather claims ignorance over the actual machinations of the world and instead simply tries to better understand its different parts individually, applying the best tool for the task.

In the following, I’ll go through an example of applying the tools approach and why there really is no such thing as a Grand Unified Theory of Everything.

LBOing St. Peter’s Cathedral

Imagine kidnapping George Soros, Warren Buffett, and Henry Kravis and locking them up in a room. While waiting for these three billionaires’ security teams to break in SWAT-style, you ask them about a potential investment you want to make in a company that specializes in selling indulgences. You would quickly realize how differently these three investing giants look at the world and analyze investments.

Buffett might quickly decide it falls outside his Circle of Competence (as, according to Berkshire Hathaway groupies, he has never sinned) and then ask whether you, kind kidnapper, can let him out, as he is late for a game of bridge with Bill Gates.

Circle of Competence

Kravis might take a look at the financial statements and see quality, consistent cash flow (it doesn’t get better than tithes), unnecessary overhead costs (why do all company locations need mosaic windows anyway?), and a pristine balance sheet. In other words, an ideal LBO candidate. He’d quickly call his investment bankers, set up a loan for 7–8x EBITDA (better to avoid questions about the correct amortization rate of intangibles such as the Holy Spirit, or whether double-declining depreciation is appropriate for the Rome campus), and make a deal.
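For readers less fluent in buyout lingo, here is a back-of-the-envelope sketch of what a 7–8x EBITDA loan implies; the EBITDA figure is purely invented for illustration.

# Hypothetical leverage math behind Kravis's 7-8x EBITDA financing.
ebitda = 150.0  # assumed annual EBITDA, in millions
for multiple in (7, 8):
    print(f"{multiple}x EBITDA -> {multiple * ebitda:,.0f}M of debt")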

Finally, Soros might see that same-store sales are dropping and spot a classic reflexive situation. Given the messianic nature of the enterprise, a drop in subscriptions will feed a self-reinforcing loop of ever fewer customers. Combined with management’s total disregard of the Human Uncertainty Principle (given their insistence on the infallibility of the CEO), it’s looking like a surefire loser. He’d immediately go short and try to add “The Man Who Broke the Church of St. Peter’s” to his résumé (directly above a mention of that skirmish with the Brits).

Rashomon Effect

In analyst lingo, our investors have just handed out Neutral, Buy, and Sell ratings to the same company. So who’s right?

To answer that question, the only reasonable approach is to begin by recounting an obscure Japanese film from the 1950s. In Rashomon, four witnesses to a murder are summoned to testify in court. Once there, they each describe the same event but tell significantly different stories of what happened. Leo Breiman introduced the term Rashomon Effect into statistical parlance to describe situations where different statistical models have similar predictive accuracy but different interpretations.

Imagine trying to model changes in Exxon’s share price from a dataset of 30 variables. To keep things simple, you decide to build a regression that includes only five of them. It turns out there are 142,506 possible combinations, many of which will have similar performance. Confronted with these models and their possibly contradictory interpretations, what is an analyst to do?
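To make the combinatorics concrete, here is a minimal Python sketch. The data are entirely synthetic (a single latent driver shared by all predictors, standing in for the Exxon dataset), but it counts the possible five-variable regressions and shows how many of them end up with near-identical fit:

from itertools import islice, combinations
from math import comb

import numpy as np
from sklearn.linear_model import LinearRegression

n_predictors, k = 30, 5
print(f"Possible {k}-variable models: {comb(n_predictors, k):,}")  # prints 142,506

rng = np.random.default_rng(0)
driver = rng.normal(size=(500, 1))                       # one latent driver shared by all predictors
X = driver + 0.5 * rng.normal(size=(500, n_predictors))  # 30 noisy proxies of that driver
y = driver[:, 0] + 0.1 * rng.normal(size=500)            # the series we want to explain

# Fit a sample of the 142,506 possible five-variable regressions and compare their R².
scores = []
for subset in islice(combinations(range(n_predictors), k), 2000):
    cols = list(subset)
    scores.append(LinearRegression().fit(X[:, cols], y).score(X[:, cols], y))
print(f"best R2 = {max(scores):.3f}, median R2 = {np.median(scores):.3f}")

The best and median scores come out nearly identical: a whole Rashomon set of models that tell different stories about which variables matter while fitting the data about equally well.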

He should start by picking the simplest, most convenient model given the specific situation. For example, certain variables may be much cheaper or faster to obtain. Beyond raw accuracy, there are other criteria that may make some models better than others: if the cost of under-predicting is higher than the cost of over-predicting, you would be well served by picking a model that reflects this. In other words, you should build a sort of meta-model to help you choose (a minimal sketch of one follows the quote below). By doing this you replace the search for the absolute ‘Truth’ with the search for the most useful truth (and the one least biased by emotions and desires). Plato perhaps puts it best in the Cratylus:

“Then the actions also are done according to their proper nature, and not according to our opinion of them? In cutting, for example, we do not cut as we please, and with any chance instrument; but we cut with the proper instrument only, and according to the natural process of cutting…”
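Coming back to the meta-model idea, here is a minimal sketch in Python. The 3-to-1 cost ratio and the names candidate_models, X_val, and y_val are placeholders I have invented for illustration, not anything from the analysis above:

import numpy as np

def asymmetric_loss(y_true, y_pred, under_cost=3.0, over_cost=1.0):
    # Hypothetical costs: missing on the low side is 3x as painful as missing high.
    err = np.asarray(y_true) - np.asarray(y_pred)
    return float(np.mean(np.where(err > 0, under_cost * err, -over_cost * err)))

# Usage sketch: among models with comparable accuracy, keep the one whose
# held-out errors are cheapest under this loss.
# best = min(candidate_models, key=lambda m: asymmetric_loss(y_val, m.predict(X_val)))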

Tools not Glasses

To answer our question: all three investors were right. Based on their own circumstances, strengths, and weaknesses, they made the right decisions for their portfolios. They picked the model that best fit their own situation, applied it, and got a result. You, now having a better understanding of the business’s investment possibilities thanks to the wisdom of these three giants, can go ahead and make the best decision for your own circumstances.

This is why mental models or theories need to be used more like tools than like glasses. When you put on a pair of glasses, they put their own tint on everything you see, scratches and smudges included. Tools, on the other hand, are interchangeable: when you have a task, you go to your toolbox and pick the most appropriate one.

It’s easy to fall into the glasses trap. Certain disciplines (especially in introductory classes or texts) are dominated by a single idea that can later be used to seemingly explain everything. The ‘Efficient Market Hypothesis’ is a great example of this. Its proponents argue endlessly that the market cannot be beaten, despite ample evidence to the contrary, and apply it in situations where they shouldn’t, as Buffett remarked in his 1988 letter to shareholders:

“Observing correctly that the market was frequently efficient, they went on to conclude incorrectly that it was always efficient. The difference between these propositions is night and day.”

Charlie Munger likens this approach to having a mental latticework on which you hang all your models (so you can see all of them at once); you then mentally go through a checklist to pick the best model for the situation.

Latticework

Conclusion

The trick, as Munger argues, is to ‘build’ and use many of these models and not get emotionally attached to any of them (you wouldn’t get attached to an Allen wrench, would you?). The more models you have, the better you will be able to fit any situation you come across, and the better you will understand the world as it really is. Perhaps most importantly, you’ll be able to recognize the way other people are thinking about a situation and why they may or may not understand something. Simply put, if something important happens that does not fit your mental models, you are likely to get defensive about the situation. Understanding this, you can now spot it and know when you are out of your element. Your next step should be to go back to your toolbox and look for a different model that better fits the situation, not to keep trying to fit a square peg into a round hole. And if there is no appropriate instrument in your toolbox, then make a trip to the hardware store, or, as I like to call it, the bookstore.
