Mental Models of Financial Sabotage
In July 2008, I published my first article about trading and investing in a small industry magazine called SFO (for stocks, futures, and options). I had been working for five years on an idea based on three big areas of interest: trading success and failure, behavioral finance, and neuroscience.
My thesis was that “your brain wasn’t made to trade.” I came to this conclusion because of the intersecting knowledge from these three areas. First, I learned about trading as a clerk in Chicago’s pits, then as a professional currency trader myself starting in 1999, and from reading dozens of invaluable interviews with trading legends like George Soros and Paul Tudor Jones in Jack Schwager’s Market Wizards books.
My experience of trading, and of watching many other traders, gave me certain ideas about why so many failed at what is arguably one of the most mentally and emotionally challenging “day jobs” around. My primary observations revolved around traders getting overly emotional and making bad decisions, driven by greed, fear, envy, regret, anger, despair, and immature ego and instant-gratification needs.
I concluded that what was missing for most traders who let emotions override sound rules and discipline with money and risk was one of the top lessons from the Market Wizards: probability-based thinking. And since this was an area of weakness for me in high school and college, I decided to commit to teaching myself probability and statistics in my 30s.
What the Rogue Does to a Billion Dollars…
But one group that even probability training might not help were the so-called “rogue” traders, who could destroy a billion dollars faster than you could spell statistics. I collected and studied these stories, like Nick Leeson who took down one of the world’s oldest financial institutions in 1995, Barings Bank.
Then there were the “geniuses that failed” at Long-Term Capital Management, headed up by Nobel prize winners. These guys weren’t rogues, but what’s the difference when their $3.6 billion wipeout in global interest-rate bets gone bad, during the financial crises of 1997 and 1998, saw the Federal Reserve step in and orchestrate one of the first “too big to fail” bailouts? Funny how that amount became peanuts only ten years later.
When I published my 2008 article, we had just witnessed the largest rogue-trading debacle ever: French trader Jerome Kerviel cost Société Générale nearly €5 billion with his excessive risk-taking and covering up of bad trades. (He says he did not work alone and that the culture of the bank was prone to excessive risk-taking, but that’s a good story for another time.)
I had two important conclusions regarding rogue traders. First, we have a lot to learn from them because, as you’ll see when we dive into the twin sciences of our irrationality, what the rogue trader can do to a billion dollars of OPM, we can do to our own investing and trading accounts. Second, I said rogues would always happen again, no matter how much regulation or oversight.
Six months later, in December of 2008, we learned of Bernie Madoff and his giant Ponzi scheme that hurt hundreds of investors from Wall Street to Main Street, and didn’t even spare Hollywood’s finest like Steven Spielberg and Zsa Zsa Gabor.
And in a further ironic twist, the publisher of SFO, the magazine where my article first appeared, was caught running his own Ponzi scheme. (That’s one of the reasons I am trying to re-create the article here, since no online version is available.) Russell R. Wasendorf, Sr. was the former Chairman and Chief Executive Officer of Peregrine Financial Group, a commodity broker that filed for bankruptcy protection in Chicago in July 2012.
According to Wikipedia, “He was arrested in July 2012 following a suicide attempt. In September he pleaded guilty to embezzling $215.5 million from more than 13,000 customers over the course of 20 years. On January 31, 2013 he received a 50-year sentence for fraud, effectively a life sentence.”
The Big Bang of Intelligence
The second area of knowledge that informed my thesis that “your brain wasn’t made to trade” was brain science itself. As a psychology and philosophy major in college, and largely self-taught beyond that in the 1990s, I began reading a new cornucopia of layperson books by brain scientists and their storytellers, like Steven Johnson in Mind Wide Open, who introduced me to the work of Antonio Damasio and Simon Baron-Cohen.
I was already intensely interested in evolution and had read nearly ten books by the late, great Stephen Jay Gould, paleontologist, evolutionary biologist, and historian of science at Harvard. I was always posing the question: “What evolutionary challenge was this physical feature or behavior created by, or adapting to?”
When you start thinking about how our brains were “built” over millions of years, you are delving into the arena of evolutionary psychology. My curiosity was especially sparked here by Carl Sagan in his book The Dragons of Eden because he was a fan of the “triune” brain theory espoused by neuroscientist Paul D. MacLean. This theory modeled the human brain thus, according to Wikipedia…
“The triune brain consists of the reptilian complex, the paleomammalian complex (limbic system), and the neomammalian complex (neocortex), viewed as structures sequentially added to the forebrain in the course of evolution.”
Even though many neuroscientists don’t agree with this model, it’s still useful in communicating the idea that our brains were indeed “built” by evolution over millions of years, and that we have “ancestors” and relatives we haven’t even imagined in the earth’s prehistoric food chain.
And my use of the triune model was also about remembering that different structures in our brains have different functions, and that evolution isn’t always a smooth and gradual progress from one level of function to another, better one.
So, while we think we are of “one mind” evaluating life objectively and making decisions in rational ways, there are actually perceptual distortions, inaccurate judgments, and illogical interpretations being fed by lower, subconscious parts of our brains all day long. Thus, we have not one mind, but what I like to call a “multi-mind.”
The reason that non-scientists like me or Johnson could begin reading and writing so much more about the brain fifteen years ago (besides all but killing off the triune model) was the Renaissance in neuroscience research that began in the mid-1990s with the advent of sophisticated brain-imaging technology like fMRI scanners.
Neuroscientists could literally watch people thinking and feeling as different parts of their brains would “light up.” And some of the most mind-opening experiments verified how unexpected parts of the brain such as emotional or memory centers were involved in the solving of math problems.
The third area that helped me conclude “your brain wasn’t made to trade” was formed by the work of psychologists and social scientists who didn’t bother with brain scans. Researchers like Daniel Kahneman, who won a Nobel prize for his work in 2002, were more interested in putting people in “decision situations” and observing how they acted and what they said and then studying the data from thousands of participants to find patterns of behavior.
This field became known as behavioral economics, of which behavioral finance and its focus on markets and investing, is a subset. And from this area of study in particular, I kept coming across the concept of how irrational humans could be when it came to making decisions involving money, risk, or any kind of uncertainty.
Our Brains On Risk
Collectively, these three areas of knowledge and research convinced me I could tell people “your brain wasn’t made to trade” because they were proof that what our brains were made for had far more to do with the challenges of survival millions of years ago than with the peculiar task of staring at a screen of prices moving up and down as our money follows.
When we watch green and red arrows flickering, as our wealth flashes before our eyes, we can suddenly do some very dumb things with our money and investments. For me, since I learned this first by watching traders and then by trading myself, the two fields of science simply confirmed what smart traders had “learned and earned” the hard way.
And the two sciences helped define and confirm my thesis from completely different angles. The neuroscientists study behavior from the inside out, using brain structures, functions, and biochemistry to explain why we act the way we do. The behavioral economics researchers classify our tendencies from the outside in, conducting repeatable experiments and “decision situations” that can then be used to infer patterns of human thought, emotion, and action.
For the behavioral outside-in crew, many of the key insights are summed up in a giant collection of patterns and theories known as cognitive biases, often fed by heuristics (mental shortcuts or rules of thumb). Buster Benson recently did a great breakdown here on Medium to help organize the nearly 200 biases one might find on Wikipedia’s page List of cognitive biases.
According to Wikipedia, “cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.”
I’ll spend the rest of this post talking about the outside-in approach of the behavioral researchers. Here is a list of some of the top biases I shared in my 2008 article…
Availability heuristic: how people wrongly evaluate the probability of an event by relying on easily remembered past occurrences, where vividness overshadows rarity. This is also known as the “recency effect,” whereby not-too-distant memories of significant gains or losses affect new investment decisions.
Overconfidence bias: a result of the availability heuristic in which recent successes affect judgment about new risk-taking.
Anchoring and adjustment heuristic: how people refer to a starting value or “anchor” when making estimates and decisions, and adjust too closely around it. For investors, this might manifest as using irrelevant data such as an investment’s purchase price, all-time high, or year-end value as a reference when current conditions or risk should matter more.
Confirmation bias: the tendency to seek, consciously or not, information that verifies one’s decisions or beliefs.
Hindsight bias: the tendency to overemphasize or overestimate what could have been predicted after the outcome is known. Closely associated with this bias is the narrative fallacy, where we seek to explain how an outcome “obviously” came to be as the result of connecting dots we can now see in the story.
Outcome bias: the tendency to judge an investment decision by its eventual outcome rather than by its soundness given the conditions at the time it was made.
Survivorship bias: misjudging methods as successful on the basis of positive results (vs. randomness). Every game has a winner, whether skilled or lucky.
Nominal fallacy: the mistaken belief that giving something a name explains it. This list, and the author’s entire article, could be guilty of this fallacy.
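The survivorship-bias item above (“every game has a winner, whether skilled or lucky”) can be made concrete with a quick simulation. This is a minimal sketch using purely hypothetical coin-flip “managers,” not any real fund data: even with zero skill, a surprising number of random players post unbroken winning streaks.

```python
import random

def lucky_streaks(managers=1000, years=5, seed=7):
    """Count how many purely random 'managers' beat the market
    every year for `years` straight, by coin flip alone."""
    rng = random.Random(seed)
    return sum(
        all(rng.random() < 0.5 for _ in range(years))  # 50/50 each year
        for _ in range(managers)
    )

# On average about 1000 / 2**5 ≈ 31 of 1,000 coin-flippers will
# "win" five years in a row with no skill at all.
print(lucky_streaks())
```

If we only ever hear from those roughly thirty streak-holders, their methods look like genius rather than luck, which is exactly the misjudgment survivorship bias describes.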
The Science of Heuristics and Bias
The fusion of psychology and economics known as behavioral finance owes much of its genesis to the aforementioned Daniel Kahneman and his fellow researcher, the late Amos Tversky (Kahneman’s 2002 Nobel was for their joint contributions to economics; Tversky had died of cancer in 1996). They considered themselves scientific psychologists in the sense that they were doing research, not counseling patients.
Since the 1960s, they and their ilk have been posing questions and conducting repeatable experiments (without brain imaging technology, of course) that attempt to uncover the mysteries of how supposedly rational investors and consumers make decisions involving money and risk.
They have stood squarely, defiantly in the path of economists of the “efficient market hypothesis” school, because their results show that the majority of people are typically irrational in their decision-making about money and risk.
In one of Kahneman and Tversky’s more famous series of experiments, they posed different financial scenarios to participants and asked them to make decisions based on the situations. Two problems and their results show how inconsistent and irrational people can be about probable outcomes.
Answer these problems yourself, one at a time and let’s see how rational you are…
Problem 1: You have just been given $1,000 and must choose between the following options:
A) A sure gain of $500.
B) A 50% chance to gain $1,000 and a 50% chance to gain nothing.
Did you make your choice? If so, you may proceed…
Problem 2: You have just been given $2,000 and must choose between the following options:
A) A sure loss of $500.
B) A 50% chance to lose $1,000 and a 50% chance to lose nothing.
What did you choose for each? Do you recognize that the two problems are identical in terms of probable net cash?
You either accept a certain $1,500 with options (A) or you gamble on a 50/50 bet of ending up with $1,000 or $2,000 with the (B) options. The only difference between the two problems is how they are framed. One is presented in the context of how much you gain, the other in terms of how much you lose.
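The equivalence of the two framings can be checked with quick expected-value arithmetic, using the dollar amounts from the problems above:

```python
# Problem 1: start with $1,000.
p1_a = 1000 + 500                      # (A) sure gain of $500
p1_b = 1000 + 0.5 * 1000 + 0.5 * 0    # (B) 50/50 gain of $1,000 or nothing

# Problem 2: start with $2,000.
p2_a = 2000 - 500                      # (A) sure loss of $500
p2_b = 2000 - (0.5 * 1000 + 0.5 * 0)  # (B) 50/50 loss of $1,000 or nothing

# Every option has the same expected final wealth: $1,500.
print(p1_a, p1_b, p2_a, p2_b)  # 1500 1500.0 1500 1500.0
```

A purely rational decision-maker should therefore be indifferent between the two framings; the experimental subjects, as we’ll see, were anything but.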
What Kahneman and Tversky found was that 84% of subjects chose (A) for problem 1 and 69% chose (B) for problem 2. Their conclusion: based on these results and dozens of other similar experimental problems, people have a strong bias to avoid any amount of loss, regardless of risk (in this case, a 50/50 chance of losing half), and they will prefer a certain gain over a random, coin-flip gain of twice as much.
In short, people are not exactly rational about risk and reward. Faced with a sure gain, most become risk-averse, happy to put real money in their pockets (choice A in problem 1).
But faced with a sure loss, they suddenly become risk-takers, eager to take the gamble that they could escape loss altogether (choice B in problem 2).
The Golden Rule of Trading
For traders, this translates into acceptance of gains that are smaller than losses, on average. And that means financial ruin in the long run for short-term speculators.
The so-called “golden rule of trading” is this: Cut thy losses short and let thy winners run.
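Why does violating the golden rule guarantee ruin over time? A minimal Monte Carlo sketch shows the long-run math, assuming two hypothetical traders with the same 50% win rate who differ only in the average size of their wins versus their losses (all dollar figures are invented for illustration):

```python
import random

def simulate(avg_win, avg_loss, win_rate=0.5, trades=10_000, seed=42):
    """Simulate cumulative P&L over many trades for a trader with a
    given win rate and average win/loss sizes (hypothetical figures)."""
    rng = random.Random(seed)
    pnl = 0.0
    for _ in range(trades):
        if rng.random() < win_rate:
            pnl += avg_win   # a winning trade
        else:
            pnl -= avg_loss  # a losing trade
    return pnl

# "Golden rule" trader: cuts losses short ($100), lets winners run ($150).
golden = simulate(avg_win=150, avg_loss=100)
# Loss-averse trader: grabs small gains ($100), lets losses run ($150).
averse = simulate(avg_win=100, avg_loss=150)
print(golden > 0, averse < 0)
```

With identical win rates, the only difference is the ratio of average win to average loss, yet one expectancy compounds steadily upward and the other grinds an account to zero.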
My point with saying that “your brain wasn’t made to trade” is to highlight how it’s quite possible that our brains are “hard-wired” to do exactly the opposite of the golden rule. Our brains were “made” in prehistoric times to find food, avoid danger, find mates, and avoid too much labor or complexity. Times were simpler then. Cavemen and women didn’t have to think about saving for retirement, much less managing complicated investments with wildly random price swings.
Many neuroscientists don’t like to speak of the brain as being “hard-wired” for anything, but it serves my purposes here, as an amateur brain observer-commentator, to make the point about what we are “naturally” good at and where extra training is required.
In fact, in 2004, even before my thesis was fully complete, and based upon reading the research of Antonio Damasio, Professor of Neuroscience, Psychology and Philosophy at the University of Southern California and an adjunct professor at the Salk Institute, I created a training with a trading simulation to uncover probability-based skills, or the lack thereof. More on that in a future post.
Damasio is well known for his research that shows how emotions play a central role in social cognition and decision-making. And that brings me to my recent podcast with Denise Shull of The ReThink Group, who was influenced by Damasio to expand her own work (MS in the Neuroscience of Emotion from University of Chicago) for teaching and coaching traders how to use their brains for better performance.
Shull was struck by the undeniable conclusions of Damasio’s research that if parts of the brain associated with emotional processing were damaged, or otherwise disabled, we wouldn’t be able to make decisions at all, or at least not with the effective ease that we do hundreds of times per day.
So to her, all the trading coaches and books suggesting that one had to eliminate emotion from trading to be disciplined were completely wrong-headed, pardon the pun.
My podcast Mind Over Money is on iTunes and episode #2 where I interview Denise Shull can be found here.
I have a lot more to share from my 2008 article and other related good stuff about behavioral economics, neuroscience, and Market Wizards. Until then, check out my podcast episode #3 where I tell the story about why Daniel Kahneman calls his former student and co-researcher Richard Thaler, author of Misbehaving, the “laziest” person he knows.