This article originally appeared in Institutional Investor on April 24, 2018.
The human brain is hard-wired to confirm biases, overweight the importance of recent information, and run with the herd. But a new book shows how to get around our flawed programming.
In last month’s column, I implied that the Yankees will win the World Series this year by maximizing their home field advantage. Acquiring power hitter Giancarlo Stanton, whose hitting patterns are custom-made for Yankee Stadium, will put the talented young team over the top, I predicted.
My story sounded great, but I recently learned that it’s statistically incorrect. In the book Scorecasting, Jon Wertheim and Tobias Moskowitz crunched the numbers and determined that baseball’s home field advantage does not result from the cast of characters on the field. Instead, it is almost entirely attributable to the subconscious behavior of umpires, who tend to favor the home team on the margin when calling balls and strikes.
We are all hard-wired to make flawed decisions, the root cause of which comes from evolution. Daniel Kahneman, in Thinking, Fast and Slow, describes what he calls our System 1 and System 2 brains. Our System 1 brain moves quickly and instinctively, dominates most of our decisions, and often gets things wrong. Our System 2 brain slows down and assesses truth but is lazy and requires nudging to activate.
The world of behavioral finance has defined a host of problems we encounter as investors, including loss aversion, confirmation bias, mental accounting, recency bias, hindsight bias, and herd mentality. Annie Duke, former top professional poker player and author of Thinking in Bets, reveals that the set-up is even worse than it sounds. We think that we hear a statement or hypothesis, contemplate whether it is true, and then decide its merit for ourselves.
But that’s not how we roll. We actually hear something, believe it to be true, and only then decide whether or not to activate our System 2 brain, gather information, and think through the merit of the statement.
To add insult to injury, we suffer from motivated reasoning. When we find information that disagrees with our belief, we try incredibly hard to discredit the information. And the smarter we are, the further we get from objective truth, because we are that much cleverer in rationalizing away disconfirming evidence. When good outcomes occur, we take credit for making good decisions; when bad outcomes occur, we write it off as bad luck.
Our daily investment process encompasses a range of decisions, from as small as how we spend our time to as large as making a big recommendation to an investment committee. The science of behavioral finance, while intellectually stimulating, offers few remedies to correct for our decision-making flaws. We may know we suffer from bias, but what should we do about it?
Fortunately, Annie has cracked some of the code to improve our decision-making process. Although we can’t change our wiring, we can follow a series of hacks to circumvent our bad instincts and work toward objective truth.
The title of Annie’s book, Thinking in Bets, offers a key to unlock the mystery of our brain. We tend to make proclamations that imply certainty, even though almost every decision is uncertain. When encountering a statement of certainty, Annie prescribes retorting, “Wanna Bet?”
The simple phrase activates our System 2 brain and makes both the proposer and recipient think twice about the statement.
It really works. My 12-year-old daughter recently professed that we shouldn’t see the movie The Greatest Showman because she knew it would be terrible. When I responded, “Wanna Bet?”, she paused, stood down from her aggressive stance, and said, “Well, no.” We proceeded to see the movie, and we all loved it (a rare consensus from three very different children).
Rather than state our opinions, we can activate our System 2 brains — and those of our team members — by placing any statement in the context of a probability distribution. When a CIO opens a staff meeting with “Equities are overvalued and we should lighten up,” the discussion in the room that follows differs vastly from opening the same conversation with “I think there’s an 80 percent chance equities are too expensive. We should take down some risk.”
Communicating in this way offers subtle benefits. First, leaders implicitly invite their team to collaborate. When a leader implies certainty, team members might not share contrarian views. In the moment, a valuable counterpoint may get excluded from the discussion as a member of the team cowers under the fear and embarrassment of being wrong. After all, we believe what we hear and only later check the facts.
Second, leaders are perceived as authentic and credible. Admitting the absence of certainty fosters familiarity and confidence from team members and allows leaders to express layers of insight in their thought process.
Last, the path to probabilistic thinking enables team members to learn about blind spots in their own thinking. We all have blind spots — we’re just much better at spotting them in our neighbors than in ourselves. One way we can overcome this bias is to form small decision-making groups in which we can hold each other accountable for seeking the truth. Think of it as a watered-down, digestible version of the Bridgewater culture.
At the end of the day, I still believe that the Yankees will win the World Series this year, in part because Stanton will be so proficient at the House that Ruth Built. How sure am I? About 60 percent.