As a preview of my new course on intelligence analysis, How to Avoid Strategic Mind Traps, I’m examining the categories of cognitive bias that regularly mangle our decision-making. In yesterday’s post, we looked briefly at decision-making bias itself: the distortions created by our wonky process for making choices. At the heart of many of those problems is this next category, probability bias, our innate inability to properly understand risk.
My favorite example for illustrating this foible is the purchase decision at the gas station. You’ve got two products: a lottery ticket and fuel injector cleaner. One is almost certain to deliver no benefit at all, but offers a mathematically insignificant chance at an outsized payoff.
The other costs $7.99 and will probably save you $20–$100 in fuel costs over the medium term.
If that billion-dollar Powerball jackpot is any indication, we know which choice humans find more attractive.
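The gas-station comparison can be made concrete with a quick expected-value calculation. The numbers below are illustrative assumptions, not real odds or prices: roughly 1-in-292-million jackpot odds, a hypothetical $500M post-tax lump sum, and a guessed 60% chance that the cleaner saves you $60.

```python
def expected_value(cost, outcomes):
    """Expected net benefit: sum of (probability * payout), minus the cost."""
    return sum(p * payout for p, payout in outcomes) - cost

# Lottery ticket: a ~1-in-292-million shot at a hypothetical $500M payout.
lottery = expected_value(2.00, [(1 / 292_000_000, 500_000_000)])

# Fuel injector cleaner: assume a 60% chance of roughly $60 in fuel savings.
cleaner = expected_value(7.99, [(0.60, 60.00)])

print(f"lottery ticket:   {lottery:+.2f}")  # about -$0.29 per ticket
print(f"injector cleaner: {cleaner:+.2f}")  # about +$28.01
```

Even with generous assumptions for the lottery, the boring product wins on expectation; the bias is that the near-zero probability of a huge payout looms larger in our minds than the near-certain modest one.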
Probability bias: Located approximately everywhere
A mathematics professor once quipped to me, regarding casinos, “Luck is for people who take probability and statistics personally.” Well, there are plenty of casinos out there, so investors have clearly bet on this human trait being evergreen. If you work in competitive intelligence, foresight, or strategy, you might come to a similar conclusion.
As such, here are a few probability biases to look out for in the field:
Survivorship Bias
“Since this company made a crazy bet and became huge, surely your crazy bet will also pay off.”
Survivorship Bias is the tendency to only study those who won using certain strategies, while ignoring the significant number of losers. For example, just because Nokia bet most of its assets on the cell phone market doesn’t mean that all bets on a single market space would necessarily pay out.
This phenomenon is very well described in Michael Raynor’s 2007 book The Strategy Paradox.
Authority Bias
“A whole lot of very serious people think that invading Iraq isn’t insane, therefore the plan stands a good chance of success.”
Authority bias is the tendency to inflate the perceived likelihood of an event simply because someone of higher social rank has publicly vouched for it.
For example, despite The Princess Bride’s famous proscription against land wars in Asia, many people erroneously concluded the Iraq War wasn’t nuts because men in nice suits said it was going to go swimmingly.
Ostrich Effect
“I just don’t see the problem. I mean, I hear what you’re saying, but I just don’t see the problem.”
The Ostrich Effect, the tendency to simply avoid looking at unwelcome information, is seen in timid executives everywhere and depicted above.
Extreme Performance Bias
“You think that 15% YOY growth is unsustainable. I think it’s the NEW AWESOME!”
Extreme Performance Bias is the tendency to assume that exceptional performance is durable and unlikely to regress to the statistical mean.
The clearest and most destructive example of this cognitive bias came in the 2000s, when real estate prices skyrocketed without support from wages or any other economic fundamentals. Analysts stupidly took it as the “New Normal” rather than as evidence of rampant criminality and deluded risk-taking.
(NOTE: Not this analyst, but many analysts.)
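Why does exceptional performance so reliably fade? Because observed performance is part skill and part luck, and the luck component doesn’t repeat. A toy simulation (purely illustrative numbers, modeling performance as skill plus random noise) shows the effect: pick the top performers in one period and their average drops in the next, with no change in anyone’s underlying ability.

```python
import random

random.seed(42)
N, TOP = 10_000, 100

# Each firm's observed performance = fixed skill + fresh random luck each period.
skills = [random.gauss(0, 1) for _ in range(N)]
period1 = [s + random.gauss(0, 1) for s in skills]

# Select the top 100 performers from period 1, then re-observe them in period 2.
top = sorted(range(N), key=lambda i: period1[i], reverse=True)[:TOP]
period2 = {i: skills[i] + random.gauss(0, 1) for i in top}

avg1 = sum(period1[i] for i in top) / TOP
avg2 = sum(period2.values()) / TOP
print(f"top-100 average, period 1: {avg1:.2f}")
print(f"same firms,     period 2: {avg2:.2f}")  # lower: the luck didn't repeat
```

The period-2 average is reliably lower, because selecting on extreme results also selects on extreme luck. Mistaking that luck for durable skill is Extreme Performance Bias in miniature.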
Probability bias is the last category we’ll be covering at the International Competitive Intelligence Conference in Bad Nauheim, Germany on April 22, 2016.
In the next post, we’ll look at one of the great and tragic case studies of cognitive bias in recent memory: the U.S. intelligence community’s disastrous performance in the run-up to Operation Iraqi Freedom.
Sign up for “How to Avoid Strategic Mind Traps” now!
