Mark Twain famously quipped, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”
Today’s consultants might describe the same phenomenon as confirmation bias in the decision-making process. It’s one type of cognitive bias, defined as “the tendency to make systematic errors in certain circumstances based on cognitive factors rather than evidence.”
Every leader has internal biases, some of them subconscious or hidden, which can create especially tricky traps that complicate sound decision making. Several books on the topic can be found in The Library in St. Pete, and we have written on various manifestations of the problem here:
- Our “Vintage Future” series (I, II, and III) takes a partially tongue-in-cheek look at the history of expert predictions in technology and investing.
- In “It’s easier to rationalize than to be rational,” we recommend strategies for overcoming the “compulsive yes man” inside our heads.
- “Inside the mind of great entrepreneurs” draws a distinction between corporate managers and entrepreneurs: one believes that to the extent he can predict the future, he can control it; the other believes that to the extent he can control the future, he doesn’t need to predict it.
Duncan Watts’ new book explores the same reality Twain observed, illustrated with humorous real-world stories Twain himself could have invented. Here we offer two brief excerpts, originally published as part of Christopher Chabris’ book review in The Wall Street Journal. One exposes common sense as “a shockingly unreliable guide to truth [that we rely on] virtually to the exclusion of other methods of reasoning,” while the other recounts the modern history of the Mona Lisa to demonstrate how common sense “is also inclined to conclude that individual successes (and failures) are determined by inherent qualities rather than by unpredictable circumstance.”
During World War II the U.S. military surveyed 600,000 soldiers for a research project. Two of its many findings were that better-educated soldiers suffered more psychological distress from their wartime experience than their less-educated comrades and that soldiers from rural areas were happier than those from urban backgrounds. These conclusions are hardly surprising: Effete intellectuals should have more trouble handling the stress of war, and farmers are more accustomed than city folk to harsh, army-like conditions. What could be more obvious? A grandstanding politician could easily denounce the entire study—or the entire enterprise of social-science research—as a massive waste of money on the basis of “discoveries” like these.
Wait, change that: The military study actually arrived at the opposite conclusions. The sociologist Paul Lazarsfeld—aiming to show how “common sense” justifications can be found for almost any conclusion—pulled the switcheroo in a 1949 review of the survey’s results. In fact, educated soldiers were less troubled than uneducated ones, and urban soldiers were happier than their rural counterparts. The real findings are just as easy to explain away as the fake ones: perhaps education equips us to cope with stress, and urbanites are more accustomed to living in close quarters.
Mr. Watts asks why the “Mona Lisa” is the most admired painting in the world today—why most people believe it to possess unique, timeless features that set it apart… Before the 20th century, the “Mona Lisa” wasn’t even the most popular painting in the Louvre. But in 1911 it was stolen, smuggled to Italy and exhibited widely before being returned to France, whereupon Marcel Duchamp defaced a reproduction of it and labeled his work with an obscene pun. The painting rocketed to fame, its pigments and brushstrokes unchanged. The “Mona Lisa” is the artistic equivalent of the investor who did nothing special until he got lucky a few years (or quarters) in a row and was fêted as a genius. Ecclesiastes told us that time and chance happeneth to all, but we easily forget.
Chabris acknowledges that it is “rarely practical to run the perfect experiment” before making a decision but insists we can be “more deliberative and reflective as we gather and analyze facts to inform our decisions.” When we over-rely on common sense alone, we risk “rejecting a more thorough effort to solve a problem and settling for an easy one.”