“Ignorance,” wrote Charles Darwin in 1871, “more frequently begets confidence than does knowledge.”
Darwin’s insight is worth keeping in mind when dealing with the current coronavirus crisis. That includes those of us who are behavioral scientists. Overconfidence—and a lack of epistemic humility more broadly—can cause real harm.
In the middle of a pandemic, knowledge is in short supply. We don’t know how many people are infected, or how many people will be. We have much to learn about how to treat the people who are sick—and how to help prevent infection in those who aren’t. There’s reasonable disagreement on the best policies to pursue, whether about health care, economics, or supply distribution. Although scientists worldwide are working hard and in concert to address these questions, final answers are some ways away.
Another thing that’s in short supply is the realization of how little we know. Even a quick glance at social or traditional media reveals many people expressing themselves with far more confidence than the evidence warrants. Legal scholar Richard A. Epstein of the Hoover Institution claimed an expertise in epidemiology that he does not have; his predictions were proven wrong within a week. The president’s son-in-law Jared Kushner, who’s reportedly directing the White House response, has no domain-relevant expertise or training—but does not let that fact stop him from contradicting and overruling U.S. health officials. Peter Navarro, White House trade advisor, believes his training in social science gave him all the tools required to assess medical science.
Less spectacular examples of overconfidence abound. Many commentators speak as though they just know what policy approach is best in the face of the coronavirus. Nobody is yet in a position to know. This is not to say that we should postpone decision-making until we know all the facts: decisions often need to be made with imperfect information. It is to say that we won’t know what happened and what worked until after the outbreak is over and the temporary measures have been lifted.
Frequent expressions of supreme confidence might seem odd in light of our obvious and inevitable ignorance about a new threat. The thing about overconfidence, though, is that it afflicts most of us much of the time. That’s according to cognitive psychologists, who’ve studied the phenomenon systematically for half a century. Overconfidence has been called “the mother of all psychological biases.” The research has led to findings that are at the same time hilarious and depressing. In one classic study, for example, 93 percent of U.S. drivers claimed to be more skillful than the median—which is not possible.
“But surely,” you might object, “overconfidence is only for amateurs—experts would not behave like this.” Sadly, being an expert in some domain does not protect against overconfidence. Some research even suggests that more knowledgeable people are more prone to it. In a famous study of clinical psychologists and psychology students, researchers asked a series of questions about a real person described in the psychological literature. As the participants received more and more information about the case, their confidence in their judgment grew—but the quality of their judgment did not. And psychologists with a Ph.D. did no better than the students.
Being a true expert involves not only knowing stuff about the world but also knowing the limits of your knowledge and expertise. It requires, as psychologists say, both cognitive and metacognitive skills. The point is not that true experts should withhold their beliefs or that they should never speak with conviction. Some beliefs are better supported by the evidence than others, after all, and we should not hesitate to say so. The point is that true experts express themselves with the proper degree of confidence—meaning with a degree of confidence that’s justified given the evidence.
Compare Epstein, Kushner, and Navarro’s swagger to medical statistician Robert Grant, who tweeted: “I’ve studied this stuff at university, done data analysis for decades, written several NHS guidelines (including one for an infectious disease), and taught it to health professionals. That’s why you don’t see me making any coronavirus forecasts.”
The concept of epistemic humility is useful to describe the difference between these two kinds of character. Epistemic humility is an intellectual virtue. It is grounded in the realization that our knowledge is always provisional and incomplete—and that it might require revision in light of new evidence. Grant appreciates the extent of our ignorance under these difficult conditions; the other characters don’t. A lack of epistemic humility is a vice—and it can cause massive damage both in our private lives and in public policy.
Calibrating your confidence can be tricky. As Justin Kruger and David Dunning have emphasized, our cognitive and metacognitive skills are intertwined. People who lack the cognitive skills required to perform a task typically also lack the metacognitive skills required to assess their performance. Incompetent people are at a double disadvantage, since they are not only incompetent but also likely unaware of it. This has immediate implications for amateur epidemiologists. If you don’t have the skill set required to do advanced epidemiological modeling yourself, you should assume that you can’t tell good models from bad.
There has never been a better time to practice the virtue of epistemic humility. This is particularly true for those of us with some kind of expertise—perhaps as behavioral scientists—who want to make ourselves relevant and useful. Our knowledge and expertise are required to deal with the challenges ahead. But we’re doing nobody a favor by pretending to know more than we actually do.
And it’s never been more important to learn to separate the wheat from the chaff—the experts who offer well-sourced information from the charlatans who offer little but misdirection. The latter are sadly common, in part because they are in greater demand on TV and in politics. It can be hard to tell who’s who. But paying attention to their confidence offers a clue. People who express themselves with extreme confidence without having access to relevant information and the experience and training required to process it can safely be classified among the charlatans until further notice.
If expertise does not protect against overconfidence, what does? Research in fact suggests one simple thing that everyone can do. It is to consider reasons that you may be wrong. If you want to reduce overconfidence in yourself or others, just ask: What are the reasons to think this claim may be mistaken? Under what conditions would this be wrong? Such questions are difficult, because we are much more used to searching for reasons we are right. But thinking through the ways in which we can fail helps reduce overconfidence and promotes epistemic humility.
Again, it is fine and good to have opinions, and to express them in public—even with great conviction. The point is that true experts, unlike charlatans, express themselves in a way that mirrors their limitations. All of us who want to be taken seriously would do well to demonstrate the virtue of epistemic humility.