Changing our minds is hard, even in the most favorable conditions. There’s the risk of looking inconsistent or like you lack conviction; if you’re a politician, a flip-flopper. But there’s more to it than that.
Changing your mind, more often than not, requires you to grapple with your own identity. Admitting that you were wrong feels personal. We have to face the fact that we’ve been walking around the world all this time believing in something that isn’t true. Even worse, we have to admit that we’re the type of person who walks around being wrong. We know what we think of other people who do that—ugh, how embarrassing!
And yet, how freeing it is to admit we were wrong or that we don’t know something. A weight suddenly lifted from our minds, like telling the truth after holding in a lie. But not only freeing, valuable too. No longer burdened by the need to be right, we have the chance to learn something new, and to better understand the world.
Psychologist Adam Grant wants to make that freeing feeling easier to come by and the rewards easier to reap. In his latest book, Think Again: The Power of Knowing What You Don’t Know, Grant investigates why we struggle to update our ideas and opinions and how we can get better at it. The book, he writes, “is an invitation to let go of knowledge and opinions that are no longer serving you well, and to anchor your sense of self in flexibility, rather than consistency.”
It’s the idea of flexibility and how to achieve it that I found most compelling in Think Again. As I read the book, I couldn’t help but reflect on the times I’d clung to an opinion past its expiration date or imagine what I might have learned from a debate, had I asked a question instead of hurling a rebuttal.
I wanted to learn more, so I spoke with Grant to find out how we can develop the ability—and flexibility—to rethink. We discussed how rethinking became a part of his identity, the lesson Daniel Kahneman taught him about being wrong versus having been wrong, what he’s had to rethink over the past year, and why he no longer thinks seeing both sides of an issue can alleviate polarization.
Our conversation has been edited for length and clarity.
Evan Nesterak: You were part of a group of college students who created a precursor to Facebook. Obviously, you didn’t pursue that idea, and we know what happened with Facebook. You write that that experience helped you see the value of rethinking, and since then it’s become part of your sense of self. Take me through that experience.
Adam Grant: I remember being a relatively early adopter of Facebook, the year after I graduated. It must have been ’04 or early ’05 that I joined. At first, it didn’t even occur to me that it was similar to some things we had been doing in our e-group [connecting incoming students at Harvard]. And then as it took off, I thought, You know, it seems like people are using this to connect with their future classmates, that’s exactly what we did. They’re also having the same kinds of conversations. Something feels familiar about this.
As Facebook skyrocketed, I started to realize that I had made a whole bunch of assumptions that I just never bothered to question. It was sort of stunning to me, looking back, that it never occurred to me that a college student could be an entrepreneur, even though I knew Bill Gates had gone and done that from the same campus. It never occurred to me that I could be an entrepreneur. It never occurred to me that this was technology that might be useful in a lot of schools, and for people who weren’t college students, and even for people who lived in the same town. I think that that was definitely an early seed planted to say, I need to get better at identifying my assumptions as a first step, and then being willing to rethink them.
Well, I hate to bring this up, but in Originals you mention you had the chance to invest in Warby Parker. So walk me through passing on these two ideas—what’s the gap here?
[Laughs] Well, Warby Parker was not my idea. I happened to have one of the founders in one of my classes. So I take the blame for not seeing the potential in the idea and in the cofounders.
I’ve taken two broad lessons from both of these experiences. One is that it is really easy to do your rethinking in the rearview mirror, to look back and say, Well, I should have been more open to, let’s say in the Warby case, the idea that maybe hedging your bets and having a backup plan is a sign of intelligence, as opposed to a sign of being noncommittal. Maybe taking a long time to name your company is not a signal that you’re a procrastinator, maybe it’s a clue that you really care about getting this right, and you’re busy testing 2000 different naming ideas, because you’re going to build a brand. In hindsight it’s so clear, but it’s hard to see it in the moment.
I think the other thing that jumps out to me is there are always people in these moments who challenge our thinking. What I’ve done in too many of these situations is dismiss them, because they didn’t agree with me.
In the e-group case, we shut down in the fall of ’99, my freshman year, just completely stopped using it, cold turkey. But then two years later, this is fall of 2001, one of my roommates, who wasn’t in the e-group (he came from Germany, and nobody knew he was going to be one of our classmates when we built the e-group), he said, You know, I would really love to build a portal where people could share party information, they could get in touch with their classmates, if they wanted to organize a study group, they could make sure that everyone knows about all the different clubs on campus. He was dreaming up something that looked very similar to what Facebook did when it launched. He did the rethinking of our e-group that I didn’t do, and I still couldn’t see it.
What I learned from that is if somebody sees an idea, or an opportunity, or forms an opinion that is different from mine, I shouldn’t default to the assumption that I’m right and they’re wrong. I should say, This is an interesting opportunity to learn something from someone who sees things differently from me, and I wonder if they know something I don’t. I guess it’s been a lesson in intellectual humility, hasn’t it?
Daniel Kahneman taught you a lesson about the importance of not being overly attached to an idea. Can you explain what happened and what you learned from him?
I gave a talk on some of my research on givers and takers. I didn’t realize that Danny Kahneman was in the audience. He is a living legend, and one of the great social scientists of all time. I’m doing this double take as I’m walking offstage, and Danny is there. He stops me, and he says, “That was wonderful. I was wrong.” His eyes twinkled as he said it, and he lit up.
Danny is not somebody who walks around beaming all the time, so I was struck by the reaction and intrigued by these two sentences that normally would contradict each other. Normally, what you expect people to say is, “That was wonderful, I was right.” Or, “Actually, you’re wrong. Let me tell you why.”
I ended up sitting down with him and asking him to explain this reaction. I said, I’ve seen this a couple times—I’ve seen you make predictions, people end up running the experiment and you see something that’s not what you expected, and you seem to really take joy in being wrong. The first thing he said—which I didn’t capture in the book, but I will tell you, because I think it’s an important distinction—he said something to the effect of, No one enjoys being wrong, but I do enjoy having been wrong, because it means I am now less wrong than I was before.
It was a lightbulb moment for me because it reminded me of what first got me interested in psychology—being a freshman in college and reading all these studies that contradicted my expectations.
But what’s different about Danny is he seems to do that even when his core beliefs are attacked or threatened. He seems to take joy in having been wrong, even on things that he believes deeply. And so I asked him about that—why, and how? On the why question, he said, Finding out that I was wrong is the only way I’m sure that I’ve learned anything. Otherwise, I’m just going around and living in a world that’s dominated by confirmation bias, or desirability bias. And I’m just affirming the things I already think I know.
On the how part, he said for him it’s about attachment. He thinks there are good ideas everywhere, and his attachment to his ideas is very provisional. He doesn’t fall in love with them, they don’t become part of his identity.
It’s that ability to detach and say, Look, your ideas are not your identity. They’re just hypotheses. Sometimes they’re accurate. More often, they’re wrong or incomplete. That’s part of what being not only a social scientist, but just a good thinker, is all about.
This past year crushed a lot of our assumptions about the world, particularly with the pandemic, protests, and politics. What’s been on your mind as you watched the year unfold? Did 2020 make you rethink some of the things you thought you knew about rethinking?
One thing I rethought in a major way while writing this book was really fueled by what was going on politically—the both sides idea. I came in assuming that the best solution to the polarization problem was to show people the other side.
I had a student a few years ago who had started a media company called Polar News. It was going to show the MSNBC and the Fox headlines side by side in a daily newsletter, so that we could burst filter bubbles and break out of echo chambers. And I was convinced we needed to do more of that—that the traditional media needed to do it, that social media needed to have algorithms that rewarded it.
I’ve completely rethought that. I now think that the both-sides perspective is not part of the solution, it actually exacerbates the polarization problem. That’s largely because it’s so easy for us to fall victim to binary bias, where you take a very complex spectrum of opinions and attitudes, you oversimplify it into two categories, and when you do that you know which tribe you belong to; the other side is clearly wrong and maybe bad too. It just locks people into preaching about why they’re right and prosecuting everyone on the other side for being wrong.
What I now believe is that we need to complexify. I think Peter Coleman’s research in the Difficult Conversations Lab is brilliant on this. I love the fact that he shows that you can take the same description of an issue like gun safety versus gun rights, and if you present the information as, Here are the arguments on the gun safety side, here are the arguments on the gun rights side, you can get just under half of people who disagree about abortion to agree to write and sign a joint statement together.
That sounded encouraging at first—like, wow, 45, 46 percent of people who violently disagree about abortion, after reading an article about the two sides of the gun issue, were able to get on the same page, good news. Except then I read that 100 percent of them were able to agree on and sign a joint statement if that exact same information about gun control was presented not as two sides of a coin, but through the many different lenses of a prism.
So the arguments are basically mixed up and you’re told there are a lot of nuances on this topic. Some people who are very pro gun rights are strongly in favor of universal background checks. Some people who are pro gun control are believers in the Second Amendment. Seeing that complexity, seeing the nuance, and saying I don’t have to just belong to one camp seemed to make people a little bit more open-minded and flexible in thinking for themselves.
That’s something I’ve rethought. I do not want to have both-sides conversations anymore. Whenever somebody says, here’s the other side, my first question is, Can you tell me what the third angle and the fourth look like?
Some of your research has touched on conflict between opposing groups. In one instance, you managed to do the impossible—get Red Sox and Yankees fans to like each other. Can you explain more about your work on reducing conflict among opposing groups?
I actually have some new data that didn’t make the book. Tim Kundro and I recruited gun rights and gun safety advocates. We gave them a version of what we did with the Red Sox and Yankees fans, so we asked them, importantly, not to do perspective taking. Not to imagine why the other side feels the way they do. But rather to do counterfactual thinking—to reflect on how, if their own life circumstances had been different, they might hold different views.
In the gun control condition, we said think about how you might have a different stance on guns if you’d grown up in a hunting family. For the gun rights activists, we said think about how your views on guns might be different if you had grown up in Columbine. That was enough to significantly reduce the animosity that they showed toward the other side, or toward the opposite extreme.
I think when we encounter people who disagree with us on charged issues, it is worth thinking, No matter how passionately I feel about a given issue, I could imagine having grown up in a family, or in a country, or in an era where, because of my experiences and the people that I knew, I might believe different things. That allows me to be open to rethinking my animosity toward people who believe those things. It allows me to recognize that their beliefs have the capacity to change, just like mine could have.