
Overconfidence Can Blindside Science and Society Alike. Here's How Not to Get Fooled

The tale of how the "backfire effect" ultimately, itself, backfired, and what scientists can learn from being wrong

An illustration of an iceberg with a prism of light flowing under it and the word Uncertain

Anaissa Ruiz Tejada/Scientific American

[CLIP: Voices read hate mail that Christie Aschwanden has received: “You should not write. You are no scientist. You are a foolish person.” “Your article could cause new breast cancer deaths.” “Get beyond your pathetic left-wing angst over the envirofascist lies.” “What are your qualifications? I looked up your bio and was totally unimpressed.” “You call yourself a ‘Science Writer’??!! Your article was all lies.” “I have no clue where you got your info on vitamins. There is no doubt in my mind that they help me in a multitude of ways! Bashing them for no good reason??” “Christie Aschwanden should not be ‘aloud’ to write for any magazine.... Maybe she has no friends; maybe that’s the real problem here.”]

[CLIP: Theme music]

Christie Aschwanden: I’m Christie Aschwanden, and this is Uncertain from Scientific American.


Today we’ve got a show in two acts exploring how studies can make us so overconfident that we lose sight of the underlying uncertainty.

Act 1: The Backfire Effect

Back in 2011 I gave a short talk at a journalism conference about what I had learned from hate mail. The bottom line, I said, was that if you tell people that they’re wrong about something they hold dear, they will immediately get defensive.

And if you’re a journalist like me, they will send you nasty letters like the ones you heard at the top of the show.

I based my argument on my own personal experience having written about things like climate change and cancer screening and, most controversially, the uselessness of multivitamins. But I also backed it up with science—specifically, some research by Brendan Nyhan, a political scientist at Dartmouth College.

Here’s how I explained it in the talk:

Nyhan studied what happens when you tell people that they’re wrong. And what happens is you go back and think about all the reasons you were right in the first place. And in the process of doing this, you become more sure of yourself.

This became known as the “backfire effect.”

Here’s Brendan Nyhan.

Brendan Nyhan: After I graduated from college, I worked as one of the founders of a nonpartisan fact-checking website called Spinsanity that was one of the original online fact-checkers. And we ran from 2001 to 2004. And that ended up putting us in the middle of the start of the war in Iraq and the aftermath of 9/11 and the conspiracy theories around that.

Aschwanden: Oh, yes, the 9/11 conspiracies. There were a lot of those!

Nyhan: Then you fast-forward to the end of the Bush administration—Barack Obama is running for president; the Obama Muslim myth starts spreading. That becomes a very salient misperception. The first two years of the Obama administration, we have the Affordable Care Act. So the death panels myth is a hugely influential policy misperception, and the Obama Muslim myth morphs into the birther myth.

Aschwanden: So people like you are trying to correct these misperceptions and not getting much traction.

Nyhan: People were trying to understand “Why are these so prominent?” and “Why are our efforts to fight these not working?”

Aschwanden: Your studies used undergraduates as test subjects. What was the setup?

Nyhan: Each of the studies involved a mock news article about a salient political misperception. The experiment was whether that mock news article randomly included corrective information debunking the claim made by the featured political elite or not.

Aschwanden: What kinds of misperceptions were you testing?

Nyhan: So, for instance, an example in the article is George W. Bush suggesting that his tax cut will reduce the deficit rather than increase it.

[CLIP: President George W. Bush Presidential Radio Address, 2001: “This past week I’ve been making the case for tax reductions. I’ve asked Congress to act quickly on my tax relief plan, so that Americans can face these uncertain economic times with more of their own money.”]

Nyhan: Which is a finding that—even his own economists held that his claims were not accurate.

Aschwanden: So one group would be told that, well, actually, the Bush administration’s own economists say the tax cut will increase the deficit by many millions of dollars.

Nyhan: That was an experimental variation: Are you being exposed to that corrective information or not?

Aschwanden: And you found that people exposed to the corrective information, what we might now call a fact-check, actually ended up becoming more sure of the misperception rather than correcting it.

Nyhan: The finding was that two of the five studies we conducted showed evidence of this backfire effect.

Our interpretation, when asked, was that people might be thinking about reasons they held a belief in the first place and coming to double down on them as a result of being corrected.

Aschwanden: I loved the backfire effect! It felt so true and resonated so much with my personal experience. I was positive that this finding explained everything in the same way that I was sure that the dress from our previous episode was white and gold.

Nyhan: That finding turned out to be a real brain worm for all of us. It resonated with people’s experiences, like yours, in reminding people of the times when they’ve struggled to convince someone who seemed even more sure of their original belief. And we can all think of examples like that ... and as a result, it just resonated, and as we told, then retold the story in the way humans do, the scientific nuance was lost.

Aschwanden: It felt so true—like it didn’t need nuance.

Nyhan: I think you’re an extremely careful scientific journalist. So it’s not a criticism of you in any way.

Aschwanden: Well, thank you. That’s very kind.

Nyhan: But it’s like, as I tell the story, and then you tell the story, and, right, and as it propagates out there into the world..., we totally lost control of how it was described over time, and it just spread everywhere. And, well..., there really wasn’t better evidence, right? This was the study I’d conducted. And I reported what I found. And I described my best interpretation of it from a scientific perspective.

Aschwanden: Let’s take a moment to step back and take a closer look at what’s happening here.

Brendan’s group got some really intriguing data, but those are just raw results. The human brain doesn’t think in data; we think in stories. So he and his colleagues did what all scientists do—they created a story to explain their data. To do that, they had to do some interpretation and bring human judgment to bear.

Let’s listen again to what Brendan said a moment ago:

Nyhan: Our interpretation, when asked, was that people might be thinking about reasons they held a belief in the first place and coming to double down on them as a result of being corrected.

Aschwanden: Notice that? He said “people might be thinking about reasons they held the belief.” Because he’s a careful scientist, Brendan was cautious here to say that this is a possible explanation.

But it’s really just a guess, and he’s pretty careful not to overstate it. Whether or not the result itself holds up, this explanation of it is a hypothesis. We can think of it as a “just-so story.”

Rudyard Kipling’s classic collection Just So Stories for Little Children provides fanciful explanations for things like how leopards got their spots and how camels got their humps.

Just-so stories are tales invented to explain observed phenomena, just like the explanations scientists present for their scientific results are invented to explain the data.

Until they’re tested, these stories are just best guesses. But because these explanations are created to perfectly fit the data, it’s easy to jump on the story and forget about all the uncertainty that surrounds it.

As it turns out, the backfire effect is a classic case of a very exciting first finding that we all jump on, and then, poof, it disappears in later studies.

Of course, it’s important to note that only two of the five original experiments found any evidence of the backfire effect. So there was a lot of uncertainty about it from the very first study.

Nyhan: Since then many other studies have been done by myself and my co-authors and by many other scholars, finding that backfire effects seem to be exceptionally rare.

Aschwanden: But by now the genie is out of the bottle, and those follow-up studies never get as much media attention as the first time, right?

Nyhan: Those unfortunately don’t get the same kind of play.

Aschwanden: I feel like this is a little bit meta, right? You were presented with evidence that your original finding on the backfire effect might not hold up, and instead of doubling down, you’ve actually opened up and said, “Wait a second; let’s revisit this. Let’s put all the evidence together.”

Nyhan: Yeah, and you know, I’m grateful to all the people who interacted with me in a good-faith manner. And I think that helps us all be better people in having these kinds of exchanges, right? It never became personal; it never became vitriolic. And that helped, I think, keep the focus on the science in a way that helped me, hopefully, you know, respond in a more, in an appropriate way.

Aschwanden: It’s kind of the idealized version of how scientists should behave. I can easily imagine a different universe where you double down, and you maybe give a TED Talk and write a pop-science book about the power of the backfire effect. Instead you published a paper titled “Why the Backfire Effect Does Not Explain the Durability of Political Misperceptions.”

Nyhan: It’s a great reminder that we don’t have all the answers. I think it’s really changed how I think about science and how I do science.

Aschwanden: How has it changed the way you operate?

Nyhan: I hope it’s helped me approach this question with more humility because I had many preregistered studies at that point.

Aschwanden: Oh, right, preregistered studies are experiments where you lay out and commit to a specific plan.

You commit to how you will carry out the study and analyze the results before you begin.

This is meant to reduce the temptation to fiddle around with the analysis once the results start coming in. So what have these studies found?

Nyhan: The findings were clear that correct information did tend to reduce perceived accuracy of false claims, somewhat, right? And I had plenty of preregistered studies in which I had found exactly that.

Aschwanden: So does the backfire effect really exist?

Nyhan: You know, I wasn’t finding evidence of pervasive or substantial backfire effects, either.

Aschwanden: Since that talk 13 years ago, I’ve thought a lot about the backfire effect and what it says about science and how we know what’s true.

One thing that’s become clear to me is that it’s really easy to get a result. You did a study, and you found this thing. It’s a lot harder to get an answer to the underlying question.

When I was giving that talk about the backfire effect, I was trying to explain why people resist changing their beliefs when they are offered a correction. I really wanted a universal answer. But that’s asking for a lot. Were we expecting too much from the backfire effect?

Nyhan: It came to stand in for why this social phenomenon exists, right? And no scientific research could bear the weight of the demand for monocausal explanations for complex social problems. And this one certainly couldn’t.

Aschwanden: We want a single answer that explains everything, but maybe there isn’t one. I like to joke that the most scientific answer to any question is “it depends.”

Nyhan: I think there’s a learning opportunity for everyone, certainly, you know, learning opportunity for me as a scientist—I’ve done my very best in my own research to follow up on this study, to not let it be the last word, to publicize the emerging body of evidence that better represents the scientific consensus as time has gone on.

And I think maybe most importantly I’ve tried to not be defensive about my own study because, you know, “What kind of a researcher on misinformation would I be if I wasn’t willing to change my mind when the facts change?”

Aschwanden: That’s really noble of you, but we’re all human. It’s easy to feel defensive. How do you move away from the desire to be right toward a goal of finding the right answer?

Nyhan: I’ve gone on to work on a number of studies with [political scientists] Ethan Porter and Tom Wood, who’ve done some of the most important follow-up research in this area. And we’ve worked together collaboratively … instead of having one of these adversarial scientific disputes where each side says the other is wrong, we’ve forged a path to try to move the research literature in a more informative direction.

Aschwanden: What lessons can we take from the backfire effect saga?

Nyhan: The single splashy study lends itself to media coverage, right? It’s what the big general-interest science journals look for. But sometimes studies are surprising precisely because they don’t correspond to prior knowledge.

Aschwanden: Should we be more skeptical of really surprising findings?

Nyhan: We should be cautious that they may be anomalous, right? And our study turned out to be anomalous. I don’t know of an error we made. It’s science. It’s—our findings are provisional and subject to being overturned.

Aschwanden: This harks back to the hidden moderators we talked about in our previous episode—those unremarkable decisions that researchers make that may end up having a big influence on the results they get.

Has this experience changed the way you approach study design?

Nyhan: There are a number of factors that I would do very differently now, with, you know, the knowledge of more than 15 years since we conducted those studies. I guess what’s important to take away is: it doesn’t have to be wrong in the sense of it was conducted fraudulently or incorrectly. It can just be one data point among many.

Aschwanden: So a lesson here is to be careful about putting too much weight on a single study. It’s easy to focus on the exciting or surprising finding and forget about all the other ones that may be less interesting.

Here a lot of people—including me!—overlooked the fact that from the very start, the backfire effect was an outlier—it only showed up in two of the five experiments.

Brendan, how has this experience influenced the way you evaluate studies?

Nyhan: We interrogate the science that we discuss in my classes. I encourage the students to take apart the methodology of the research, the assumptions that the authors make, the flaws in their experimental designs.... I try to foster the kind of critical scientific inquiry that I try to practice as a scientist in my teaching in the classroom.

Aschwanden: You’re a political scientist. How does an understanding of uncertainty factor into our political system?

Nyhan: We are coming into contact with lots of studies that are being offered as evidence for various kinds of policies and interventions. And I think learning about the uncertainty in science and the value of high-quality science, as well as the uncertainty inherent in the process, right, and being able to hold both of those in our heads at the same time is a really important part of hopefully being a leader in this country in the 21st century.

Aschwanden: So putting this all together, what does this tell us about science?

Nyhan: It’s an imperfect human process with many flaws. But it’s an incredibly powerful one for learning about the world. It’s the best way we know how to learn about the world. And I think we can keep what’s great about science while also remembering it’s highly provisional in nature. If we can strike that balance, it can do great things.

Aschwanden: Oh, I love your optimism!

Today’s episode of Uncertain is about the ways that studies can leave us overconfident. We just heard about how “just-so stories” can make us feel overly certain about results that are still a work in progress. And now we’ve arrived at ... Act 2: The Problem with Scientists.

Aschwanden: Sometimes studies get misleading results because of random error or weird samples or study design.

But sometimes science gets things wrong because it’s done by humans, and humans are fallible and imperfect.

No matter how much we like to think that science is impartial, the truth is: human preferences and values influence how science is done.

And that means cultural factors and embedded prejudices can sway the results that studies produce. Someone who knows a lot about this is Brandon Ogbunu.

Yeah, I know. This episode had a Brendan in our first act; now we’ve got Brandon. Don’t let it confuse you. Brendan is the political scientist. Brandon is a biologist.

Here’s Brandon.

C. Brandon Ogbunu: I’m an assistant professor in the department of ecology and evolutionary biology at Yale University and an external professor at the Santa Fe Institute.

Aschwanden: Brandon has given a lot of thought to scientific authority. He grew up with a lot of respect for scientists.

Ogbunu: I was raised with this appreciation for scientific, quote, unquote, “ideas,” and certainly people who did it, and I think that is because of the great things that scientists accomplish, and, you know, going to the moon and drugs for diseases and the engineering breakthroughs of the last century. And so, I had this very, very positive view of science.

Aschwanden: How did this play into your views on uncertainty? Was there a moment in your career or your education where you fully came to grips with the importance of uncertainty to science?

Ogbunu: Well, I’m cheating a bit here, but, I think every time I talk in the media, I have to talk about my mother because she’s the center of gravity for all of me. All of these ideas, like certainty or merit, or all these institutions, to me, never made sense, right?

And it’s because I was raised by an African American woman who is smarter than anybody in my department now. She just didn’t have the opportunity. So the notion that life gives people a chance based on merit and that you just work hard, work hard, and it will work out—that’s never been true, ever, right? Because I saw somebody brilliant and hardworking get nothing but barriers put up her whole life.

She knew what she was talking about more than most people. And again, challenges, challenges, challenges, challenges, challenges—I’ve seen her be right 100 times and still have high school administrators or landlords or whoever we were dealing with at the time be wrong, and they win.

Aschwanden: That’s soul-crushing. How did that influence your view of scientific authority?

Ogbunu: So the notion that, like, the world is supposed to work a certain way was never right with me, I think—and that’s tied to the African American experience in America.

Aschwanden: Right.

Ogbunu: It was, like, all of these voices of authority told us where we were supposed to be in society and what we were supposed to be doing and how we couldn’t go to school. And so that’s never worked. And I think, certainly the gospel of certainty, that’s another kind of authority, right? And I think—so my point is, I’ve never really entered science that way.

Aschwanden: So how do you think of science now?

Ogbunu: I think as I’ve matured, just as a person, and have become a, you know, a member of the scientific workforce, I think both my appreciation for science has grown and become much more nuanced—and I mean my love of it, in a very positive sense, and my understanding of its flaws and its major problems and pitfalls and challenges.

Aschwanden: So you love science despite its shortcomings.

Ogbunu: I appreciate, in many more ways, the power of it and what makes it beautiful. And I understand the challenges and the damaging aspects of, you know, the damage that it can cause.

Aschwanden: What do you mean when you say the damage it can cause?

Ogbunu: On one end, we mean that the products of science can be damaging, right? So, you know, we talk about the bomb; we’re talking about biological weapons.

Aschwanden: Right, the products of science can be very harmful, and in those two cases, they were actually created specifically to cause damage.

Ogbunu: The much more pernicious, but simultaneously more violent way, is the ideas.

Aschwanden: So you mean some of the actual ideas and theories that science puts out? I’m thinking here of eugenics.

Ogbunu: You can’t look at, like, the eugenics movement and say, “Oh, well, that was just a fringe thing.” Like, no, it wasn’t. That was the field. That was the way the mainstream thought.

Aschwanden: For instance, Thomas Jefferson…

Ogbunu: … was the preeminent polymath of his time. He was a technical and evidence-based and profoundly scientific thinker for his time.

Aschwanden: What is the lesson we should take from this?

Ogbunu: When the process, for whatever reason, doesn’t pay attention to the way social influences or biases or power are influencing the types of things we’re asking and how we’re asking them, science can get things shockingly wrong. It can lead us down really, really bad, incorrect paths that end up causing harm.

Aschwanden: How does science get these bad ideas to begin with?

Ogbunu: It can be because other interests, biases that you may not see—your own and society’s—infiltrate the process and steer the process in order to try to get to a product that we oftentimes find out was based on a broken idea.

Aschwanden: We think that science is objective, but you’re arguing that in fact, we’re bringing our human biases to bear here. Even the nature of the question we’re asking is not objective, right?

Ogbunu: The term bias doesn’t even necessarily mean bad or harmful. It means that you are a person..., a sentient individual who has experienced things on the Earth.... There is no way that is not going to influence the manner that you do your work, from me writing my manuscripts in English, right? Let’s just start there. Not only does that influence to whom I’m speaking, it influences the shape and the manner that I describe certain ideas because languages have different means of describing things.

Aschwanden: So you’re saying that scientists need to look for and acknowledge the limitations of their work?

Ogbunu: I report the limits of my work, for me. I think that is the responsible way to do science. But that stands out to people. And they’re like, “Wait a minute; what are you doing? Why would you be pointing out the flaws in your own work, Brandon? Is that imposter syndrome?” No, it’s not imposter syndrome. It’s me being a responsible scientist.

Aschwanden: I’ve heard a lot of talk within the scientific community in recent years about making it easier for scientists to correct course or issue retractions without undue penalty. Should we think of being wrong as the occasional price of being ambitious and creative?

Ogbunu: Oftentimes, it’s not about the proportion of times that you’re right. It’s what you are right about.... In fact, many of the people who are right about important things were wrong about many, many, many, many, many, many, many, many, many, many, many, many, many, many other things.

Aschwanden: OK [laughs], you’re saying there are a lot of cases where scientists have been wrong. And we tend to focus on the correct ideas and forget the ones we abandoned.

Ogbunu: I had this ambition to give a seminar where I talk exclusively about things I was wrong about and why. When I do that, frankly, that will be a sign of my eminence ...

Aschwanden: Because you can boldly profess your wrongness without losing your credibility.

Ogbunu: Admitting that you’re wrong and explaining what you got wrong is actually a demonstration of your power, knowledge and authority, and demonstrating the wrongness and what you’ve learned is precisely what makes science the greatest knowledge-creation instrument in the history of the universe.

Aschwanden: How do you balance wrongness with authority?

Ogbunu: Look, I’m an arrogant professor, like everybody else is. I’m not modest. I think I’m right about things. I’m not saying you should expect to be wrong or try to be wrong. But if you are, just say so and explain to us why you’re wrong and what we can learn because being wrong is a critical part to building the understanding.

Aschwanden: Does the public need a better understanding of the provisional nature of science and the way that being wrong is a natural part of the process?

Ogbunu: I believe it’s the only way science is going to survive as an institution—is if we start to actually lay bare and make transparent all of these dimensions of the way that it works.

Uncertainty is not a sign of a problem. Uncertainty is oftentimes a feature of the way we’re studying the natural world. Uncertainty will be a part of virtually every natural system in the universe. The question is: How do we change that uncertainty? How do we inform the parameters of that uncertainty?

Aschwanden: What’s your parting takeaway? What should the public know about uncertainty and science?

Ogbunu: So the idea is we need to disabuse ourselves of certainty even being a goal. The goal is understanding the world, and understanding can exist with accumulated uncertainties that science is kind of informing and tuning with our data.

Aschwanden: Let go of the need for certainty. That seems like good life advice, too.

[CLIP: Music sting]

Aschwanden: We’ve spent much of today’s episode exploring how easy it is for us to fall a little too in love with our ideas, which is problematic when they turn out to be wrong. And the history of science is full of ideas that were not quite right or even dead wrong.

But science has also illuminated many useful truths about the universe, and there are some things that scientists can do to point themselves in that direction.

Brian Nosek: The way that we phrase it is that the priority for researchers should be on getting it right, not being right. The notion that I have to be right means that I need to take a defensive tack when you find a flaw. The prioritization of getting it right means that if you say, “I think I see something wrong with your research,” I should say, “Really? What is it? Tell me, because my goal is to make sure it is right.”

Aschwanden: On the next episode of Uncertain, we’ll talk to scientists who are working on being less wrong.

Uncertain is produced by me, Christie Aschwanden, and Jeff DelViscio. Our series art is by Annaissa Ruiz Tejada. Our music is from Epidemic Sound.

Funding for this series was provided by UC Berkeley’s Greater Good Science Center as part of the Expanding Awareness of the Science of Intellectual Humility Initiative, which is supported by the John Templeton Foundation.

This is Uncertain, a podcast from Scientific American. Thanks for listening.

