
Counterarguments Are Critical to Debunking Misinformation

It’s no use simply telling people they have their facts wrong. To be more effective at correcting misinformation in news accounts and intentionally misleading “fake news,” you need to provide a detailed counter-message with new information – and get your audience to help develop a new narrative. Those are some takeaways from an extensive new meta-analysis of debunking studies published in Psychological Science, a journal of the Association for Psychological Science.

The analysis, the first conducted with this collection of debunking data, finds that a detailed counter-message is better at persuading people to change their minds than merely labeling misinformation as wrong. But even after a detailed debunking, misinformation still can be hard to eliminate, the study finds.

“The effect of misinformation is very strong,” said co-author Dolores Albarracín, professor of psychology at the University of Illinois at Urbana-Champaign. “When you present it, people buy it. But we also asked whether we are able to correct for misinformation. Generally, some degree of correction is possible but it’s very difficult to completely correct.”

The study was conducted by researchers at the Social Action Lab at the University of Illinois at Urbana-Champaign and at the Annenberg Public Policy Center of the University of Pennsylvania. In total, they examined 20 experiments in eight research reports involving 6,878 participants and 52 independent samples.

The analyzed studies, published from 1994 to 2015, focused on false social and political news accounts, including misinformation in reports of robberies; investigations of a warehouse fire and traffic accident; the supposed existence of “death panels” in the 2010 Affordable Care Act; positions of political candidates on Medicaid; and a report on whether a candidate had received donations from a convicted felon.

The researchers coded and analyzed the results of the experiments across the different studies and measured the effect of presenting misinformation, the effect of debunking, and the persistence of misinformation.

“This analysis provides evidence of the value of the extended correction of misinformation,” said co-author Kathleen Hall Jamieson, director of the Annenberg Public Policy Center (APPC) and co-founder of its project FactCheck.org, which aims to reduce the level of deception in politics and science. “Simply stating that something is false or providing a brief explanation is largely ineffective.”

The lead author, Man-pui Sally Chan, a research assistant professor in psychology at the University of Illinois at Urbana-Champaign, said the study found that “the more detailed the debunking message, the higher the debunking effect. But misinformation can’t easily be undone by debunking. The formula that undercuts the persistence of misinformation seems to be in the audience.”

As the researchers reported: “A detailed debunking message correlated positively with the debunking effect. Surprisingly, however, a detailed debunking message also correlated positively with the misinformation-persistence effect.”

However, Albarracín said the analysis also showed that debunking is more effective – and misinformation is less persistent – when an audience develops an explanation for the corrected information.

“What is successful is eliciting ways for the audience to counterargue and think of reasons why the initial information was incorrect,” she said.

For news outlets, involving an audience in correcting information could mean encouraging commentary, asking questions, or offering moderated reader chats – in short, mechanisms to promote thoughtful participation.

The researchers made three recommendations for debunking misinformation:

  • Reduce arguments that support misinformation: News accounts about misinformation should not inadvertently repeat or belabor “detailed thoughts in support of the misinformation.”
  • Engage audiences in scrutiny and counterarguing of information: Educational institutions should promote a state of healthy skepticism. When trying to correct misinformation, it is beneficial to have the audience involved in generating counterarguments.
  • Introduce new information as part of the debunking message: People are less likely to accept debunking when the initial message is just labeled as wrong rather than countered with new evidence.

The authors also included Christopher R. Jones, a former postdoctoral fellow at APPC and at the University of Illinois.

All data have been made publicly available via the Open Science Framework. The complete Open Practices Disclosure for this article is available online. This article has received the badge for Open Data.

Research reported in this article was supported by the National Cancer Institute of the National Institutes of Health (NIH) and Food and Drug Administration (FDA) Center for Tobacco Products (Award No. P50CA179546). The content is solely the responsibility of the authors and does not necessarily reflect the official views of the NIH or the FDA.

Comments

Interesting. Provide “a detailed counter-message” while omitting “detailed thoughts in support of the misinformation.” In other words, make a one-sided argument. Sounds more like an age-old propaganda tactic than the result of scientific research.

I wonder how much taxpayer funding went into this?

I think that perhaps you misunderstand the point. The danger in overemphasizing the misinformation, which evidence clearly demonstrates to be misinformation despite popular beliefs, is that multiple studies consistently showed that people stop further encoding once they find a “familiar” statement. Rather than change their misconception, people end up believing more strongly in the misinformation. Thus, the advice is not to give a one-sided message at all! You *want* to activate the misinformation, but you want to minimize the potential for a backfire effect and maximize the chance that people will read on to the evidence that favors the correct conclusion. If there is a propaganda issue at all, it is with people promoting one-sided arguments in favor of the misinformation, with NO MENTION of the alternative information.

Although I do not agree with your snarky comment regarding funding (I believe this topic is hugely important and the funding should be increased), I do agree with your chief point. Yes, this does seem to be a poor solution to the problem. I am facing this issue in my classroom, and when I try to get the proponents of misinformation to see the fallacies they are committing, or ask them for the evidence for their claims, I believe I come across as prejudiced as to the “truth.”

As an addendum: I have done much work in the area of reducing psychological misinformation and misconceptions. I have received ZERO funding and have spent every free day (all summers and winter breaks, working 60+ hours per week year round) on this. In addition, I have made ZERO money from publications. Everything is in professional journals, which pay nothing to their authors. I believe the idea that research is tied to money applies only to a relatively small percentage of academic work.

Well said, Annette; and, ‘more power to your elbow’…. 🙂

As a public servant, I’ve seen less acceptance of facts that “government” brings to the debate. In particular, there is the emotional draw of lived experience vs. data. Is this a bridgeable divide?

I wonder if one could bring about a convergence of perspective among people on opposing sides of the ideological divide. I believe the research is based largely on replacing fake news/information with real news/information. But then much of the conflict is about mistrust: it’s My Truth versus Your Truth. That is where, I guess, we need to do much more to establish a secular common ground that could accommodate both “Truths.”
