Stephan Lewandowsky is a cognitive psychologist at the University of Western Australia. His research investigates memory and decision making, focusing on how people update information in memory.
We asked Stephan Lewandowsky questions based on his recent paper on misinformation, published in the December issue of Psychological Science in the Public Interest.
The report, “Misinformation and Its Correction: Continued Influence and Successful Debiasing”, is co-authored by Ullrich Ecker of the University of Western Australia, Colleen Seifert and Norbert Schwarz of the University of Michigan, and John Cook of the University of Queensland.
Below is part 2 of 2:
Do mainstream media outlets care about retracting misinformation?
In my experience, sadly, not always. Some media outlets are better than others, but some act quite irresponsibly, with far-reaching consequences: There is fairly good data to suggest that, overall, viewers of Fox News are the most misinformed across a range of crucial issues, whereas listeners of National Public Radio are the least misinformed.
Do you think campaign finance reform laws would decrease political misinformation?
I think political misinformation exists and is disseminated for political reasons. Campaign finance reform would likely change the dynamics of politics because politicians would be less beholden to special interests. However, politicians would still be politicians, and there would still be an incentive to “spin” information. The character of the misinformation might change, but I am not sure campaign finance reform would abolish it.
Is there a correlation between misinformation and education?
Not necessarily. In fact, when it comes to global-warming misinformation, there are data to suggest that education can have an ironic effect. Specifically, for Republicans, increasing education translates into a decreasing concern with climate change and a greater willingness to accept misinformation over the true state of the science—so worldview trumps facts, and education can increase that disparity.
Are people willing to check the facts on their own political party? Or do people want to hear what they want to hear?
There is evidence that people are quite prone to uncritically accept information from sources sympathetic to their worldviews or political leanings. However, this is not universally the case, and some research suggests that facts do matter—although the particular work I am thinking of was done in Canada, and Canadian society is perhaps less polarized and less dominated by ideology than the United States.
Is it possible to change people’s beliefs when they’re so loyal to their political affiliations?
It’s tough but sometimes it can be done. For example, people can be receptive to corrective information after they’ve had an opportunity to affirm their worldviews: If people are given an opportunity to explain their basic values (e.g., caring for others, or free enterprise, or whatever) and report a situation when they felt really good about exercising those values, then they are less likely to block out corrective information.
Can we ever fully retract previous misinformation? Is there a certain amount of repetition that needs to take place in order to retract misinformation?
Repetition is the key to disseminating information, which is why politicians tend to stick to a few talking points. The same is true for corrections: the more often something is corrected, the more likely the correction is to “stick.” In addition, to really eradicate misinformation it is important to provide people with an alternative explanation for why the information was false. For example, jurors rely on tainted evidence in court even when admonished by the judge to disregard it—unless they are told that the evidence was planted by a corrupt prosecutor or police officer. You have to explain why the initial false information was disseminated in the first place to enable people to let go of it.
View Part 1