Stephan Lewandowsky is a cognitive psychologist at the University of Western Australia. His research investigates memory and decision making, focusing on how people update information in memory.
We asked Stephan Lewandowsky questions based on his recent paper on misinformation, published in the December issue of Psychological Science in the Public Interest.
The report, “Misinformation and Its Correction: Continued Influence and Successful Debiasing”, is co-authored by Ullrich Ecker of the University of Western Australia, Colleen Seifert and Norbert Schwarz of the University of Michigan, and John Cook of the University of Queensland.
Below is Part 1 of 2:
Your paper indicates that social networking is a contributor to misinformation. Do you think that social media can also act to counter misinformation?
In principle, yes. And indeed there are some terrific science blogs with large numbers of Twitter followers (e.g., skepticalscience.com) that have made it their mission to combat misinformation in specific arenas, such as climate science.
Do you think that social media can play a role in fact checking?
Perhaps. The biggest issue in all of the new media is the issue of trust: Whom does one trust, and why? And how can it be ensured that people trust reliable sites rather than those that, for one reason or another, disseminate information that is incorrect or misleading? We are only just beginning to understand these processes, and a lot of research remains to be done before we can answer this question.
How can people educate themselves to know what is real and what is misinformation? What are some reliable fact-checking sources?
When it comes to scientific or medical information, the easiest way is to use Google Scholar, rather than plain Google, because Google Scholar filters out most sources other than the scientific literature. So the information you get from Google Scholar is far more reliable. In the political arena, things are more difficult, although some websites dedicated to fact checking (e.g., http://www.politifact.com/) can help readers decide what’s true and what isn’t.
Do you think that some of us are just too entrenched in our own political and social belief systems to ever acknowledge that we possess misinformation? If so, is there any way such beliefs can be changed?
This is an interesting question: On the one hand, most American voters know that they are being misinformed by politicians, and they deplore that. On the other hand, people have great difficulty differentiating true information from information that is false or misleading. Moreover, when attempts are made to correct this information, people’s false beliefs sometimes become even more entrenched. This is known as a “backfire” effect, and it can arise when people are presented with information that challenges their worldviews.
Does the messenger play a role in whether something “sticks”?
Absolutely. People are far more responsive to messengers who share their cultural values—so Conservatives are more likely to be convinced by fellow Conservatives and Liberals by Liberals, and so on.
Do you think brands (like in the case of Listerine) should be legally made to put out continuous messages to retract their misinformation?
Yes. If advertisers make seriously misleading claims, they should be corrected.