The resource draws from research published in APS journals Psychological Science, Current Directions in Psychological Science, and Psychological Science in the Public Interest, among other sources, along with interviews with several psychological scientists.
The global impact of misinformation has reached damaging new heights, fueled in recent years by a volatile mix of ingredients that are of growing concern to psychological scientists: win-at-all-costs tactics in partisan politics, digital media’s engagement-driven newsfeed algorithms, and major public crises, such as COVID-19 and its long tail of human and economic harm, that have prompted as much clashing over “individual liberties” as unity around collective challenges. Although some spreaders of misinformation are merely uninformed, others are state sponsored and chilling in their tactics. The Russian government, for example, has disseminated faked videos billed as “fact-checks” of supposed falsehoods in its brutal war against Ukraine, while effectively criminalizing accurate reporting on the war.
A new APS white paper, Countering Misinformation With Psychological Science, attempts to slow misinformation’s spread by providing science communicators and policymakers with the knowledge to effectively debunk it, thus improving ordinary individuals’ ability to reconsider and reject it—ideally, well before it takes root. Published in April, this resource brings together the best and most recent psychological science research on misinformation published in the APS journals Psychological Science, Psychological Science in the Public Interest, and Current Directions in Psychological Science, along with other sources.
Online environments are often designed to discourage users from considering the quality of the information they consume and share. Misinformation that feeds into our cognitive blind spots and biases can be tricky to undo, but psychological science provides a number of evidence-based interventions. In The Demon-Haunted World, famed astronomer and author Carl Sagan outlined what he called a “baloney detection kit,” simple steps that people can follow when evaluating claims. In the same spirit, APS hopes that this “misinformation prevention kit” will help communicators’ messages rise above the din, foster critical thinking, and truthfully inform the public.
A review of the state of modern misinformation and its impact on vaccine hesitancy and resistance
- Rumors, conspiracy theories, and deceptive propaganda have long presented twisted views of reality to the public. The speed and spread of modern misinformation have significantly weakened the public’s ability to tell truth from fiction and act accordingly.
- Social media platforms and search engines are for-profit enterprises, primarily designed to create revenue rather than to inform the public. To maximize profits, online platforms manipulate audiences into continuously engaging with content without regard for the accuracy of the content being consumed.
- Online environments have amplified harmful behaviors that already existed in media and human psychology more broadly. Preliminary news reports generated by the 24-hour news cycle, which are often incomplete and unverified, can spread rapidly online and become fixed in the public imagination, rendering them resistant to correction. One-sided “echo chambers” are created by the content-recommendation algorithms common on social media platforms combined with users’ preferences to engage with people who share their beliefs.
- In self-contained ecosystems of information, even highly uncommon beliefs are presented without opposition, leading to the perception that a sizable number of other people share the same views. This further enables the spread of misinformation through the development of “multiple realities” in which distortions that cater to people’s wishful thinking and political biases may be accepted as objective truth on the basis of little or no evidence.
- Political forces may knowingly exploit psychological blind spots to intentionally sow disinformation, a form of misinformation that is created with malicious intent to harm an individual or group, or to sway public opinion toward a particular political ideology.
- Whether drawn from misinformation or disinformation, inaccuracies accepted as facts can be staggeringly harmful at both the individual and the societal level, pushing people to become even more politically polarized, inspiring conspiratorial thinking, and casting doubt on scientific consensus.
- Vaccine misinformation is not a new phenomenon. From polio to smallpox, as new vaccines have emerged, so too have anti-vax voices. But countering this historic trend has taken on new urgency as anti-vaccination activists have attempted to undermine acceptance of vaccines and vaccine mandates throughout the COVID-19 pandemic. The persistence of modern anti-vaccination misinformation demonstrates how resistant it can be to correction.
Programs on Misinformation at APS 2022
Psychological scientists will address various issues related to misinformation in symposia, flash talks, and poster sessions at the 2022 APS Annual Convention in Chicago, May 26–29. Sessions will cover factors associated with the tendency to spread misinformation on social media, beliefs in election fraud following the 2020 U.S. election, how magicians use misinformation to create false memories, relations between intellectual humility and misinformation, and much more.
Factors that can make people vulnerable to misinformation
- Societal factors include the structure of online environments, which encourage users to consume and share content. The most shareable media often capture attention by focusing on negative emotions such as fear, disgust, and moral outrage. Without knowledgeable gatekeepers, opinion can appear as fact, and misinformation can be both monetized and weaponized. Misinformation is also “sticky,” which limits the effectiveness of retractions and other attempts to debunk false information once a person has accepted it as accurate.
- Individual factors include age: users 65 and older view and share the majority of misleading “fake news” on social media platforms like Facebook and Twitter. Across demographics, people tend to assume that information is accurate, to overestimate their own knowledge, and to prioritize information about potential hazards. One’s worldview can play a role as well. For instance, people with “science-skeptical” attitudes are more motivated to reject scientific consensus, such as the existence of human-caused climate change, on ideological grounds.
Strategies for preventing and countering misinformation
- Policymakers could help create an “internet for citizens” by regulating online environments in ways that empower users to control their digital lives, maintain privacy, and more easily determine the quality of sources.
- “Prebunking” can help prevent false beliefs from taking root in the first place by “inoculating” an audience against misinformation before exposure. Forms of prebunking include:
- Tagging content that may contain misinformation with warnings
- Promoting skeptical consumption of media by educating the public about the prevalence and effects of misinformation, as well as how to evaluate the accuracy of sources more effectively
- Informing the public about manipulative strategies used to spread political disinformation
- Debunking inaccuracies once an audience has accepted them as fact is difficult, but these steps can help:
- Retractions should be straightforward and focused and provide audiences with a new narrative of the events in question to supplant their existing beliefs.
- Retractions should be repeated to counteract the familiarity effect. Information that is more familiar feels more true.
- Messages should be targeted to specific audiences. Rather than challenging the audience’s worldview or other beliefs, corrections should present the new narrative in a narrowly focused way that affirms the audience’s existing values.
Targeting behavior directly can be particularly effective when the desired behavioral change conflicts with an audience’s worldview, because it doesn’t require the audience to alter beliefs that misinformation may have shaped. “Choice architecture” uses psychological science to design the presentation of options in ways that make certain behaviors easier to follow through on. When using these interventions to counter misinformation, researchers advise transparency, so that audiences remain active, autonomous participants in their own education and decision-making. Strategies include the following:
- Technocognition uses cognitive science to introduce additional “friction” to online environments, intentionally slowing the rate of communication so that people have time to consider the quality of the information they are sharing.
- Nudges can range from passive design choices, like putting graphic warnings on cigarette packages, to proactive public-education campaigns. Nudges are generally built into the environment in which a decision will occur, automatically exposing the target audience to the intervention. For example, simple reminders to consider the accuracy of headlines before sharing content online have been shown to reduce the spread of misinformation.
- “Boosts” encourage the public to become “choice architects” of their own environments by equipping themselves with the knowledge and skills needed to make more reasonable decisions about which information to trust and share. General skill building, such as education on how to accurately interpret statistics, can also reduce susceptibility to misinformation across domains.
This article was initially published in the print edition of the May/June 2022 Observer under the title, “Above the Din.”