Months before pharmaceutical companies had even announced they would be releasing a COVID-19 vaccine, many psychological scientists were preparing for the onslaught of vaccine falsehoods that they knew would emerge on the internet.
Already experimenting with a novel method to help people distinguish truth from lies—mainly political and scientific in nature—two researchers began working with global health authorities to prepare for war against anti-vaccine campaigns.
Their invention, an online game called Go Viral!, is the latest initiative to effectively inoculate people against misinformation. Go Viral! challenges players to create deceptive social media posts related to COVID-19. The aim is to help players recognize the tactics that extremists and fraudsters use to mislead people. It is, in essence, a vaccine against viral misinformation. With promotional help from the United Nations and the World Health Organization (WHO), the game creators have drawn more than 1.5 million players since launching it in October 2020.
“We know fake news spreads like a virus on the internet,” said University of Cambridge social psychologist Sander van der Linden, a cocreator of Go Viral!, in an interview with the writer. “So we want this psychological vaccine to spread like that, too.”
Van der Linden is among a group of social scientists testing creative ways to bridle the pandemic of misconceptions, lies, and conspiracy theories. Their work escalated a few years ago, when policymakers began blaming “fake news” on Facebook for swaying election results.
But now their research is a matter of life and death. Social media gadflies and vaccine skeptics have painted COVID-19 vaccines as a trigger for infertility and miscarriage, a conspiracy to wipe out Black people, and billionaire philanthropist Bill Gates’s secret plan to inject tracking chips into our bodies. The result has been a persistent tide of vaccine avoidance, severe illness, and death.
Rise of the resistance
Optional vaccines have historically generated low uptake rates. Fewer than half of American adults, for example, receive a seasonal flu shot, according to the U.S. Centers for Disease Control and Prevention. In Europe, flu vaccination rates among the elderly—the most vulnerable population—average about 45% (Rizzo et al., 2018). Fewer than 60% of American adolescents are fully immunized against the sexually transmitted human papillomavirus (Pingali et al., 2021).
But until recently, vehement resistance to vaccines remained relatively limited. A 2018 research review in Psychological Science in the Public Interest showed the number of people who actively refused vaccines to be small and static. APS Fellows Noel T. Brewer (University of North Carolina), Gretchen B. Chapman (Carnegie Mellon University), and their coauthors said that failure to vaccinate stemmed more from neglect than outright resistance. They cited behavioral nudges, such as reminders sent from the doctor’s office, as the best way to get people to follow through (Brewer et al., 2017).
But deepening political polarization, growing mistrust of government, and weak health policy combined to change attitudes about immunization as the pandemic hit, Chapman said.
“Something that’s gone wrong for many decades is lack of prioritizing the public health system,” she explained in an interview. “It was under-resourced, with no long-term strategy for how to combat a pandemic when it comes. And as a result, there’s not very high trust in the public health system once something does happen.”
Add to that the seemingly conflicting messages that emerged as scientists scrambled to understand a novel contagion, with people left to choose their preferred sources of information, Chapman said. Unfortunately, many of those preferred sources are divorced from science.
From the start of the COVID pandemic, false and misleading health information has flooded social media platforms and politically aligned cable news channels. A report published in BMJ Global Health in the pandemic’s infancy concluded that falsehoods about the disease had reached far more people than in any previous public health crisis (Li et al., 2020).
APS Fellow Dan Romer, research director at the University of Pennsylvania (UPenn) Annenberg Public Policy Center, is among the social scientists who have illuminated misinformation’s impact on health behaviors. In analyzing two waves of survey data on 840 people, Romer and his UPenn colleague Kathleen Hall Jamieson found that, by July 2020, 38% of respondents believed that China created the coronavirus as a bioweapon, 24% believed that U.S. health authorities exaggerated the virus’s danger to discredit then-President Donald Trump, and 17% believed that pharmaceutical companies invented the virus to boost sales of drugs and vaccines. The researchers also found that believers in those theories showed more reluctance to adopt preventative behaviors, such as wearing a mask or getting a vaccine (Romer & Jamieson, 2021b).
Reasons for rejecting facts
Scholars such as APS Fellows Stephan Lewandowsky (University of Bristol) and Karen Douglas (University of Kent) have built profiles of the science skeptics who question climate change, reject the theory of evolution, and believe childhood vaccines cause autism. They point to several psychological motivations for rejecting scientific consensus: The science may threaten skeptics’ ideologies, comfort, sense of safety, or identity.
“Rather than behaving like cognitive scientists—examining evidence evenly with a goal to obtaining the most accurate approximation of objective reality—people sometimes behave like cognitive lawyers, appraising evidence in a biased way with a view to reaching a preferred outcome,” Matthew J. Hornsey (University of Queensland) wrote in a 2020 article for Current Directions in Psychological Science. “From this perspective, the question is not ‘Why would people reject the science?’ but rather, ‘Why would people want to reject the science?’”
Researchers have also found that people resist efforts to correct their false beliefs. What’s more, skepticism toward science has proved remarkably unmovable, as a group of researchers reported in a review published in January. Research suggests that misinformation can continue to sway people’s thinking even after they accept a correction as true—a phenomenon known as the continued influence effect (Ecker et al., 2022).
Scientists are now experimenting with tools designed to help people consider tweets, posts, and headlines before accepting them as factual and sharing them with others.
Among the pioneers in this area are psychological scientist Gordon Pennycook at the University of Regina, Canada, and management scientist David Rand at the Massachusetts Institute of Technology. In two studies that included more than 1,700 participants, Pennycook, Rand, and colleagues acquired a list of 15 false and 15 true headlines related to COVID-19 and presented them to participants as Facebook posts. They found that asking participants to rate the accuracy of the information could make them think twice about what they share on social media (Pennycook et al., 2020).
A preregistered replication of that research showed the accuracy nudge to have weaker effects than what the original study showed (Roozenbeek et al., 2021). But Pennycook, Rand, and colleagues have expanded their toolkit of accuracy prompts and are testing the prompts across a variety of demographic groups (Epstein et al., 2021).
Prebunking through gaming
Other scientists have been experimenting with the allure of online games and apps to test what they call the “prebunking” approach, borrowing from Louis Pasteur’s principle of vaccination. Just as vaccinations introduce weakened, harmless versions of pathogens to stimulate the body’s immunity against them, these games present people with small doses of disinformation to help them better spot it and avoid sharing it. In 2020, psychological scientist John Cook, an assistant professor at George Mason University and founder of the nonprofit organization Skeptical Science, garnered widespread attention with his game Cranky Uncle, designed to inoculate players against climate misinformation. Players take on the role of a cantankerous science-denying uncle and are exposed to some of the flawed logic used to reject scientific consensus. The United Nations Children’s Fund recently sought Cook’s expertise to help develop the Vaccine Misinformation Management Field Guide. That document aims to help countries develop national action plans to counter falsehoods and build demand for immunizations.
Behavioral Science Boosts Vaccination Uptake Among Long-Term Care Staff
A report developed by a team of psychological scientists, including late APS Fellow Sigal Barsade, helped boost COVID-19 vaccination uptake among employees of U.S. long-term care facilities. The COVID-19 Vaccination Uptake Behavioral Science Task Force Final Report, released in February 2021, recommended focusing efforts on the “movable middle,” among other things, by making vaccination easy, building trust in vaccine safety, and offering incentives for vaccination.
The report was born out of concern over a staff vaccination rate of just 37.5% during initial on-site vaccination clinics at assisted living communities and nursing homes through the Centers for Disease Control and Prevention’s Pharmacy Partnership for Long-Term Care Program. “It is clear that there is no one size fits all solution for addressing the issue of vaccine hesitancy,” the report noted. “Rather, interventions must be multiple, layered and deployed at a local level, taking into consideration the context of the organization and its employees.”
Nationally, the nursing home staff vaccination rate rose to about 84% by January 30, 2022, according to data reported by the Kaiser Family Foundation.
The report was presented to Lee Fleisher, chief medical officer at the Centers for Medicare & Medicaid Services and professor of anesthesiology and medicine at the University of Pennsylvania’s Perelman School of Medicine. In an email to APS, Fleisher described the report as “very helpful in informing our strategies for messaging to nursing home staff and residents.”
Barsade, who researched organizational culture at the Wharton School, the business school of the University of Pennsylvania, died of glioblastoma on February 6, 2022, at the age of 56. A pioneer in the study of how emotions shape workplace culture and affect organizational performance, she spoke at the 2017 APS Annual Convention in a presentation on emotions in the workplace and contributed articles to APS journals including Current Directions in Psychological Science (“Group Affect: Its Influence on Individual and Group Outcomes,” 2012) and Psychological Science (“Debiasing the Mind Through Meditation: Mindfulness and the Sunk-Cost Bias,” 2013).
Other APS Fellows who contributed to the report include Angela Duckworth (a co-chair on the task force, along with Barsade), Gretchen Chapman, Robert Cialdini, Tom Gilovich, Adam Grant, and Barry Staw.
Van der Linden and his colleague Jon Roozenbeek at Cambridge have worked closely with the Dutch media company DROG to create other inoculation-type games. In 2019, they released Bad News, a game in which players earn badges for using such disinformation tactics as discrediting experts, arousing anger and fear, and touting conspiracy theories. Exposure to these strategies helps people better spot and avoid sharing the fake news they encounter on social media, the researchers say.
Van der Linden and Roozenbeek have closely tracked Bad News’s effectiveness. In one experiment involving more than 14,000 players, participants’ trust in the fake information fell by an average of 21% after completing the game. The effects held across age, education, gender, and political persuasion. Those who were most susceptible to falsehoods before playing the game benefited most from the “inoculation” (Roozenbeek & van der Linden, 2019).
Publicity about the game sent it spreading to millions of players around the world. The promising results earned Bad News the 2020 Brouwers Trust Prize from the Royal Holland Society of Sciences and the University of Florida’s Prize in Public Interest Communications. The U.K. government has translated the game into 20 languages in partnership with media literacy organizations around the world.
A year after the release of Bad News, the team rolled out Harmony Square, an online game in which players are recruited as a “chief disinformation officer” and must use tactics such as inflammatory social media posts to sabotage elections in a peaceful town. It also has been shown to reduce susceptibility to political misinformation in its users (Roozenbeek & van der Linden, 2020).
Now van der Linden and Roozenbeek are focused squarely on COVID-19 fictions. For Go Viral!, they’ve collaborated with the U.K. Cabinet Office and Stop the Spread, WHO’s campaign against coronavirus fallacies. The free online game exposes players to three manipulation techniques commonly used in COVID-19 misinformation: emotionally charged language, testimony from fake experts, and conspiracy theories.
The researchers conducted two large-sample studies to test the game’s efficacy. In the first study, 1,771 participants filled out surveys both before and after playing the game. Players rated the manipulativeness of tweets about COVID-19 (half of which contained false information and half of which were valid) on a scale from 1 to 7. Players rated misinformation tweets as significantly more manipulative after playing the game. Meanwhile, their ratings for the factual tweets remained steady before and after playing.
The second study was a preregistered randomized controlled trial, run in three languages: English (with participants from the United Kingdom), French, and German. The scientists recruited participants using the crowdsourcing research tool Prolific Academic. Some of the participants played Go Viral!. Another group viewed a set of infographics from #ThinkBeforeSharing, a campaign developed by the United Nations Educational, Scientific, and Cultural Organization to battle COVID disinformation. A third group, the control group, played the popular game Tetris.
The researchers followed up a week later with some of the participants to see if the “inoculation” effect persisted. They found that playing Go Viral! and reading the infographics significantly improved people’s ability to detect manipulative content about COVID-19 as well as their confidence in doing so. For the Go Viral! game (but not for the infographics), these effects persisted for at least a week. Playing Go Viral! also reduced participants’ self-reported willingness to share COVID-19 misinformation with people in their network (Basol et al., 2021).
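The first study’s pre/post design amounts to a paired comparison: each player rates the same kinds of tweets before and after playing, and the inoculation effect shows up as a shift in ratings for misinformation tweets while ratings for factual tweets hold steady. Here is a minimal sketch of that comparison in Python, using hypothetical ratings rather than the study’s actual data:

```python
from statistics import mean

# Hypothetical 1-7 manipulativeness ratings from five players,
# before and after playing (illustrative values, not study data).
misinfo_pre = [3.1, 2.8, 3.5, 3.0, 2.6]
misinfo_post = [4.4, 4.0, 4.9, 4.2, 3.8]
factual_pre = [2.2, 2.0, 2.4, 2.1, 2.3]
factual_post = [2.3, 2.0, 2.3, 2.2, 2.2]

def shift(pre, post):
    """Mean per-player change in rated manipulativeness."""
    return mean(b - a for a, b in zip(pre, post))

# An "inoculation" effect appears as a positive shift for misinformation
# tweets, with the shift for factual tweets staying near zero.
print(f"misinformation shift: {shift(misinfo_pre, misinfo_post):+.2f}")
print(f"factual shift:        {shift(factual_pre, factual_post):+.2f}")
```

A positive shift for misinformation tweets alongside a near-zero shift for factual tweets mirrors the pattern the researchers reported; the published analysis, of course, used inferential statistics over 1,771 participants rather than a simple difference of means.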
At APS 2022: More on the Science Behind COVID-19 and Vaccination Uptake
The U.S. Centers for Disease Control and Prevention (CDC) organized a 3-hour panel and workshop on human behavior and promoting vaccination uptake at the 2022 APS Annual Convention in Chicago. Speakers included Neetu Abad, Elisabeth Wilhelm, and Chris Voegeli from CDC, Noel Brewer (UNC-Chapel Hill), Gretchen Chapman (Carnegie Mellon University), and William Klein (U.S. National Cancer Institute). Following the panel, participants engaged in a simulation to translate psychological science to pandemic response.
Two other convention events were linked with APS’s Global Collaboration on COVID-19: “COVID-19 and the Workplace: Interdisciplinary Insights on the Shecession, Essential Workers, and the New Normal,” a symposium organized by Adrienne R. Carter-Sowell (University of Oklahoma), and a special address titled “Understanding Processes of Communication Leading to Behavioral Change: The Case of COVID-19 and Other Infections in the United States” by Dolores Albarracín (University of Pennsylvania).
But not everyone can be reached with an online game, Romer said. He and Jamieson have found in their research that people drawn to conservative media are vulnerable to conspiratorial thinking and resistant to pandemic prevention measures (Romer & Jamieson, 2021a).
“People with those conspiratorial thinking tendencies are also immersed in a media environment in which alternative views are simply not entertained—and if they are, they are debunked as fake news,” Romer told the Observer. “So, the problem is not simply one of correcting misinformation or overcoming the echo chamber that believers in conspiracy theories are enmeshed in. Simply asking people to play games is not going to work for those folks.”
Van der Linden acknowledged that the apps only reach people who enjoy playing games. But his team has been experimenting with alternative prebunking approaches, including collaborating with Google to create some prebunking videos in the form of YouTube advertisements. Initial tests of those ads have shown them to be effective in helping viewers learn to spot manipulative content (Roozenbeek & van der Linden, 2021).
“That’s the latest way in which we’re creating a multitude of ‘virtual needles’ to increase uptake and hopefully reach herd immunity someday,” he said.
Scott Sleek is a freelance writer in Silver Spring, Maryland.
This article was initially published in the print edition of the May/June 2022 Observer under the title, “Vaccinating Against Bunk.”
Related research
Basol, M., Roozenbeek, J., Berriche, M., Uenal, F., McClanahan, W. P., & van der Linden, S. (2021). Toward psychological herd immunity: Cross-cultural evidence for two prebunking interventions against COVID-19 misinformation. Big Data & Society, 8(1).
Brewer, N. T., Chapman, G. B., Rothman, A. J., Leask, J., & Kempe, A. (2017). Increasing vaccination: Putting psychological science into action. Psychological Science in the Public Interest, 18(3), 149–207. https://doi.org/10.1177/1529100618760521
Ecker, U. K. H., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., Kendeou, P., Vraga, E. K., & Amazeen, M. A. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1, 13–29. https://doi.org/10.1038/s44159-021-00006-y
Epstein Z., Berinsky A. J., Cole, R., Gully, A., Pennycook, G., & Rand, D. G. (2021). Developing an accuracy-prompt toolkit to reduce COVID-19 misinformation online. Harvard Kennedy School Misinformation Review, 2(3). https://doi.org/10.37016/mr-2020-71
Hornsey, M. J. (2020). Why facts are not enough: Understanding and managing the motivated rejection of science. Current Directions in Psychological Science, 29(6), 583–591. https://doi.org/10.1177/0963721420969364
Li, H. O., Bailey, A., Huynh, D., & Chan, J. (2020). YouTube as a source of information on COVID-19: A pandemic of misinformation? BMJ Global Health, 5(5), e002604. https://doi.org/10.1136/bmjgh-2020-002604
Pennycook, G., McPhetres, J., Zhang, Y., Lu, J. G., & Rand, D. G. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychological Science, 31(7), 770–780. https://doi.org/10.1177/0956797620939054
Pingali, C., Yankey, D., Elam-Evans, L. D., Markowitz, L. E., Williams, C. L., Fredua, B., McNamara, L. A., Stokley, S., & Singleton, J. A. (2021). National, regional, state, and selected local area vaccination coverage among adolescents aged 13–17 years. Morbidity and Mortality Weekly Report, 70(35), 1183–1190. http://dx.doi.org/10.15585/mmwr.mm7035a1
Rizzo, C., Rezza, G., & Ricciardi, W. (2018). Strategies in recommending influenza vaccination in Europe and US. Human Vaccines & Immunotherapeutics, 14(3), 693–698. https://doi.org/10.1080/21645515.2017.1367463
Romer, D., & Jamieson, K. H. (2021a). Conspiratorial thinking, selective exposure to conservative media, and response to COVID in the US. Social Science & Medicine, 291, Article 114480. https://doi.org/10.1016/j.socscimed.2021.114480
Romer, D., & Jamieson, K. H. (2021b). Patterns of media use, strength of belief in COVID-19 conspiracy theories, and the prevention of COVID-19 from March to July 2020 in the United States: Survey study. Journal of Medical Internet Research, 23(4), Article e25215. https://doi.org/10.2196/25215
Roozenbeek, J., Freeman, A. L. J., & van der Linden, S. (2021). How accurate are accuracy-nudge interventions? A preregistered direct replication of Pennycook et al. Psychological Science, 32(7), 1169–1178. https://doi.org/10.1177/09567976211024535
Roozenbeek, J., & van der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5, Article 65. https://doi.org/10.1057/s41599-019-0279-9
Roozenbeek, J., & van der Linden, S. (2020). Breaking Harmony Square: A game that “inoculates” against political misinformation. Harvard Kennedy School Misinformation Review, 1(8). https://doi.org/10.37016/mr-2020-47
Roozenbeek, J., & van der Linden, S. (2021). Inoculation theory and misinformation. NATO Strategic Communications Centre of Excellence. https://stratcomcoe.org/publications/inoculation-theory-and-misinformation/217