Government Relations

Science in Service: Providing Behavioral Advice During a Pandemic

Science in Service highlights psychological scientists who work in government or apply their research to policymaking. Would you be a good fit for this column? Write [email protected].

On his last day in office, Francis Collins, the retiring director of the U.S. National Institutes of Health, said that the United States had “underinvested in research on human behavior” in relation to the COVID-19 pandemic. For many social scientists, it was obvious from the outset that social science would, in some respects, be as important to the pandemic’s outcome as the medical sciences. Reducing infections requires rapid behavioral change and, hence, efficient communication; increasing vaccinations requires combating vaccine hesitancy and misinformation; and managing tensions during a massive crisis requires dealing with fatigue, polarization, and discontent.

At the same time, Collins’s remarks sparked debate among social scientists about whether their sciences really had anything to offer. This was reminiscent of exchanges early in the pandemic, when some researchers pointed to literature within the social sciences as relevant to the pandemic while others argued that the state of social scientific knowledge was too uncertain and ridden with replicability issues to be useful for decision-makers.   

I am a Danish political science professor, and the pandemic turned my professional life upside down. In March 2020, I took on the task of providing scientific advice to Danish decision-makers regarding the behavioral aspects of the pandemic, initially on an ad hoc basis and later as a member of the Danish government’s and health authority’s key scientific advisory groups. Further, I have been directing the largest Danish research project on the societal aspects of the pandemic, which also made me one of the most frequently consulted experts on the pandemic in Danish media. Between March 2020 and December 2021, I had more than 2,000 unique media appearances.

So, for better or worse, I have had to balance the uncertainty inherent in many social science studies and decision-makers’ need to act with my clear obligation—as a publicly funded researcher—to contribute to both decision-making and public understanding. On the basis of my experiences over the last 2 years, I offer three principles for navigating this dilemma when engaging in scientific advising:   

1. Focus on decision-makers’ mental models.

Decision-makers’ mental models of people matter hugely. If decision-makers think of citizens as panic-prone, they will downplay dangers. If decision-makers think of citizens as ignorant, they will downplay complexities.  

Elinor Ostrom forcefully made this point in the 1990s, in her criticism of rational choice theory. She wrote that the theory helped facilitate an understanding among decision-makers that citizens are cynics who don’t trust each other, and that this understanding became a self-fulfilling prophecy because of the policies such a mental model gave rise to. In the 2020s, my own concern is less with rational choice theory and more with behavioral economics’ emphasis on biases, which indeed could lead decision-makers to think of citizens as both panic-prone and ignorant.  

Accordingly, my focus in providing advice has been less on concrete findings from specific studies and more on shaping decision-makers’ overarching mental models: How should they think about their audience when making and communicating decisions? These mental models should be grounded in not just a few studies but a whole line of research and should, on average, be less prone to replicability issues.  

In particular, I have found it useful to consistently refer to research on crisis behavior that speaks against the notion of a panic-prone public, to collective-action theory that encourages the prioritization of trust, to protection-motivation theory on the importance of empowering people rather than simply speaking to their fears, and to procedural-fairness research on the importance of impartiality in decision-making and prioritizing the transparent sharing of information, whether good or bad.  

This focus on the mental models of decision-makers has opened my eyes to a neglected research topic. We know a lot about laypeople’s understanding of and assumptions about human nature. But what are decision-makers’ mental models of human nature, and how are these models shaped by widely covered research—such as, for example, behavioral economics?  

2. Focus on blind spots.

When providing scientific advice in a health crisis, it is natural to focus on health aspects and to consider how the social sciences can support policies in this regard. For example, how should compliance with the advice of health authorities be encouraged? How can we communicate effectively about vaccines? But while policymakers and epidemiologists—in my experience—see the relevance of social sciences in this regard, you should consider yourself a broader representative of your field and ask whether there are challenges that other disciplines, including health research, may be blind to. If so, there is no one but you to bring them to the table. For example, it is clear from a social science perspective that many interventions against disease spread may fuel political discontent and have unintended negative effects on people’s well-being. In Denmark, reliance on interdisciplinary scientific advice, for example, led to the recommendation of a pandemic management strategy that not only focused on infection spread but explicitly tried to balance four factors: infections, the economy, well-being, and the democratic rights of citizens.

3. Focus on data that help identify problems.

Decision-makers around the world have focused on data relating to cases, deaths, and emerging variants. What the social sciences have to offer in this regard is data on the behavioral antecedents of infections, for example, mobility and survey data. Are people motivated to comply with advice, do they worry about the spread of the virus, and do they feel capable of protecting themselves and others? In addition, the project that I am directing has been collecting survey data on fatigue, well-being, trust in the authorities and the government, and public support for different interventions. We have shared descriptive analyses of these data on a weekly basis, and sometimes even more frequently, with government, authorities, the media, and the public. Although such data do not show how to solve problems, they can help identify problems such as waning compliance, trust, and support.

These data can also help—and, I believe, have helped—facilitate more balanced interventions against infection spread. If decision-makers receive evidence that citizens are motivated to comply, they can opt for softer recommendations rather than stricter policies. In this way, social science data can move pandemic management more toward a coproduction process in which citizens’ voices—both their frustrations and their support—are channeled directly to decision-makers. Such data can also steer public debate in more productive directions, as those who engage in the debates do not need to assume (sometimes self-servingly) what the majority standpoint is.

Of course, such data require significant resources. First, the key to providing valuable data in an ever-changing crisis is continuous collection, week after week, with rapid analysis and sharing. Second, it requires the prioritization of data quality with regard to representativeness. For example, estimating public support for a set of restrictions often requires better data than the more common research task of deriving a generalizable estimate of the association between two variables.

Just as decision-makers need to reveal uncertainties when communicating with the public, scientific advisors need to be acutely aware of uncertainties surrounding their advice when communicating with decision-makers. In this regard, I believe that people who doubt the utility of social science research during the pandemic have rightly pointed to the uncertainties that surround many concrete studies. These uncertainties relate to issues raised during the replicability crisis and to the inherent difficulty of knowing whether findings generalize across time and space. Again, however, this does not make the social sciences useless.

The social sciences might have less to offer than the health sciences on a tactical level during a pandemic, but I believe they have a lot to offer on the strategic level. Although the social sciences may not have concrete recipes for solutions to behavioral problems such as vaccine hesitancy (unlike recipes for creating the actual vaccines), they can strengthen pandemic management by broadening the set of factors that decision-makers consider, removing incorrect and unhelpful assumptions about public behavior among decision-makers, and providing evidence-based assessments of the magnitude of problems facing decision-makers. Indeed, this may be all that we—as citizens—would like from social science advisors. After all, almost all concrete policy decisions during a pandemic involve political trade-offs that are outside of scientists’ purview. Should vaccine mandates be introduced? Should we close schools or shops? Facing such political questions, the best social science can offer is to sharpen decision-makers’ understanding of the relevant trade-offs. By doing so, social science will make those decisions better.
