Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements

Psychological Science in the Public Interest (Volume 20, Number 1)

Faces offer information that helps us navigate our social world, influencing whom we love, trust, help, and even judge as guilty of a crime. But to what extent does an individual’s face reveal the person’s emotions? And to what extent can we accurately interpret an emotion or intention from a raised eyebrow, a curled lip, or a narrowed eye? Understanding what facial movements might reveal about a person’s emotions has major consequences for how people interact with one another in the living room, the classroom, the courtroom, and even the battlefield.

In this issue of Psychological Science in the Public Interest (Volume 20, Issue 1), Lisa Feldman Barrett and coauthors Ralph Adolphs, Stacy Marsella, Aleix M. Martinez, and Seth D. Pollak examine the scientific evidence for the widespread assumption that a person’s emotions can be readily inferred from his or her facial expressions. The authors question whether the existing research is strong enough to justify the way it is increasingly being used by those who consume it (e.g., technology companies trying to figure out how to “read emotions”; schools teaching children emotional expressions; federal agents being trained to detect emotions and predict behaviors from facial movements; mental-health specialists using facial movements to diagnose and treat psychiatric disorders). They also recommend research that will yield a more valid view of how people move their faces to express emotions.

Research to date suggests that people tend to believe what the authors refer to as the common view — that certain emotions are revealed by certain facial-muscle configurations. To assess whether science supports the common view, the authors focus on the evidence concerning six emotion categories — anger, disgust, fear, happiness, sadness, and surprise — that have been the focus of much of the research on emotion. However, their report is not about whether these six emotions are the basic (or only) emotional categories but about whether a certain facial expression reveals a person’s emotional state.

Barrett and colleagues propose that a facial expression can be said to reveal something about a person’s emotional state only if four criteria are met: reliability (e.g., a scowling face occurs whenever someone is angry), specificity (e.g., a scowling face rarely occurs when someone is not angry), generalizability (i.e., patterns of reliability and specificity must hold across different studies and different populations), and validity (i.e., even when a facial expression is consistently associated with an emotion, it must be demonstrated that the person producing the expression is really in the expected emotional state). Applying these four criteria, the authors examine studies of expression production (i.e., how people actually move their faces during emotional episodes) and emotion perception (i.e., which emotions people actually infer from facial movements). They cover studies of healthy adults in developed nations, of healthy adults living in small, remote cultures, of healthy infants and children, and of congenitally blind individuals.
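The first two criteria can be read as conditional proportions: reliability is roughly how often the configuration appears during episodes of the emotion, and specificity is roughly how often the emotion is present when the configuration appears. The sketch below illustrates the arithmetic only; the counts are invented for the example and are not data from the report.

```python
# Hypothetical illustration of the reliability and specificity criteria
# as conditional proportions. All counts below are invented.

def reliability(config_and_emotion: int, emotion_total: int) -> float:
    """Proportion of emotion episodes in which the facial configuration occurs."""
    return config_and_emotion / emotion_total

def specificity(config_and_emotion: int, config_total: int) -> float:
    """Proportion of configuration occurrences that coincide with the emotion."""
    return config_and_emotion / config_total

# Invented counts: suppose a scowl was observed in 30 of 100 anger episodes,
# and scowls were observed 90 times overall across all episodes.
print(reliability(30, 100))  # 0.3
print(specificity(30, 90))   # 0.333...
```

Under the authors' framework, values this low on either measure would count against treating the configuration as a diagnostic "expression" of the emotion.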

The review of studies of facial-expression production during emotional events indicates a lack of support for the common view. The research shows, for example, that a smile might signal submission rather than happiness. Moreover, the dearth of systematic, controlled observations of facial movements and emotion in people from remote cultures limits what can be known about the conditions under which facial expressions are linked to specific emotions. Barrett and colleagues suggest that neutral phrases such as “facial configuration” or “pattern of facial movements” may be more scientifically accurate than the misleading phrases “emotional expression” or “emotional display,” given that a pattern of facial movement does not necessarily signal a specific emotion.

Studies of how people perceive emotions from facial expressions also fail to strongly support the common view. When individuals are asked to match facial expressions to emotion labels, they can do so reliably, but when they are asked to generate an emotion label from a facial expression alone, reliability is low. Moreover, recent studies in remote cultures found no facial configuration to be specific to a given emotion, even though individuals could infer some social meanings from facial expressions (e.g., Trobriand Islanders labeled the proposed facial configuration for fear as signaling intent to attack). Thus, these findings do not meet the reliability, specificity, generalizability, and validity criteria needed to establish a direct relationship between facial expressions and emotional states.

In sum, these findings reveal that facial movements commonly thought to signal particular emotions regardless of context, person, and culture are not universally diagnostic of emotional states. But because the common view shapes both research and the public understanding of emotions, new research on emotion is needed. Barrett and colleagues conclude that there is an urgent need for research examining how people actually move their faces to express emotions and other social information across the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. They propose that future research use multidisciplinary methods that include neuroscience and machine learning; include large-scale studies that bridge the lab and the world and assess facial movements, posture and gait, tone of voice, autonomic nervous system changes, and gaze, as well as contextual variables, in varied situations; support the development of computational models; and go beyond the classic views of emotion, testing innovative hypotheses about the nature of emotion.

In an accompanying article, Alan Cowen (Department of Psychology, University of California, Berkeley), Disa Sauter (Faculty of Social and Behavioural Sciences, University of Amsterdam), Jessica L. Tracy (Department of Psychology, University of British Columbia), and Dacher Keltner (Department of Psychology, University of California, Berkeley) propose a comprehensive atlas of human emotions. The researchers, who have led initiatives to create complete maps of human emotions and used methods such as modeling neural activity to explore the structure and dynamics of emotional experiences, consider there to be dozens of varieties of emotions, distinguished by language, evoked in distinct situations, and perceived in distinct expressions of the face, body, and voice. They argue that there are more than 20 emotions that are expressed not only by facial expressions but also by vocalizations and body movements, and that their expression varies culturally and depends on the situation. Thus, models that rely on only six “basic emotions” — anger, disgust, fear, happiness, sadness, and surprise — and consider only facial expressions do not capture the variability in emotional responses. In contrast, their proposed richer model of emotions can explain the various emotion-related responses (e.g., smiles of embarrassment, sympathetic vocalizations, blends of emotional expressions). They advocate for the use of large-scale statistical modeling and machine-learning methods to create a complete map of emotions and their expressions. This approach should help determine the full extent of what facial expressions, in conjunction with body and vocal expressions and contextual cues, can tell us about emotional states, Cowen and colleagues argue.


Read coverage of this report in The Washington Post, Forbes, The Verge, and the ACLU’s Free Future blog

