The neurological bases for social interaction are the focus of a growing interdisciplinary research enterprise involving psychologists and neuroscientists who are hard at work unravelling the mysteries of human social behavior. At the APS 21st Annual Convention, Kevin Ochsner of Columbia University chaired the symposium “The Neuroscience of Social Interaction” featuring findings from research into how we perceive others and how we use those perceptions to evaluate or empathize with them.
Ralph Adolphs of the California Institute of Technology explores which areas of the face we use to distinguish subtle differences in facial expressions. He devised a study using “the Bubbles method,” in which a facial image is presented “as though we were looking through a piece of cardboard into which we randomly punched a bunch of little holes.” Participants were asked to determine whether the face shown was fearful or happy. Results indicated that autistic individuals and people with amygdala lesions use different facial areas than healthy controls do to determine whether an expression is threatening: Healthy controls focus on the eyes, whereas autistic individuals and people with amygdala lesions focus on the mouth area. Interestingly, autistic individuals performed as well as controls despite this difference. Adolphs then presented an eye-tracking study indicating that autistic participants look at both the mouth and eyes, perhaps because of two competing processes: a natural tendency to look at the mouth area and social training to make eye contact.
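The core idea of the Bubbles method is to reveal a face only through randomly placed apertures and then see which revealed regions predict correct judgments. The sketch below is a minimal illustration of the masking step only, using Gaussian apertures and a random array as a stand-in for a face photograph; the aperture count, sizes, and mid-gray background are assumptions for illustration, not the published protocol.

```python
import numpy as np

def bubbles_mask(image, n_bubbles=20, sigma=8.0, rng=None):
    """Reveal `image` only through randomly placed Gaussian apertures
    ("bubbles"); everywhere else fades to the image's mean gray level."""
    rng = np.random.default_rng(rng)
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    mask = np.zeros((h, w))
    # Each bubble is a Gaussian window centered at a random pixel.
    for cy, cx in rng.integers(0, [h, w], size=(n_bubbles, 2)):
        bubble = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
        mask = np.maximum(mask, bubble)
    background = np.full_like(image, image.mean())
    # Convex blend: fully visible where mask = 1, hidden where mask = 0.
    return mask * image + (1 - mask) * background

face = np.random.default_rng(1).random((128, 128))  # stand-in for a face image
stimulus = bubbles_mask(face, n_bubbles=20, sigma=8.0, rng=0)
```

In the actual paradigm, responses are correlated with the aperture locations across many trials to map which facial regions (eyes, mouth) drive the fear/happy discrimination.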
Princeton University’s Alex Todorov presented a two-dimensional model of face evaluation derived from principal components analysis of multiple social judgments of faces. According to this model, faces are evaluated along two orthogonal dimensions: valence and power. These ratings closely align with judgments of trustworthiness and dominance: Faces high in valence are frequently judged to be trustworthy, and faces high in power are frequently judged to be dominant. Todorov suggested that these judgments result from an evolved mechanism that signals whether something should be approached or avoided: Individuals are likely to approach strangers they judge trustworthy and avoid those they judge threatening. Consistent with this hypothesis, computer modelling showed that judgments of threat are based on similarity to angry expressions.
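The analysis behind such a model can be sketched in a few lines: collect mean trait ratings for many faces, then extract the first two principal components. The snippet below uses randomly generated ratings purely as placeholder data (the trait list and matrix sizes are assumptions); with real judgment data, the first two components are what Todorov's model labels valence and power.

```python
import numpy as np

# Placeholder data: rows are faces, columns are mean trait judgments
# (e.g., trustworthy, caring, aggressive, dominant, confident, ...).
rng = np.random.default_rng(0)
n_faces, n_traits = 100, 8
ratings = rng.normal(size=(n_faces, n_traits))

# PCA via SVD of the mean-centered rating matrix.
centered = ratings - ratings.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Each face's coordinates on the first two (orthogonal) dimensions.
scores = centered @ Vt[:2].T
# Share of rating variance each dimension explains.
explained = (S ** 2 / (S ** 2).sum())[:2]
```

Because the components are orthogonal by construction, a face's position on one dimension (e.g., valence/trustworthiness) says nothing about its position on the other (power/dominance).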
Ochsner discussed empathic accuracy — the ability to accurately understand the emotions of other people. Although prior research in this area relied on trait measures (asking participants to identify others as happy or sad, etc.), “there has been a striking lack of evidence that trait measures…predict an accurate understanding” of other people’s emotional state, prompting Ochsner to turn to imaging and behavioral studies. One imaging study examining which areas of the brain are activated when experiencing pain and when witnessing others in pain found that the two sets of regions overlapped when the participant was trying to understand someone else’s pain. And a behavioral study examining whether empathic accuracy depends on both the empathy of the observer and the expressiveness of the target found that affect sharing predicted accurate understanding only for highly expressive targets and highly empathic observers. Ochsner discussed several real-world implications of this research — including the prediction of relationship outcomes and the measurement of social impairments. As Ochsner said, this research may prove true a quote from To Kill a Mockingbird: “You never really understand a person until you consider things from his point of view… until you climb into his skin and walk around in it.”