How Scientists Are Blocking Bias in the World at Large

Implicit bias, and the subtle systemic and interpersonal discrimination it feeds, is meeting its match: Psychological scientists working with diverse populations, including police departments, patients, and policymakers, are identifying and fostering new ways to combat latent forms of prejudice. Three researchers shared their own work in this area during a special event, “Real-World Implications of Implicit Bias,” at the 2017 APS Annual Convention in Boston.

In Policing

Modupe Akinola, a psychological scientist at Columbia Business School who studies the effects of stress on performance, said that scientists’ increasing examination of racial bias in law enforcement tends to overlook the role that physiological responses play in high-risk policing.

In a study of 87 police officers in a Massachusetts police department, she and APS Past Board Member Wendy Berry Mendes (University of California, San Francisco) examined the body’s reaction to high-pressure situations. By measuring participants’ cortisol levels before and after a stressful role-play scenario, the researchers were able to draw conclusions about how stress affected officers’ decision-making about whether or not to shoot armed or unarmed White or Black targets.

The results were intriguing: “As officers’ stress levels increased, they were more accurate in their decisions regarding Black targets in the computerized shooting simulation,” Akinola said. “We did not see that relationship with White targets. These officers had a particular competency with Black targets and did not make the same errors of shooting unarmed Black targets frequently seen in the media.”

When she and her colleague questioned members of the department about the findings, the officers said they were unsurprised: They had received extensive training about diversity and ways to combat racial bias. Their responses highlighted the importance of such training, Akinola concluded.

“If you change the culture and climate of these departments, this can help reduce bias. Additionally, if bodily stress responses can affect decision making under stress, this opens up new avenues to intervene and improve decision making,” she said. “One of these avenues is stress management training.”

Akinola is a member of a monitoring team appointed by a federal judge to oversee the reform efforts in the Cleveland Police Department. The goal of the initiative, a result of a consent decree between the Department of Justice and the City of Cleveland, is to tap into the knowledge of a range of experts, including social and clinical psychologists, to make a number of fundamental changes to the department’s policies, practices, procedures, training, use of data, and more.

In particular, Akinola and her colleagues are helping the department develop systems to collect, track, and analyze data more efficiently and effectively. “I can’t tell you how important this is,” she said. “For instance, few police departments across the country track their stop, search, and arrest data by race and by gender. Without these data, it is impossible to know if all citizens are being treated equally and fairly.”

At least as important, Akinola said, is making cultural changes within their departments: “Cultural beliefs, practices, policies, and norms within police departments can sometimes perpetuate racial disparities in treatment,” she said. “This is also where a lot of the change needs to be made, and where social and behavioral scientists can contribute valuable insights.”

Akinola stressed the importance of social scientists engaging in police reform efforts to help bring the interventions tested in the lab into the field in an effort to improve cross-race interactions within police departments and between police and the community.

“Integrating social psychological research with practice in the world of policing will have significant effects in terms of reducing implicit bias,” Akinola noted.

For Medical Care

APS Fellow Louis A. Penner, Wayne State University, Karmanos Cancer Institute, and University of Michigan Research Center for Group Dynamics, takes a microlevel approach to examining implicit bias by zooming in on racially discordant medical interactions.

Penner opened with a sobering example of why this line of study is so critical. He noted that, despite there being no significant difference in the incidence of breast cancer in Black versus White women, Black women are much more likely to die from it than are White women.

Conventional wisdom suggests that this kind of disparity has its roots in genetic factors or socioeconomic status — if White women seeking treatment are, on average, wealthier than their Black counterparts, they may be receiving better health care. A 2003 Institute of Medicine report on health care, however, suggested that another cause was unequal treatment based on patients’ race: In clinical settings, Blacks were not treated as well as Whites, even when their health insurance was equivalent.

One factor contributing to this unequal treatment, Penner said, is that more than 80% of Black patients’ clinical interactions are racially discordant, meaning their provider is not Black. He reported that researchers have found that these interactions are shorter; less positive, productive, and informative; less patient-centered; less engaged; and less satisfying to patients. Penner argued that one reason for this was physicians’ race-related attitudes.

“When physicians and patients enter a clinical interaction, they do not do so tabula rasa,” he noted. “They both bring with them race-related attitudes.” Penner focused, however, on physician racial bias.

While physician explicit bias tends to be relatively low, Penner said, implicit bias is often high; in addition, “even if there is explicit bias, these expressions of conscious race-related attitudes are probably well-controlled” due to physicians’ training. Automatic, nonconscious implicit bias, however, may be much more difficult to manage, and such bias can affect the treatment Black patients receive, both immediate and long-term.

To examine this phenomenon, Penner and his colleagues examined the effects of race-related attitudes in racially discordant primary-care interactions, using a sample of 156 Black patients and 18 non-Black physicians in a primary-care clinic. Before the
patient–physician interaction, the researchers measured physicians’ implicit and explicit bias using the Implicit Association Test and patients’ perceived past discriminatory experiences. Afterward, they measured patient satisfaction and trust and asked Black and White observers to rate thin slices of physician affect from videos. Penner, who is White, was shocked to discover how differently he and the Black participants viewed the encounter.

“These are highly scripted, 15-minute interactions, in which I, I am embarrassed to tell you, see nothing in the physicians’ behavior,” he said. “Yet in those 15-minute interactions, the Black patients … were picking up on their physicians’ implicit racial bias and reacting negatively. They thought the higher-bias physicians cared less about them, and they trusted these physicians less.”

In another study, Penner again delved into the effects of implicit physician bias on Black patients, this time with non-Black oncologists. He and his colleagues used a methodology similar to that of the first study, with a sample of 114 Black oncology patients and 18 non-Black oncologists, and found similar results: Oncologists who scored higher on implicit bias had shorter interactions with patients, were rated as less supportive by blind observers, and received fewer visits from patients.

Importantly, Penner said, patients who interacted with higher-implicit-bias physicians reported less confidence in recommended treatments and expected more difficulty completing them: “Implicit bias not only affects how patients feel about the doctors, but how they feel about the treatments that the physicians have recommended.”

Penner outlined several potential solutions that could be implemented with the help of psychological scientists. Clinicians, he said, should be encouraged to individuate their patients and communicate with them, during both initial consultations and follow-ups, and patients should be given strategies for asking more relevant questions about their treatment. Both approaches should reduce the effects of physician bias. On a system-wide level, he suggested organizations aggregate data from their health care systems to gain awareness of existing racial disparities and work to standardize treatment of Black and non-Black patients.

On the Job

APS Fellow Naomi Ellemers, Utrecht University, is looking at discrimination within another system: the workplace. Ellemers, whose work focuses on social identities and relations between groups, has found that people must believe that discrimination exists on a systemic level before they are willing to change it.

“People experience some sense of loss when they realize that bias persists despite their best intentions. That has to be overcome,” she explained. “A first requirement for people in experimental studies or in real life to be motivated to change anything about the situation is that they first need to be convinced that there is even a problem with discrimination and systematic inequality.” Otherwise, she said, they assume that if someone lacks opportunities — such as access to a good school, health care, or job training — it is due to some personal failing.

In a study that explored ways to reduce implicit bias, Ellemers and colleagues found that emphasizing the moral implications of such bias helped people suppress it. Other solutions included testing participants in the presence of the target group (in this case, Muslim women); underscoring the importance of being a “good” in-group member (as rated by approval or disapproval from another in-group member); and highlighting a common identity (such as shared gender or religion).

Ellemers noted that people often have a hard time adjusting to the idea of prejudice on a structural level.

“People become very depressed and discouraged, because they say, ‘If I’m not aware that I am discriminating or that I am biased, how can I change this?’” she said. “It’s very disconcerting for people, because they feel like they cannot control that outcome.”

Ellemers has collaborated with universities and government entities to combat this phenomenon. With a group of colleagues who call themselves Athena’s Angels, she collected personal stories from women who had experienced implicit discrimination; an artist helped them communicate these stories. Athena’s Angels worked with the Dutch Minister of Education and the president of the Royal Academy of Arts & Sciences to reduce implicit bias by requiring Dutch universities to hire an additional 100 female professors and to select an extra cohort of female Royal Academy members. The initial feedback was mixed — employers were afraid they would be forced to hire underqualified candidates, and women were afraid they would be stigmatized as a result. Instead, “We got these surprised comments: ‘Oh, there are so many talented women out there, and we just [hadn’t found them before]!’” Ellemers said. “I think a lot of difference was made because we realized that we can’t just give people the information about implicit bias. We also have to work with them.”

In another context, Ellemers is helping employers overcome their unintentionally prejudicial attitudes when selecting students for vocational training. In the Netherlands, students are required to intern at a company to complete their education, but ethnic minority students often have difficulty obtaining such positions, a disparity in which implicit bias is likely to play a role. To address this, she and her colleagues developed strategies to remind hiring managers of the professional identities they share with the students (e.g., that the students hope to gain experience in the employers’ own field); they also created evidence-based guidelines for the employers to use.

All three psychological scientists stressed the fact that no one is immune to implicit bias and that such attitudes will not change overnight. The key response, they said, is to give people the tools they need to address and overcome their bias and to make those tools easily accessible to the target audience — and, if possible, the general public. As Ellemers said, just providing the information isn’t always enough.

“You have to make it nice, or fun, or give people rewards for positive behavior,” she said. “In our country this has had a huge impact.”
