What Happens When Research Yields Unpopular Findings
Throughout history, scientists have found themselves the subject of scorn, slander, ridicule and even violence when their discoveries have failed to mesh with authoritative doctrine or public sentiments. When an ancient Muslim cleric was offended by Persian doctor Rhazes’s book on medicine, he had the man beaten with his own manuscript until he was blind. After Galileo’s telescope challenged the belief that the sun orbited the earth, the Holy Office of the Inquisition accused the astronomer of heresy and sentenced him to house arrest.
Today, most scientists are able to report their findings without worrying about draconian sentences from the state. But they still face the enmity of people who simply don’t believe the empirical results or who have a vested interest in the status quo. Individual ideologues, interest groups, industry lobbies, social networks, and even policymakers freely lash out at researchers whose work threatens their belief systems or their livelihoods.
These attacks are not new, but modern communications technologies have given science deniers far more potent tools to blast everything from climate science to vaccines. In addition to harassing phone calls and letters, they now can pummel researchers with hostile emails, or assail their integrity on blogs and other social media tools — all in relative anonymity. And in addition to questioning the validity of the science, the critics often resort to personal attacks on the scholars as a way to discredit the data.
Psychological researchers have in no way been immune to these tactics. In fact, some have weathered frightening vitriol and threats to their reputations. Back in 1975, US Sen. William Proxmire bestowed the first of his infamous “Golden Fleece” awards on a small federal grant given to APS William James Fellows Elaine C. Hatfield of the University of Hawaii and Ellen S. Berscheid of the University of Minnesota. Proxmire denounced their study on social justice and equity in romantic relationships as a waste of taxpayers’ dollars. The publicity generated threatening letters and phone calls to both scientists, and their federal funding dried up because of the stigma.
In the 1990s, renowned memory researcher and APS Past President Elizabeth F. Loftus, at the University of California, Irvine, drew considerable hostility when her studies challenged people’s claims that they had uncovered — often with the help of therapists — repressed memories of abuse, molestation, and even alien abduction. Loftus even had to have armed guards accompany her to lectures after she received death threats.
Conspiracies and Denials
Stephan Lewandowsky at the University of Bristol, in the United Kingdom, and University of Western Australia, has been among the most recent psychological scientists to be targeted — oddly, for his studies on the very psychological variables that lead to people’s acceptance or rejection of science. In a high-profile paper titled “NASA Faked the Moon Landings — Therefore (Climate) Science Is a Hoax: An Anatomy of the Motivated Rejection of Science,” published earlier this year in Psychological Science, Lewandowsky detailed his research suggesting that people who reject climate science also tend to believe in assorted conspiracy theories, such as the 1969 lunar landing being a hoax and AIDS being a disease unleashed by the government.
Lewandowsky’s study involved questioning people who write and read blogs related to global warming. He surveyed the individuals about their views on climate science, other scientific propositions, and their environmental leanings; their perceptions of what scientific “consensus” means; their beliefs about free-market economics; and finally, their views on a number of well-known conspiracy theories ranging from fears of a World Government (a right-wing idea) to the belief that 9/11 was an “inside job” (typically embraced on the left).
In examining the results, Lewandowsky found that those who support unrestricted capitalism were much more likely to strongly reject climate science — probably, he surmises, because it portends regulations on the marketplace. But he also found that free market advocates were more likely to reject other established scientific findings, even the (undisputed) facts that smoking causes lung cancer and HIV causes AIDS. They also believed in theories unrelated to the environment, such as NASA staging the moon landing or the CIA having killed Martin Luther King, Jr. Lewandowsky concluded that some people have a cognitive style that leans toward beliefs in conspiracies, and this makes them prone to reject scientific facts.
His study prompted a flood of denunciation, primarily from people who deny that humans are the major cause of climatic changes, or who deny that the climate is changing at all. The detractors described the research as malicious, incompetent, unscientific, agenda-driven, and unethical. Some even called for the journal to retract the article pending an investigation into Lewandowsky’s conduct. The journal, and Lewandowsky’s university, stood behind the study. The critics were invited to submit a commentary for publication in Psychological Science, but never acted on that invitation. Lewandowsky replicated his study with a large representative sample of the US population. The peer-reviewed study, with a virtually identical outcome, recently appeared in PLOS ONE (Lewandowsky, Gignac, & Oberauer, 2013).
Suppression vs. Denial
Sometimes scientists find themselves under attack because their work is viewed as threatening to the vested interests of the very subjects they’re studying. Lisa Lit, a University of California, Davis, scientist trained in both experimental psychology and genetics, experienced this in relation to her work with drug- and explosives-sniffing dogs. In a study, results of which were published in Animal Cognition, Lit and her research team found that dogs’ performance was swayed by subtle, unintentional cues from their handlers. The dog-handler teams erroneously “alerted,” identifying a scent when none was present, more than 200 times — particularly when the handler believed a scent was present. Lit’s research carries big implications for criminal prosecutions. The idea that detection dogs are essentially responding to their handlers, rather than truly sniffing out explosives or drugs, allows defendants to argue infringement on their Fourth Amendment rights against unlawful searches.
The setting for the study was a church — selected because it was unlikely to have ever contained either explosives or drugs. The researchers created four separate rooms for the dogs to examine or “clear.” The handlers were told that there might be up to three of their target scents in each room, and that there would be a piece of red construction paper in two of the rooms that identified the location of the target scent. However, there were no target scents — explosives or drugs — placed in any of the rooms.
Each room represented a different experimental condition or scenario: in one, a piece of red construction paper was taped to a cabinet; another had decoy scents — two sausages and two tennis balls hidden together. Another condition had red construction paper placed at the location of the hidden decoy scents, and another had nothing at all. The dog-handler teams conducted two separate five-minute searches of each room. When handlers believed their dogs had alerted — indicated a target scent — an observer recorded the location indicated by handlers. Search orders were counterbalanced; that is, teams searched the rooms in differing orders.
Although the dogs should have identified nothing in the rooms, they delivered alerts in all of them. Moreover, there were more alerts at the locations indicated by construction paper than at either of the locations containing just the decoy scents or at any other locations. That is significant, Lit said, because there were more alerts in target locations cued by human suggestion — the construction paper — than at locations of increased dog interest — the hidden sausage and tennis balls. There were also alerts on a wide variety of other locations, indicating that the dogs were not simply alerting in the same locations where other dogs had done so.
Lit, who was previously a detection-dog handler, said the study should be replicated with dog teams being videotaped to carefully assess hidden cues that handlers might be giving. Lit said many people in the working-dog industry had discouraged her from publishing her findings. Once they did appear in the journal, she learned through sources that people were trying to shut down her research. She was accused of trying to stop the use of detection dogs, which she says is patently untrue.
“Our goal is only to optimize performance and provide evidence for what strategies can optimize performance in detection dogs,” she said.
She added that many dog training operations told her they were aware that her findings uncovered a real problem in canine detection, but that their own methods were designed to protect against those unconscious cues.
“My standard response is, ‘Great, let’s collect the data looking at your method. This would provide valuable evidence for the industry.’”
Psychological scientists find themselves particularly subject to intimidation campaigns in the legal arena. Loftus’s memory studies involving tens of thousands of subjects are classics, and have been replicated hundreds of times. She has forever toppled the concept of memory as an accurate recorder of life experience. Over the past 40 years, she has demonstrated that eyewitness testimony is often unreliable, and that false memories can be triggered in individuals merely through the power of suggestion.
But her research pursuits didn’t come without peril. She endured numerous attacks on her credibility. Prosecutors excoriated her when they lost cases after she testified that eyewitness accounts were fallible. Alleged abuse victims who claimed they had recovered memories of the trauma scorned Loftus when she testified for the people they accused. Clinical psychologists ostracized her. A stranger on a plane once slapped her with a newspaper.
More recently, APS Fellow Saul Kassin of Williams College, who studies the factors that lead criminal suspects to make false confessions, found himself the focus of intimidating emails and disparaging blog posts. In Kassin’s case, the criticism was aimed not at his research per se, but at the application of it. The trouble began after Kassin authored a paper titled “Why Confessions Trump Innocence,” published in 2012 in American Psychologist. In that paper, he argued that a false confession can cascade into other evidence.
“Studies (as well as real-life cases in the United States) also specifically show that the presence of a confession, because it creates a strong belief, can contaminate latent fingerprint judgments, eyewitness identifications, and interpretations of other types of evidence,” he wrote.
But what particularly inflamed the blogosphere was Kassin’s use of a headline-grabbing example — the case of Amanda Knox, an American college student who was convicted of murder. Kassin had provided a pro bono analysis of Knox’s case in her appeal to the Italian court, recommending that her confession be treated with caution. He noted that Knox had been immediately identified as a suspect and presumed guilty, confessed after three days of denials and interrogations, and did not have any attorney present when undergoing questioning. In addition, Kassin pointed out, her statements were not recorded.
“I used it as an example, not realizing the depth of a couple of Amanda Knox hate groups that track professionals who support Amanda Knox,” he said.
Kassin said the hate emails he received, and the blog posts criticizing him, didn’t focus on the science itself, but on his motives for analyzing Knox’s case. In essence, the attacks were personal. Some of the messages he received felt threatening, he said, and included statements such as: “We know where you work.” A few bloggers also wrote posts lambasting Kassin’s integrity, in one case even calling him a “shill.”
Scientists who have been subjected to these tactics say universities, journal editors, professional organizations and others need to support scholars who face these threats to their academic work. In an article on the Observer web page this month, Lewandowsky, Loftus, and other researchers raise particular concerns about the tactics used to intimidate journal editors to keep them from publishing articles. Lewandowsky says that over the past year, his work has been subject to numerous requests for correspondence and other documents under freedom-of-information laws, which apply to public universities. He and his co-authors also warn about critics who have no experience in psychological research injecting themselves into the peer-review process — in some cases sending bullying emails to journal editors — “to prevent the publication of findings they deem inconvenient.”
“Knowledge of the common techniques by which scientists are attacked, irrespective of their discipline and research area, is essential so that institutions can support their work against attempts to thwart their academic freedom,” they write. “This information is also essential to enable lawmakers to improve the balance between academic freedom and confidentiality of peer review on the one hand, and the public’s right to access information on the other. Finally, this knowledge is particularly important for journal editors and professional organizations to muster the required resilience against illegitimate insertions into the scientific process.”
Kassin said that scientists also need support because opponents can be particularly powerful or influential, and often obscure or anonymous.
“Part of what is unnerving about harassment like this is you don’t know who your adversary is,” he said. “The people who come at us are nameless and faceless, and sometimes they have resources. It’s hard to fight that.”
References and Further Reading
Lewandowsky, S., Oberauer, K., & Gignac, G. E. (2013). NASA faked the moon landing — Therefore (climate) science is a hoax: An anatomy of the motivated rejection of science. Psychological Science, 24, 622–633.
Lewandowsky, S., Gignac, G. E., & Oberauer, K. (2013). The role of conspiracist ideation and worldviews in predicting rejection of science. PLoS ONE, 8(10), e75637. doi:10.1371/journal.pone.0075637
Loftus, E. F. (2003). On science under legal assault. Daedalus, 132, 84–86.
Lit, L., Schweitzer, J. B., & Oberbauer, A. M. (2011). Handler beliefs affect scent detection dog outcomes. Animal Cognition, 14, 387–394.
Kassin, S. (2012). Why confessions trump innocence. American Psychologist, 67, 431–445.