Rome may not have been built in a day, but a preconference workshop in Atlanta made significant headway on a difficult subject by providing a comprehensive look at Institutional Review Boards (IRBs) and human subject regulations in behavioral and social science research. APS, along with the Working Group on Human Research Protections in Behavioral and Social Sciences, staged the daylong, pre-convention workshop. Hosted by APS Fellow and Charter Member Felice Levine, the workshop was geared toward researchers seeking to explore issues of human subject protections as applied to psychological science research.
Levine, who chairs the working group and is executive director of the American Educational Research Association, noted that in addition to education, she and the working group had another goal: “We hope that one of our takeaway messages is that the more social and behavioral scientists on IRBs, the more we can develop greater sophistication for human research protections.”
AN HISTORICAL PERSPECTIVE
Joan Sieber, California State University-Hayward, reviewed the history of human subject protections and how they have evolved. She went back as far as the year 1300 to show how regulations never meant to affect research can have a negative impact. “In 1300, Pope Boniface VIII issued his famous order against cutting up dead bodies, as knights would boil the bones of their comrades and ship them home to avoid burial on heathen ground. This ban was then generalized to include all human dissection. The ban … became very firmly rooted. The human body and mind were not considered an appropriate domain of science.” With the Age of Enlightenment came a departure from this mindset, and Sieber explained that experimenting on people became the norm, citing Pasteur’s rabies vaccine and Jenner’s smallpox vaccine as examples.
Sieber said many researchers believe that human subject protections began with the Nuremberg Code in 1949, which stated that the benefits of research must outweigh the risks, there must be voluntary consent, and the patient must be able to terminate treatment at will. But Sieber disagrees on the code’s place in history. The code, she concludes, had “zero impact on American scientists.” She cited the infamous Willowbrook and Tuskegee studies as examples of experiments that violated the code. “An important reason to study history is to gain perspective … and to help us from repeating the mistakes of the past.”
THE REGULATORY PERSPECTIVE
George Pospisil spoke on behalf of the federal Office for Human Research Protections. Pospisil, who works in the education division of OHRP, said that “No one, from a university VP to an IRB member, should be involved in human research without having first read the Belmont report,” referring to the eight-page report issued in 1979 by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The report laid out three ethical principles to be heeded when conducting research with human subjects: respect for persons, beneficence, and justice.
Looking specifically at the issue of risk in behavioral and social science research, Pospisil said, “social and psychological risks are real, and IRBs have a responsibility to deal with those studies.” But, he added, “risks must be minimal, and reasonable,” acknowledging the difficulties in identifying risks in psychology research. “Social and psychological research risks are very different than physical risks … there is very little empirical data on the likelihood of risk in psychological research. But risk can range from simple embarrassment, to some serious risks, such as emotional distress, psychological trauma, invasion of privacy, loss of social status.”
To illustrate his point, he used an example that came out of SUNY-Albany. In a psychology research experiment, students were asked questions about normal upbringing, and four of the first 10 participants went to the infirmary with severe emotional distress. Pospisil explained, “The study was suspended, went back to the IRB, consultants were brought in to work with the PIs. … It just shows you how easy it is, and how difficult it is to know if there will be harm.”
He also asserted that the Common Rule, the name given to the section of the Code of Federal Regulations that governs human subject research (45 CFR 46), is more appropriate for the review of behavioral and social science research than it might appear. “The Common Rule provides sufficient flexibility for IRBs to efficiently and effectively review non-biomedical research.” In particular, Pospisil noted the provisions in the rule for exempted research, expedited review, and waiver of consent or waiver of documentation of consent.
“I’m going to give you two concepts, and we have to work between the two concepts. I pointed out the federal regulations. Are they the ceiling?” he asked. “No, they’re the floor. Those institutions that have gotten into trouble … got the concept wrong – they were reaching for the floor. If you’re reaching for the floor, you’re not gonna make it.”
The Common Rule allows flexibility in review, such as expedited review or exemption from review, but these options must be understood if they are to be used properly.
“Exempt research is not defined by the principal investigator, it does not have to be defined by the IRB. It should be defined by a third party [e.g. a department head].” Pospisil added that while certain research is exempt from review, it is not exempt from ethical principles or institution policies. Moving from “exempt” to “expedited,” Pospisil made a very clear point, saying “expedited review does not mean review light.”
Appropriate to the workshop’s Atlanta location, Pospisil quoted a famous Georgian, President Jimmy Carter: “Our commitment to human rights must be absolute. The powerful must not persecute the weak, and human dignity must be enhanced.”
“That encapsulates everything the Belmont report said, and that’s what we should be talking about when we talk about reaching the ceiling,” said Pospisil.
APS Fellow Louis Penner, Wayne State University, addressed some of the issues surrounding IRB review of psychology protocols, especially the ambiguity surrounding informed consent. “If there’s one thing I could change about the IRB process across the US, it would be the homogeneity in how it’s done,” he said. One major issue is how the subject should be treated, he said. “The subject needs to know that it’s research. They need to be told the purpose of the research, but not necessarily the hypothesis.” He added that subjects also need to know how long they will be expected to participate, as well as what is going to actually happen to them.
Potential risks or discomforts must be disclosed to the subject. Penner was clear that this encompasses more than physical risk. Risk could be emotional, psychological, legal, or even fiscal. IRBs have the right to weigh any of these elements. Echoing Pospisil’s earlier sentiment, Penner reiterated that psychological risks are indeed real, and risk is not the sole domain of biomedical research.
“You have an obligation to tell a person what’s going to happen to them, what they can expect, and what the reasonable risks are. We need to not only disclose physical risks, but any emotional, psychological, or legal risk or discomfort they may experience.”
While subjects must be informed of potential risks or discomfort, Penner said there is a flip side to that coin. “Participants need to know potential benefits. If there are none, say so. It is not necessary to provide a benefit to the patient.” He went on to add that in much research, the benefits are aimed at society and science, not necessarily the individual.
Penner noted some common problems with informed consent documents. He said the information should be written at a sixth- to eighth-grade reading level, and that the documents often need to contain more information than they currently do. Often, said Penner, not all of the procedures are included, the risks are not appropriately described, the benefits are overstated, and finally, the subject is not told that they are participating in research.
PREPARATION OF PROTOCOLS
Barbara Spellman, from the University of Virginia, spoke about the preparation of protocols and what to expect when going before an IRB. “Keep in mind the ethical principles. I’ve seen an IRB protocol that asks the question, what is the relationship between the researcher and the participants? You can get some pretty funny answers to this!” Spellman noted the red flags to watch for before taking a protocol to an IRB.
“Are there any financial things going on? Any coercive possibilities? Keep in mind that the readership of the protocols is the membership of the IRB.”
Spellman, who currently serves as APS Secretary and holds both a PhD and a JD, encouraged researchers not to be intimidated by the IRB, or to be afraid of using scientific lingo. “Make sure your IRB isn’t afraid of the word ‘experiment.’ We’re not Doctor Frankensteins. Experiments don’t necessarily hurt people. There are ways to do experiments that pose only minimal risk. Make sure people on your IRB understand – just because it’s called an experiment doesn’t mean it’s some kind of scary thing.” On the subject of language, Spellman recommended keeping it simple as well as scientific.
“You need to use real language. You need to explain.” Efficiency is also desirable when bringing a protocol before an IRB. “You can get a number of experiments on one protocol, if you define in advance the dimension of a variable or the set of parameters under which you are going to operate.”
Spellman also tackled the issue of disclosure. “The rules about disclosure, the rules about deception, they were all really written with a biomedical slant. To them, deception is a dirty word.” But, she noted, the disclosure requirement in the Common Rule, which falls under the section on informed consent, can be waived under the following circumstances: if there is minimal risk, if the waiver will not adversely affect the rights and welfare of the subject, if it is necessary for the research, and if subjects are subsequently debriefed when appropriate.
Spellman offered examples to illustrate when deception would be necessary in a psychological experiment, such as when one is studying mood manipulation, or performing surprise recall tests. But investigators must pose questions to themselves, such as “Will the nondisclosure/deception affect the rights and welfare of the patient?” or “Is subsequent debriefing appropriate for all information?”
“There is flexibility in your IRB,” she concluded.
For more information on this conference, visit www.aera.net/humansubjects