Practical Advice for Working with Institutional Review Boards
Say “eye are bee” and you’re likely to get responses ranging from heated exasperation to blissful contentment. Therein lies one of the most vexing problems of human research protection at institutions across the country: the colossal variability of institutional review boards (IRBs) and researchers’ relationships with them.
A slew of reports and commentaries on IRBs in the last several years, from such organizations as the American Association of University Professors, the University of Illinois, and the Society for the Science of Clinical Psychology, have detailed the problems researchers encounter and have called for, among other things, an overhaul of how IRBs are run. Although these reports are important to long-term improvement of IRBs and human subjects protection, there are also immediate and practical concerns: for example, how do we help the behavioral science researcher who just got her first big grant, has never met her IRB, and who’s heard nothing but bad things about it?
To that end, APS is focusing on best practices, with the assumption that the more we can collect and share concrete examples and suggestions for dealing with potentially thorny IRB issues, the better. We don’t know if, when, or how federal guidelines will change; in the meantime we want to facilitate the IRB process so that the research gets done.
In May 2007, Public Responsibility in Medicine and Research (PRIM&R) held a conference in Colorado focusing solely on social and behavioral research IRB issues. Because it was geared mostly toward social and behavioral science IRB administrators (including chairs and committee members), few research advocates attended. We at APS usually hear the researcher’s side of the IRB story, but rarely do we hear the perspective of the IRB itself. This conference gave a glimpse into that world and a chance to understand why there’s tension between researchers and IRBs.
One of the most valuable things about the conference was a panel entitled “How to Get Past No,” in which senior social and behavioral science researchers gave talks full of practical advice. The panelists included Simon Rosser, University of Minnesota; Jerry Burger, Santa Clara University; and Roxane Cohen Silver, University of California, Irvine. In this issue of the Observer, Simon Rosser provides a summary of his talk, which dealt with HIV prevention research. In subsequent issues we’ll publish a similar article by Jerry Burger, describing how he convinced his IRB to allow a replication of the infamous Milgram experiment, long deemed unethical to replicate, as well as advice by Roxane Cohen Silver on how to work with your IRB to allow risky research such as on post-disaster coping mechanisms.
Much of the advice that’s dispensed is all well and good for established researchers with successful track records. But what about young researchers who don’t yet have those credentials? The best thing to do is to get to know your IRB, preferably under the mentorship of a senior researcher who is familiar with the idiosyncrasies of the local IRB. IRB training doesn’t hurt, either; it is often provided by the IRB administrator or institutional official who oversees the IRB. Both can prove valuable in navigating the process.
A common problem for many social and behavioral IRBs is the unsupervised student who submits a poor protocol. The board members take issue with the faculty advisor, not the student, and the process becomes protracted with all the back-and-forth needed to fix the protocol. The message is a simple one: Faculty advisors need to pay closer attention to what their students submit, get involved in the research design early on, and not expect the IRB to do this for them.
Social and behavioral science research is going global, and this presents a whole host of issues that researchers and IRBs need to grapple with. One is informed consent: how to tailor it to different cultures, and how obtaining it differs for individuals in those cultures. This is handled on a case-by-case basis, as the Office for Human Research Protections (OHRP) hasn’t yet had the resources to provide solid guidance on it.
Tanya Broesch, a graduate student at Emory University who conducts cognitive psychological research in Fiji, finds that it’s well worth the time and energy to educate the IRB as to why, for certain rural populations, verbal consent is much more culturally appropriate (and thereby more likely to yield accurate data) than is written consent. The research participants find signing documents strange, Broesch says, and she is working with her IRB to include verbal consent in her protocol.
Another thing to get a head start on is approval from the local research institution. Jeff Victoroff, University of Southern California, conducts biobehavioral research in Gaza, and he advises researchers to identify a local university or research entity with an IRB to host the research (in his case, it was the Gaza Community Mental Health Programme). One should make sure that the entity has a U.S. Federal-Wide Assurance (FWA; proof that it complies with U.S. standards for ethical review). If not, you can assist the local IRB in getting an FWA by referring to the OHRP rules (http://www.hhs.gov/ohrp/assurances/assurances_index.html#international).
Once the foreign IRB has secured its FWA, you can submit your research proposal to that IRB to make sure that the project conforms with local cultural sensitivities and geographic variants of research ethics. Once approved by the local IRB, you can then submit to your own IRB. If the local foreign IRB has approved your project and has an FWA, the home institution ought to approve pro forma.
Though not new, international social and behavioral research is growing rapidly, and researchers and institutions need to work together to figure out the best way to proceed. Sandy Sanford, the Director of Research Subject Protections at George Mason University in Virginia, has overseen a number of international protocols submitted to her campus’s IRB. She says that when a Mason student or faculty member wishes to conduct research overseas, the IRB asks the researcher for the name of a contact familiar with conducting research in the proposed country, to whom it sends a list of questions (modeled after OHRP’s) and a copy of the research protocol. This satisfies the OHRP requirement that the IRB have knowledge of the local research context. When the IRB receives the answers, it then proceeds with the protocol approval. What’s key here is knowledge of local context, and the better informed the IRB is, the easier the process will be.
Mission Creep
This catchy phrase has been the subject of many papers. It means different things to different people, but all agree it’s not a good thing when it comes to IRBs. One way to stave off an IRB extending its tentacles beyond its purview is to be diligent about subject recruitment, especially with vulnerable populations (such as adolescent felons). It can get sticky trying to separate issues of protection from issues of research design, and that entanglement can lead to mission creep. Sometimes an IRB will suggest a change in methodology when a proposed form of payment is considered coercive. By the same token, IRBs need to be very mindful of advising changes in research design only when they pertain to the protection of human subjects. The best way to sort this out is to have a dialogue with your IRB.
Researchers are increasingly turning to the Web to collect data. It can be a fast, easy, and cheap way to collect and distribute data from tens of thousands of participants. Web-based data collection is also useful and less intrusive for researching sensitive topics that might otherwise prove difficult in a face-to-face interview, such as sexual behavior or underage alcohol abuse. You can target specific populations and collect longitudinal data rather than having participants pay multiple visits to your lab.
There are pitfalls, however, to this method. You must take extra steps to ensure that confidentiality is maximized — for example, making sure that cookies (small files used to track information about a user) won’t be placed on the participant’s computer. By the same token, if responses come back with IP addresses, you need to assure the participants that you’ll discard the addresses. You must also safeguard against multiple entries by the same participant. And of course, you’ll never know for sure who filled out the survey. For a specific example of Internet research design considerations in AIDS behavioral intervention research, see Pequegnat et al. (2007), Conducting Internet-Based HIV/STD Prevention Survey Research: Considerations in Design and Evaluation, AIDS and Behavior, 11, 505–521.
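The safeguards above can be sketched in a few lines of code. This is a minimal illustration, not a recommended pipeline: the field names (`ip`, `answer`) and data shapes are hypothetical, and any real study should follow its IRB-approved data-handling plan. The idea is to discard raw IP addresses while keeping only a one-way hash, used solely to flag likely duplicate submissions.

```python
import hashlib

def anonymize(responses):
    """Drop IP addresses from survey records, using a one-way hash
    of each address only to detect multiple entries by the same
    participant. Field names here are hypothetical."""
    seen = set()
    cleaned = []
    for record in responses:
        # Hash the IP so the raw address is never retained.
        token = hashlib.sha256(record["ip"].encode()).hexdigest()
        if token in seen:
            continue  # likely a repeat submission; discard it
        seen.add(token)
        # Keep every field except the identifying IP address.
        cleaned.append({k: v for k, v in record.items() if k != "ip"})
    return cleaned

data = [
    {"ip": "203.0.113.5", "answer": "yes"},
    {"ip": "203.0.113.5", "answer": "yes"},   # duplicate entry
    {"ip": "198.51.100.7", "answer": "no"},
]
print(anonymize(data))  # two records remain, neither containing an IP
```

Note that hashing an IP address is not true anonymization (addresses can be enumerated and re-identified), which is one more reason to discard even the hashes once duplicate screening is done.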
That said, there’s tremendous potential for Internet data collection, along with many uncertainties. Institutions are gradually establishing guidelines; check out Loyola University Chicago’s policy for collecting online surveys at http://www.luc.edu/ors/irbonlinesurveymenu.shtml. Investigate other institutions’ as well, and let your IRB know your ideas for a good internet usage policy.
Using the Internet for Something Else: IRB Improvement!
For the last three years, Missouri Western State University has developed and used a 100 percent online IRB system that has greatly improved its review, says Brian Cronk, professor of psychology at the university. There are commercially available systems, such as eResearch Portal, irbplus.com, and irbwise.com, but these are expensive (eResearch Portal is in the $100,000 range) and therefore out of reach for many smaller institutions that want to streamline their IRB process. Cronk worked with his university to create its online system, which he says has worked smoothly and is viewed as highly successful. In fact, the system has been proposed as the model for a national project entitled “OUR-IRB,” which would make it available at no cost. The project is currently under review at the National Science Foundation. Even if it doesn’t get funded, it’s a good idea that others may want to pursue.
Best Practices: Let’s Go Viral
As labyrinthine as IRB requirements can be, there are just as many stories of success as there are of frustration. We commend the researchers who have shared their techniques and advice for working successfully with IRBs to get social and behavioral research done, and we encourage more networking on this issue. ♦
APS Members Share IRB Experiences
From Both Sides of the Trenches
The University of Texas-Pan American
Regulations for human research protection are confusing. I state this as someone who initially dealt with IRB issues as a researcher, then as an IRB member, and now as the chair of a largely social-behavioral research IRB. Although ever-increasing personal knowledge of the regulations has allowed me to put together well-written IRB proposals for my own research and to assist other researchers at our institution in doing the same, it is apparent that interpretation of the federal regulations varies from IRB professional to IRB professional and from institution to institution. This lack of clear “rules” is exacerbated when the regulations, written largely with biomedical research in mind, are applied to social-behavioral studies. If an institution has agreed to apply the federal regulations to research conducted by its agents, then those regulations must be followed or the institution may risk loss of federal research funding. However, it is often easy to lose sight of the basic principles upon which the regulations were built. It is important for researchers and IRBs to work within the regulations and institutional policies, but to maintain focus on the principles.
There is flexibility built into the regulations. Sometimes it is not immediately obvious and sometimes IRB reviewers err too much on the side of caution (“better safe than sorry”). The federal regulations are often viewed as a minimum requirement, and many institutions have additional policies to safeguard human subjects. In my own experience, researcher frustration with the IRB is often resolved through open communication. When researchers recognize that IRB members are not purposefully creating difficulties, there is generally a meeting of the minds. Direct communication with researchers also helps to identify legitimate frustrations that can be addressed through improved education, revised submission forms, clearer (or even modified) policies, etc.
IRBs may also benefit from examining the nature of their Federal-Wide Assurance (FWA) with the Office for Human Research Protections, particularly in terms of whether it is necessary to agree to apply the federal regulations to research that is not federally funded. This may be an important consideration for IRBs that deal predominantly with social-behavioral research, in that it may be an opportunity to develop institutional policies based on the principles of the Belmont Report [a 1979 document outlining ethical principles and guidelines for the protection of human subjects of research] that are more closely tailored to the general nature of the submitted protocols.
The review process is not always a simple black and white decision; each protocol has its own unique issues that need to be considered within a regulatory framework that is itself open to interpretation. Researchers should be responsible for developing personal knowledge about human research regulations (and IRB institutional policy), knowledge that will assist them in crafting appropriate IRB submissions and navigating the process of IRB approval. For their part, IRBs must take it upon themselves to reflect on their true mission, develop policies and procedures within the regulatory framework that ease the process of IRB approval without diminishing human subjects protection, and open lines of communication with researchers.
Let’s Band Together
Texas Tech University
Boards need to be shown, by credible and trustworthy sources, how to use the great flexibility of the regulations. The investigators themselves are often dismissed as self-interested and uninformed. As a profession, I think we need to take advantage of the expertise and credibility that many psychologists have earned by being involved in human subjects protection. [We should] form a group of current and former IRB chairs (perhaps including highly experienced board members) to advise psychologists and their local IRBs on ways to resolve deadlocks (and efficiency problems, which may be the most widespread issue). Because of the dynamic nature of the regulatory environment, it is essential to include some people who currently serve. Occasionally, of course, the group might have to tell an investigator that a problem can’t be resolved. But the fact that the advice comes from fellow psychologists ought to soften the blow. A list-serv by itself may be the most efficient way to accomplish this. I think we may be able to do some real good by providing a service based on the kind of experience and expertise that will get the attention of an IRB.
I would like to add another suggestion. Our board found in a couple of cases that empirical evidence concerning the impact of procedures on subjects can be very informative and compelling. Suppose there’s a procedure or deception that a board member believes will cause significant stress: put that procedure into a scenario and (with IRB approval, of course) query subjects from the projected population about how they would feel if subjected to it. It is hard for a board member to argue that procedures or stimuli are stressful if the subjects themselves believe they are not.
The Federal Side of the Fence
Focusing on best practices is no reason to neglect possible avenues for change. One channel is the Secretary’s Advisory Committee on Human Research Protection (SACHRP), which was established in 2002 (preceded by the National Human Research Protections Advisory Committee). The Committee meets periodically to discuss human subjects protection concerns, IRBs, and anything relevant to these. It explores special population issues involving, for example, children, prisoners, and the decisionally impaired. SACHRP, which consists of 11 members, reports to the Department of Health and Human Services (DHHS), which then passes on the recommendations to OHRP. Its meetings are open to the public, so your voices can be heard through your friendly government relations specialist. That said, SACHRP is a government body after all, and there are 17 (no, that’s not a typo) federal agencies that would need to sign off on any changes, so as you can imagine, it would take a spot of time. Change does happen, just…slowly.
Working Constructively and Realistically with IRBs
B.R. Simon Rosser
University of Minnesota
Sex research is one of those areas where psychologists gain unparalleled experience in advancing sensitive research. Here are the major strategies I use to avoid potential barriers.
First, I conceptualize opposition to sex research as an incentive to become a better researcher and to be creative. Some examples may help. When we wanted to study how infection changed the risk behavior of persons living with HIV, the funding agency was unsure whether it was politically correct to fund such a study. By partnering with persons living with HIV, the funder was able to see its importance. Nationally, research on effective HIV prevention interventions targeting gay men has been hampered by conservative opposition framing such research as promoting homosexuality. So we use licensed health professionals and turn the opposition’s concerns into research questions to advance knowledge. In a state-level study, subjects reported political corruption, misuse of HIV funds, and religious intolerance in rural states as major barriers. To complete this study, we used an external IRB to overcome institutional opposition, used census data to verify expert opinions, and masked states’ identities to protect them from stereotyping.
Second, it takes courage to be bold in research; I find a healthy disrespect for one’s own career helps. For me, the decision to engage in risky research ultimately comes down to two core questions: How important is the research to advancing science/truth and to stopping the spread of HIV?
Third, for all “socially sensitive” research, my teams develop a written political protocol. This includes an assessment of how politically risky the research is, identification of the major risks, and strategies to reduce or handle each risk. Our team then brainstorms the worst questions and misinterpretations we can imagine about the study and, before the study begins, designs answers. All key stakeholders (university, health department, Congressional representatives of both parties) receive briefings. The meetings with key “reasonable” politicians are critical; we take the view that they need to know us, our work, and the information most important to them (such as the amount of federal funding our studies bring into the state) in order to be supportive.
Finally, I absolutely rely on my IRB as an external check to ensure all human subjects’ considerations are “nailed.” I view my IRB as being similar to a colleague who protects my back both from my own blind spots and external critics. Having worked with a dozen or so IRBs, two questions are worth considering: How good is our (IRB-researcher) relationship, and how good/competent is the IRB?
The key element in conducting cutting edge controversial research is having a good researcher-IRB relationship. The need to nurture that relationship is critical, but it’s such a well-kept secret, I’m not sure it’s taught. In my experience, the researcher-IRB relationship is a lot like dating or sex, with the good and bad experiences being remarkably similar. You know your relationship is healthy when you are productive and consult regularly; there’s mutual respect; you’ve memorized staff members’ names, phone numbers, and email contacts; you recognize them as an invested partner; you view them as part of your team of investigators; and when ultimately they become mentors as to how to do innovative research involving human subjects. Thanks to the University of Minnesota IRB, I am now researching new methods of consent in Web studies and dropouts during consent. I find I want to get through writing my application to get to the exciting part: the human subjects section.
In terms of competence, I divide IRBs into three kinds: good-to-great, moderate to weak, and the IRBs from hell. Here are my key characteristics distinguishing each:
Great IRBs are efficient, competent, research supportive (i.e., open to new approaches and methods); fair in their support for all areas of research; boundaried (keeping the focus on human subjects); respectful (not competing for the role of the PI by redesigning and micromanaging studies); protective (from my own blind spots and from charges that my work is not ethical); streamlined (actively thinking of ways to decrease researcher burden); approachable; and forgiving. When I have had to tell my IRB partner that I failed either my own standards or even the basic expectations in research, the staff listens and helps me think through the best thing to do in the circumstances; we usually develop a plan by phone, and then confirm that plan in writing.
Weak IRBs tend to be unfocused and disorganized. They are under-resourced: they talk about wanting to help, but can’t. They appear vulnerable to intimidation, and in their reviews the name and reputation of the researcher, institution, or funding source gains excessive importance. Ultimately, they yield weak and inconsistent reviews, the quality is mixed or uncertain, and the IRB is described as “unpredictable.”
Then there are the IRBs from Hell. Their timelines unrealistically delay research. You find yourself questioning their purpose, motivations, politics, or competence. You feel bullied, harassed, powerless, or pressured to change what you believe is right. And you feel crazier after talking to the IRB than you did before. I can tell it’s bad when I find myself handing over the human subjects application to support staff because I do not have the time or energy to deal with the IRB, or I change the study simply to please or avoid it. System-wide, these IRBs are avoided, with institutional staff advising on how to work around them. Saddest of all, you realize it is sane not to report any but the most serious adverse event to them. In my experience, these kinds of IRBs harm science.
As psychologists, I think it’s important to be role models of how to conduct studies well. Just as with any other relationship, when things go badly, it’s incumbent upon us to assess whether it’s us, the relationship, or the partner that is the problem. It is even more important to maintain and affirm good relationships. A good relationship with a great IRB is a wonderful asset to research.
In summary, sex and Internet research are amazing areas to study but are also good areas to borrow from in developing strategies to promote research.