Presidential Column

A New Day for Human Subjects Research Participation

Susan T. Fiske

No, the recent Facebook mood-manipulation flap is not the reason, but more on that later. A recent government initiative will likely change human subjects protection programs for the better. When does that ever happen with new regulations? What may be coming next year:

(a) clearer boundaries on what is “human subjects research” requiring review;

(b) a new, streamlined, “excused” category not subject to institutional review board (IRB) oversight;

(c) a clarification of “minimal risk”;

(d) elimination of continuing review of “expedited” research; and

(e) simplification of informed consent, among other innovations.

Here is some background for IRB geeks: The revision process began in July 2011, when the US Department of Health and Human Services (HHS) issued an Advance Notice of Proposed Rulemaking (ANPRM) for the “Common Rule” (Code of Federal Regulations, Title 45, Part 46; 45 CFR 46), netting about 1,000 public comments.

As a researcher, my first reaction was surprise at how sensible the proposed revisions were (mostly). As a decade-long IRB chair, my first reaction was gratitude for rescue from overload.

The potential good news matters for our scientists, for our research participants, and for human subjects research oversight. Recognizing the ANPRM’s import, the National Academy of Sciences convened a National Research Council (NRC) panel on the implications for behavioral and social sciences. The ANPRM’s proposed revisions had come from a more biomedical model, so our sciences’ perspectives seemed useful. That NRC panel, with 14 members and me as chair, completed a fast-track report. Here are some recommendations especially relevant to most APS members who do human subjects research:

(a) Define publicly available information as not human subjects research, even if information is identifiable, as long as individuals have no reasonable expectation of privacy (Rec 2.3).

Public data could include:

  • Observing, coding, and recording behavior in public places (including Internet and other digital data).

(b) Adopt ANPRM’s new category of “excused” research for no-greater-than-minimal information risk, even if these methods query physical or psychological well-being (Rec 2.5–2.7).

“Excused research” could include:

  • Use of pre-existing data, even with private information, if a data protection plan is filed.
  • Benign interventions or interactions that are familiar in everyday life (educational tests, surveys, focus groups).

(c) Operationalize the new “excused” category (Rec 2.8, 2.9).

This process could include:

  • Registration with the IRB.
  • Start of the study within one week.
  • Studies with interaction or intervention, if subjects consent.
  • A data protection plan.
  • IRB small-subset audit — prospectively (while awaiting approval) or retrospectively (e.g., once a year, to monitor investigator judgments).

(d) Clarify “minimal risk” (Rec 3.1).

This clarification could include:

  • Treating routine “educational examinations or tests” as minimal risk (besides the medical and psychological ones already covered).
  • Specification of the reference category “the general population” (because everyday risks of subpopulations differ but should not justify differing standards).
  • No longer identifying certain populations as necessarily “vulnerable to coercion and undue influence,” requiring additional but unspecified protections (Rec 3.2).
  • Taking into account whether and how risks are minimized (Rec 3.3).

(e) Distinguish the “excused” (above) and “expedited” categories.

Expedited (not full-IRB) review could include:

  • Decisions based on the specific research procedures and subject populations (Rec 3.4).
  • Elimination of required continuing review of expedited research (Rec 3.5).

(f) Simplify and clarify informed consent.

Streamlined consent could include:

  • Consent processes tailored to context and population, not standardized forms (Rec 4.1, 4.2).
  • Separate statements regarding institutional or sponsor liability (Rec 4.3).
  • Inclusion of populations other than “competent adults” in excused studies, using tailored consent processes (Rec 4.4).
  • Removal of proposed ANPRM requirement for re-consent for potential future use of pre-existing, de-identified non-research or research data (Rec 4.5).

A separate editorial that I coauthored with Robert Hauser, director of the NRC Division of Behavioral and Social Sciences and Education, summarizes the NRC panel’s Big Data recommendations. I won’t repeat those here, except to say that the panel recommends not using the Health Insurance Portability and Accountability Act (HIPAA) as a privacy standard governing research. It both overprotects (includes some unnecessary constraints) and underprotects (does not guarantee anonymity). HIPAA would be bad news in the age of Internet and administrative data. More tailored approaches work better.

This brings us back to the Facebook flap. For those who were off the planet in June, PNAS published an article that used a pre-existing Facebook experiment with nearly 700,000 users’ newsfeeds. In practice, Facebook already relays to users’ newsfeeds only a fraction of their friends’ posts, using various proprietary algorithms presumably determined by experimenting to see what keeps users engaged. Apparently unaware of this habitual practice, many users were outraged at being manipulated without any consent besides the user agreement that nobody reads.

In the PNAS article, the valid scientific question was whether seeing mostly their friends’ cheerful (versus mostly gloomy) posts would make users themselves more cheerful in their own posts — or make them envious and depressed. The academic authors involved did run their reanalysis past their IRB, which declared it exempt from review as pre-existing data. I served as editor for the study and, with expert peer review and ethical review, approved publication of the innovative and important paper. Although the public uproar focused on Facebook manipulating user newsfeeds, the ethical standards for Facebook’s internal review are not available because Facebook as a commercial enterprise is not obligated to follow the HHS Common Rule for human subjects protection. PNAS expressed concern about these issues, and Facebook is reportedly revising its internal ethical review.

Meanwhile, we can only hope that HHS revises its Common Rule covering academic research and that the NRC report is useful in representing social and psychological science.

References:

Department of Health and Human Services, Office of the Secretary. (2011, July 26). 45 CFR Parts 46, 160, and 164; Food and Drug Administration, 21 CFR Parts 50 and 56. Human Subjects Research Protections: Enhancing Protections for Research Subjects and Reducing Burden, Delay, and Ambiguity for Investigators. Federal Register, 76(143), 44512–44531.

National Research Council. (S. T. Fiske, Panel Chair). (2014). Proposed Revisions to the Common Rule for the Protection of Human Subjects in the Behavioral and Social Sciences. Washington, DC: The National Academies Press. Produced by the NRC Board on Behavioral, Cognitive, and Sensory Sciences, Barbara Wanchisen, Director.

Fiske, S. T., & Hauser, R. M. (2014). Protecting human research participants in the age of big data. Proceedings of the National Academy of Sciences USA.

Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences USA, 111(24), 8788–8790.

Verma, I. M. (2014). Editorial expression of concern: Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences USA, 111(29), 10779.

Comments

If these changes are implemented, this can only be good news for behavioral science. None of the changes indicated implicate bad ethical practice, and speeding up research is an essential ethical practice in a modern world.

Long overdue reforms. Just how long? I was the chair of the UCLA IRB in the late 1970s, and they were overdue then.

BRAVO!!! Long overdue. Now if we can just get these recommendations implemented.

Thanks, Susan, for your hard work on this. All psychological researchers should be grateful for your time and effort. I hope it pays off.

Does anyone know what the rationale — or at least irrational basis — might be for the common requirement that there be continuing IRB approval to analyze data from previously completed studies when there is no possibility of linking information to participants’ identities?

When I asked our IRB chair about this I was told that our IRB regulations — many of which are silly and ethically irrelevant — are mandated by federal regulations. I was on our IRB for 14 years and noticed clear “mission creep” away from protecting participants toward avoiding lawsuits, and then perhaps toward the IRB feeling it is their job to closely monitor the details of what researchers are doing, whether relevant to protecting participants or not. Does anyone know if there is a basis in federal regulations (or the behavior of federal funding agencies) for this sort of thing — and if so, what that basis is and what documents explain this?

