When Things Don’t Go According to Plan

Reporting deviations in your preregistered research


Methodologists have embraced preregistration—the process of publicly sharing a research design and analysis plan before a study begins—as a way to prevent questionable research practices and add transparency to scientific studies. But many researchers end up deviating from those preregistered plans, and those deviations aren’t reported systematically, if at all, two scientists wrote in an article for Advances in Methods and Practices in Psychological Science (AMPPS).  

Psychological Science Is Strengthening Standards for Preregistration 

A decade ago, Psychological Science emerged as the first journal to award badges for studies that employ open science practices. Now, the journal has decommissioned those badges—including the insignia for preregistration of studies.* But that doesn’t mean the publication is dropping its preference for preregistered research.  

Instead, the journal will more closely scrutinize preregistered studies (as well as non-preregistered studies) for researcher degrees of freedom, new Editor-in-Chief Simine Vazire and Tom E. Hardwicke, Senior Editor for Statistics, Transparency & Rigor, write in an editorial in the journal. That change is largely driven by growing evidence that deviations from the plans submitted before a study begins are reported inconsistently, and often minimally (see related story).  

“The removal of the preregistration badge should not be taken as a sign that we do not value preregistration,” Hardwicke and Vazire write.  

Former editor Eric Eich introduced a trio of open practices badges in 2014. Experimenters who submitted their data to an open-access repository earned a blue open data badge, those who submitted materials to that repository earned an orange open materials badge, and those who submitted an open-access design and analysis plan created prior to data collection garnered the red preregistration badge. The journal had awarded 296 preregistration badges, and 844 badges total, through 2022. 

But Eich noted that the badges were intended as an interim incentive in the field’s drive toward scientific transparency and rigor. His immediate successor, Stephen Lindsay, insisted that “badges are not an end in themselves.” 

Currently, a Psychological Science article can receive a preregistration badge no matter the level of detail in the research plan. In response, Vazire and her editorial team will require more specificity on preregistered studies. When submitting manuscripts, authors must state explicitly which aspects (hypotheses, design, etc.) of each study were preregistered, and whether there were any deviations from the original plan. Authors are strongly encouraged to fill out a preregistration deviation disclosure table, designed by psychological scientists Emily C. Willroth and Olivia E. Atherton, in their supplementary information.  

“Even a flawed preregistration with a clear deviation disclosure table is often better than no preregistration, as editors, reviewers, and readers can better evaluate the amount and nature of flexibility in data collection and analyses,” Hardwicke and Vazire write in their editorial.  

Reviewers and editors will take into account the seriousness of the deviations from the authors’ preregistered plans, and any unregistered steps, with respect to increasing the risk of bias. Not all deviations or unregistered decisions are concerning, and in some cases deviating from a plan may be much better than sticking to it, Vazire said in an email.  

“Part of the purpose of peer review is to evaluate these types of things: How concerning are the researcher degrees of freedom in this study? How well-calibrated are the authors’ conclusions to the risk of bias in their study?” she said.  

Reviewers and editors will take the same care to evaluate these issues when studies are not preregistered. 

“This is crucial, because if only preregistered studies are scrutinized for researcher degrees of freedom, researchers will be incentivized not to preregister,” Vazire said. “Thus, to reward authors who take the risk of making their deviations and flexibility visible (by showing us their preregistered plans), we must also scrutinize the opportunities for flexibility and researcher degrees of freedom when there is no preregistration.” 

The ultimate aim of the changes is to make preregistration the default, when appropriate, rather than a variably used option for the studies the journal reviews and publishes, the editors say.  

* Psychological Science is not eliminating badges altogether. The journal is introducing a new Computational Reproducibility Badge, awarded to authors who take the necessary steps to ensure that reported results can be independently reproduced, within a reasonable time-frame, by a new team of Statistics, Transparency, and Rigor (STAR) editors.  

Emily C. Willroth of Washington University in St. Louis and Olivia E. Atherton of University of Houston provide a new reporting framework for scientists who alter their research plan after preregistering a study.  

“This framework provides a clear template for what to do when things do not go as planned,” the authors wrote. “We encourage researchers to adopt this framework in their own preregistered research, and for journals to implement structural policies around the transparent reporting of preregistration deviations.” 

Willroth and Atherton cited various reasons scientists change their plans, including unforeseen logistical problems, reviewer requests for additional data, and the emergence of new research practices after registration. But the field has no standards for clearly and completely reporting those deviations to reviewers, editors, and readers.  

“Even the best laid plans don’t work out sometimes,” they wrote. “Preregistration can still be a valuable tool for increasing the credibility of scientific findings, so long as preregistration deviations are transparently reported.” 

Preregistration was never meant to irreversibly lock scientists into a research plan. But it helps clarify which hypotheses and analyses researchers specified before data collection and which were more exploratory and driven by the data. APS’s flagship journal Psychological Science began incentivizing preregistration as part of its standards a decade ago, and Clinical Psychological Science and AMPPS have followed that practice. 

However, research has documented how frequently studies deviate from preregistered plans. A team of scientists in the Netherlands, for example, examined 459 preregistered studies and found that more than half of the articles omitted or added hypotheses after the original plans were submitted (van den Akker et al., 2023).  

Scholars at KU Leuven in Belgium examined 23 papers published in Psychological Science and found that all but two deviated from the preregistration plan and only one fully disclosed all the deviations. The nature of the disparities varied from inconsistent terminology to post hoc changes in statistical analyses (Claesen et al., 2021).  

Willroth and Atherton said they’ve struggled to find the best way to report deviations in their own preregistered work and to identify deviations in papers they’ve read as peer reviewers. They developed their proposed templates on the basis of their own experiences as well as feedback in the review process. They also surveyed 34 psychology journal editors, who reported that roughly 44% of the manuscripts they handled in the prior 2 years were preregistered.

Overall, the editors said that reported deviations would not significantly change their perceptions of a manuscript. But they said they would react negatively to deviations that went unreported and were only identified in the review process. Some said their opinions would depend on such factors as the reasons for the change, the impact on the effect size, and the sheer number of deviations.  

What’s more, the editors said they personally evaluated researchers’ adherence to their preregistered plans only about 65% of the time. Nearly one quarter of respondents said their journals’ author guidelines include instructions for reporting deviations, while 35% said they had no formal mechanisms to check for deviations. 

Willroth and Atherton shared their template with the survey participants and adjusted it according to the feedback. The editors, on average, said they would probably support a policy requiring authors to use the template for reporting deviations. 

The template calls for researchers to report the type of each deviation, along with the reason for and timing of the change, and it includes a table for unregistered steps. Researchers can record their original plan, describe the deviation, and note its impact on readers’ interpretation of the results. They could report, for example, changes in the planned sample size or the addition of a new hypothesis.  

The authors also recommended actions that researchers can take to prevent deviations, including reviewing relevant literature on the planned analytical approach prior to preregistering; including sufficient detail in the study plan to reduce the likelihood of taking a step that wasn’t included in the preregistration; documenting and registering standard operating procedures for the lab or for a given dataset; and having collaborators provide feedback on preregistrations in the same way they would a final manuscript. 

Willroth and Atherton also called on journals to implement policies that encourage transparent reporting of preregistration deviations.  

“The adoption of this framework … will alleviate burden on reviewers and editors,” they wrote, “and will increase the transparency and credibility of preregistered research.” 



Claesen, A., Gomes, S., Tuerlinckx, F., & Vanpaemel, W. (2021). Comparing dream to reality: An assessment of adherence of the first generation of preregistered studies. Royal Society Open Science, 8(10), Article 211037. https://doi.org/10.1098/rsos.211037 

van den Akker, O. R., van Assen, M. A. L. M., Enting, M., de Jonge, M., Ong, H. H., Rüffer, F., Schoenmakers, M., Stoevenbelt, A. H., Wicherts, J. M., & Bakker, M. (2023). Selective hypothesis reporting in psychology: Comparing preregistrations and corresponding publications. Advances in Methods and Practices in Psychological Science, 6(3), 1–15. https://doi.org/10.1177/25152459231187988 

Willroth, E. C., & Atherton, O. E. (2024). Best laid plans: A guide to reporting preregistration deviations. Advances in Methods and Practices in Psychological Science, 7(1). https://doi.org/10.1177/25152459231213802
