- Mission Statement
- Article Type Description
- Funding Opportunity
- Instructions for Authors
- Instructions for Reviewers
- Ongoing Replication Projects
Replicability is a cornerstone of science. Yet replication studies rarely appear in psychology journals. The new Registered Replication Reports article type in Perspectives on Psychological Science fortifies the foundation of psychological science by publishing collections of replications based on a shared and vetted protocol. It is motivated by the following principles:
- Psychological science should emphasize findings that are robust, replicable, and generalizable.
- Direct replications are necessary to estimate the true size of an effect.
- Well-designed replication studies should be published regardless of the size of the effect or statistical significance of the result.
Traditional psychology journals emphasize theoretical and empirical novelty rather than reproducibility. When journals consider a replication attempt, the process can be an uphill battle for authors. Given the challenges associated with publishing replication attempts, researchers have little incentive to conduct such studies in the first place. Yet, only with multiple replication attempts can we adequately estimate the true size of an effect.
A central goal of publishing Registered Replication Reports is to encourage replication studies by modifying the typical submission and review process. Authors submit a detailed description of the method and analysis plan. The submitted plan is then sent to the author(s) of the replicated study for review. Because the proposal review occurs before data collection, reviewers have an incentive to make sure that the planned replication conforms to the methods of the original study. Consequently, the review process is more constructive than combative. Once the replication plan is accepted, it is posted publicly, and other laboratories can follow the same protocol in conducting their own replications of the original result. Those additional replication proposals are vetted by the editors to make sure they conform to the approved protocol.
The results of the replication attempts are then published together in Perspectives on Psychological Science as a Registered Replication Report. Crucially, the results of the replication attempts are published regardless of the outcome, and the protocol is predetermined and registered in advance. The conclusion of a Registered Replication Report should avoid categorizing each result as a success or failure to replicate. Instead, it should focus on the cumulative estimate of the effect size. Together with the separate results of each replication attempt, the journal will publish a figure illustrating the measured effects from each study and a meta-analytic effect size estimate. The details of the protocol, including any stimuli or code provided by the original authors or replicating laboratories as well as data from each study, will be available on the Open Science Framework (OSF) website and will be linked from the published report and the APS website for further inspection and analysis by other researchers. Once all the replication attempts have been collected into a final report, the author(s) of the original article will be invited to submit a short, peer-reviewed commentary on the collection of replication attempts.
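The cumulative, meta-analytic effect size estimate mentioned above is typically an inverse-variance weighted average of the per-lab effects, so that larger and more precise replications contribute more to the pooled estimate. The following is a minimal fixed-effect sketch of that computation; the lab-level effect sizes and standard errors are hypothetical, and actual Registered Replication Reports may use more sophisticated (e.g., random-effects) models chosen in consultation with meta-analysis experts.

```python
import math

def fixed_effect_estimate(effects, std_errors):
    """Combine per-study effect sizes into one pooled estimate.

    Each study is weighted by the inverse of its sampling variance
    (1 / SE^2), the standard fixed-effect meta-analytic weighting.
    Returns the pooled effect size and its standard error.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical effect sizes (Cohen's d) and standard errors
# from three replicating labs:
effects = [0.31, 0.12, 0.22]
std_errors = [0.10, 0.08, 0.12]

d, se = fixed_effect_estimate(effects, std_errors)
ci = (d - 1.96 * se, d + 1.96 * se)  # approximate 95% confidence interval
```

A forest plot of the individual lab estimates alongside this pooled value is the kind of figure the report would include.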
This publication model provides many broader benefits to psychological science:
- Because the registered replication attempts are published regardless of outcome, researchers have an incentive to replicate classic findings before beginning a new line of research extending those findings.
- Subtleties of methodology that rarely appear in the method sections of traditional journals will emerge from the constructive review process, because original authors will have an incentive to make them known in order to ensure the replications are designed properly.
- Multiple labs can attempt direct replications of the same finding, and all such replication attempts will be interlinked, providing a cumulative estimate of the true size of the effect.
- The emphasis on estimating effect sizes rather than on the dichotomous characterization of a replication attempt as a success or failure based on statistical significance could lead to greater awareness of the shortcomings of traditional null-hypothesis significance testing.
- Authors and journalists will have a source of vetted, robust findings and a stable estimate of the effect size for controversial findings.
- Researchers may hesitate to publish a surprising result from a small-sample study without first verifying that result with an adequately powered design.
A Registered Replication Report consists of a collection of independently conducted, direct replications of an original study, all of which follow a shared, predetermined protocol. The collection of replications will be published as a single article in Perspectives on Psychological Science, and all researchers contributing replications will be listed as authors. The initial submission will be only the plan (as the results will not have been collected yet), but the final publication will include the following:
- A brief introduction explaining the importance of the original study and why a more precise, cumulative estimate of the size and robustness of the reported effect will benefit the field.
- A detailed description of the shared protocol used by all replication teams.
- A figure showing the effect sizes measured by each replication team, along with a meta-analytic estimate of the effect size. (This figure will be generated by the editors, in consultation with experts in meta-analytic techniques.)
- Brief descriptions of the results and analyses for each individual replication attempt (written separately by each replication team).
- A brief discussion of the cumulative findings.
The author of the original article that was the focus of the collected replications will be offered an opportunity to submit a short, peer-reviewed commentary on the Registered Replication Report. The published report will link to more extensive reports from each replicating lab on the Open Science Framework website, and all replicating labs are expected to post the data from their replication attempts. Additional replications completed after the initial Registered Replication Report appears in print should be posted on the Open Science Framework, and those results may be incorporated into a meta-analytic effect size estimate published in Perspectives.
The Center for Open Science, with support from the Laura and John Arnold Foundation, has contributed a $250,000 grant to the Registered Replication Report initiative. This fund is dedicated exclusively to providing support for qualified labs that wish to participate in a replication project and can cover costs such as:
- Subject testing fees
- Trained research assistants
- Trained coders
- Expendable materials
- Other materials
- Scan time
- Access to equipment
- Translations of protocols, materials, etc.
- Consulting on meta-analysis for the final report or on the analysis as a whole
- Programming of experimenter scripts to be used by participating labs
- Programming of analysis scripts in R
Interested researchers will have the opportunity to discuss potential funding with the editors during the replication proposal stage.
Detailed instructions for authors can be found here.
Detailed instructions for reviewers can be found here.
A list of ongoing replication projects and instructions for joining a project can be found here. You can also get announcements and updates from the editors by joining the Registered Replication Report Google group.