Putting Psychological Science to the Test: Transparency and Reproducibility-Related Research Practices (2014–2017)

Concerns about the credibility of some scientific research have prompted calls for research practices that make it easier to assess and reproduce published findings. These practices, aimed at enhancing transparency and reproducibility, could also increase credibility, facilitate self-correction, and reduce the number of misleading findings.

In a recent article in Perspectives on Psychological Science, Tom E. Hardwicke of the University of Amsterdam and colleagues estimated the prevalence of transparency- and reproducibility-related research practices across psychology studies published between 2014 and 2017.

The authors used previous investigations in biomedicine and the social sciences to guide their approach to estimating several indicators of transparency and reproducibility. Their search of the Scopus database, the largest abstract and citation database of peer-reviewed literature, returned 224,556 articles from 1,323 journals across 33 countries, in areas such as clinical psychology, neuropsychology, and applied psychology. From this pool, they used a random-number generator to select a sample of 250 articles for coding.
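As an illustration only (not the authors' actual pipeline), the sampling step amounts to drawing 250 positions without replacement from the 224,556 search results; the seed below is hypothetical:

    import random

    POOL_SIZE = 224_556   # articles matched by the Scopus search (2014-2017)
    SAMPLE_SIZE = 250     # articles selected for manual coding

    rng = random.Random(2021)  # hypothetical fixed seed, for reproducibility
    # Draw 250 distinct 1-based positions into the search results,
    # without replacement, so no article is coded twice.
    sampled_positions = rng.sample(range(1, POOL_SIZE + 1), SAMPLE_SIZE)

    print(sorted(sampled_positions)[:10])  # first few selected positions

Sampling without replacement is the natural choice here, since coding the same article twice would bias the prevalence estimates.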

For each article, Hardwicke and colleagues coded the characteristics of the study (including its area, subjects, and design) and extracted data on the following transparency- and reproducibility-related practices (a schematic sketch of one such coded record follows the list):

  • Article availability (whether it was open access)
  • Materials and protocol availability
  • Data availability
  • Analysis-script availability
  • Preregistration
  • Funding and conflict-of-interest statements
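A minimal sketch of what one coded record might look like, assuming a simple Python data class; the field names are illustrative, not the authors' actual codebook:

    from dataclasses import dataclass

    @dataclass
    class ArticleRecord:
        """One coded article; fields mirror the practices listed above."""
        scopus_id: str                # hypothetical identifier
        open_access: bool             # article publicly available?
        materials_statement: bool     # materials-availability statement present?
        materials_retrievable: bool   # do the stated links actually resolve?
        protocol_available: bool
        data_statement: bool
        analysis_script_shared: bool
        preregistered: bool
        funding_statement: bool
        coi_statement: bool

    # Example: an article whose materials statement points to a broken link.
    example = ArticleRecord(
        scopus_id="2-s2.0-XXXX",      # placeholder ID
        open_access=True,
        materials_statement=True,
        materials_retrievable=False,  # statement present, but link is broken
        protocol_available=False,
        data_statement=False,
        analysis_script_shared=False,
        preregistered=False,
        funding_statement=True,
        coi_statement=True,
    )

Separating "statement present" from "actually retrievable" matters: as reported below, several articles claimed to share materials that could not in fact be accessed.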

Of the 250 randomly selected articles, 237 were in English; 65% of these were publicly available (open access). Almost three-quarters (183) involved primary data, but only 26 of those (14%) contained a statement regarding the availability of original research materials, such as survey instruments, software, or stimuli. Moreover, seven of these 26 articles had broken links, so the materials were not actually available. No articles provided study protocols.

Of the 188 articles that involved primary or secondary data, only four contained data-availability statements. One data set required a fee to access, and another appeared incomplete. Only one of the 188 articles shared an analysis script. Five included preregistration statements; these described hypotheses and methods but not analysis plans.

Regarding funding and conflict-of-interest statements, 62% of the articles included a statement about funding sources, and 39% included a conflict-of-interest statement. Of those with a conflict-of-interest statement, 86% declared no conflicts.

These findings suggest that “although there is evidence that some individual methodological reform initiatives have been effective in specific situations … their collective, broader impact on the psychology literature during the examined period was still fairly limited in scope,” Hardwicke and colleagues wrote.

The authors acknowledged the limitations of their study, which relied on a random sample of 250 articles, and noted that their estimates might not generalize to specific contexts (e.g., articles published in particular journals). Beyond providing a baseline estimate of transparency- and reproducibility-related practices, however, the study also shows that even when these practices are adopted, they may not be implemented correctly (e.g., broken links to shared materials).

Hardwicke and colleagues propose that future studies add a temporal dimension by comparing new data against the baseline established here, tracking how the adoption of transparency- and reproducibility-related practices in psychological science evolves.

Works cited

Hardwicke, T. E., Thibault, R. T., Kosie, J. E., Wallach, J. D., Kidwell, M. C., & Ioannidis, J. P. A. (2021). Estimating the prevalence of transparency and reproducibility-related research practices in psychology (2014–2017). Perspectives on Psychological Science. Advance online publication. https://doi.org/10.1177/1745691620979806

