The leading journal in psychological science is introducing new author guidelines as part of an effort to strengthen the reporting and analysis of findings in psychological research.
The new author guidelines for the journal Psychological Science are among several related initiatives that researchers, led by the Association for Psychological Science, are undertaking to promote the replicability of scientific studies and the use of sound research practices across all areas of the field.
In March 2013, for example, APS announced the launch of the Registered Replication Reports Initiative in the journal Perspectives on Psychological Science, edited by Barbara Spellman, University of Virginia, with the aim of providing a venue for the publication of organized, multi-center attempts to replicate important psychological research. In recognition of these efforts, the Center for Open Science, with support from the Laura and John Arnold Foundation, awarded APS a grant of $250,000 to fund qualified labs to participate in the initiative.
And APS is committed to supporting infrastructure, such as the Open Science Framework established by the Center for Open Science, that facilitates change both within and across scientific disciplines.
The new submission guidelines for Psychological Science build upon these groundbreaking initiatives. As Editor in Chief Eric Eich of the University of British Columbia notes in an editorial and interview, the guidelines are aimed at enhancing the reporting of research methods and promoting robust research practices.
Starting January 1, 2014, submitting authors will be required to state that they have disclosed all important methodological details, including excluded variables and additional manipulations and measures, as a way of encouraging methodological transparency.
To make it easier for authors to comply with this requirement, the editors will no longer enforce strict word limits on the Method and Results sections of the most common types of articles, Research Articles and Research Reports.
Psychological Science will also serve as the launch vehicle for a program to promote open communication within the research community, awarding “badges” to authors who have made their data, materials, and/or preregistered design and analysis plans publicly available.
In addition, the journal will encourage authors to use the “new statistics” of effect sizes, confidence intervals, and meta-analyses in an effort to avoid problems typically associated with null-hypothesis significance testing. To support this approach, the journal has published a statistics tutorial by Geoff Cumming of La Trobe University in Australia. “The New Statistics: Why and How” is freely available online.
Together, these efforts represent a fresh approach to encouraging and supporting sound science, one that may serve as a model for the broader scientific community as it grapples with issues of replicability and transparency.
“The initiatives we’ve talked about are, at most, steps in the right direction, not an ideal end state,” says Eich. “The issues of replicability and research practices are complex but not intractable if the community at large gets involved.”