Perspectives Provides Strategies for Maximizing Informational Value of Research

It’s an exhilarating time in psychological science, as momentum continues to build toward improving research standards and practices across the field.

A special section in the November issue of Perspectives on Psychological Science is part of an ongoing effort to involve researchers in this movement by providing a set of cutting-edge strategies that can be used to improve the way research is conducted and evaluated.

According to psychological scientist Alison Ledgerwood, associate professor at the University of California, Davis, and editor of the special section, this new collection of articles builds on the foundation laid in a special section published in the May 2014 issue of Perspectives, providing researchers with a concrete toolkit for enhancing their research:

“Together, these articles emphasize the importance of thinking about how we can boost the information provided by a given study and how we can synthesize studies in a way that lets us learn as much as possible from them,” Ledgerwood writes in her introduction to the section. “Each article provides concrete strategies that we can implement—either as individual researchers or together as a field—to maximize the knowledge we get from the work that we do.”

Learn more about these cutting-edge strategies in the following articles, which are available free to the public.


Perspectives on Psychological Science

Special Section: Moving Toward a Cumulative Science: Maximizing What Our Research Can Tell Us  


Introduction

Alison Ledgerwood

In May 2014, Perspectives on Psychological Science published a special section that provided practical guidance on best practices for increasing and judging the informational value and quality of research. Continuing that theme, the following articles offer practical advice that researchers can use to get the most accurate picture of their results and to synthesize findings across studies.


You Cannot Step Into the Same River Twice: When Power Analyses Are Optimistic

Blakeley B. McShane and Ulf Böckenholt

Statistical power depends on the size of the effect of interest, but even under nearly identical replication conditions, effect sizes typically vary from study to study. Standard power formulas ignore this between-study variation and therefore overestimate the power of many studies. A new formula that accounts for the variation can help researchers calculate power and set sample sizes for future studies.
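
To make the intuition concrete, here is a minimal sketch of the idea, assuming a simple two-sided z test (this is not the authors' exact formula, and the function names are illustrative). When the replication's true effect is itself drawn from a distribution with standard deviation tau, the estimate is marginally more variable than the standard calculation assumes:

    import numpy as np
    from scipy.stats import norm

    def power_classic(delta, se, alpha=0.05):
        """Standard power for a two-sided z test of a true effect delta."""
        z = norm.ppf(1 - alpha / 2)
        return norm.sf(z - delta / se) + norm.cdf(-z - delta / se)

    def power_heterogeneous(delta, se, tau, alpha=0.05):
        """Expected power when the replication's true effect is drawn from
        N(delta, tau^2): the estimate is marginally N(delta, se^2 + tau^2),
        while the test statistic still divides by se. Illustrative sketch,
        not the formula from the article."""
        z = norm.ppf(1 - alpha / 2)
        mu = delta / se
        sd = np.sqrt(se**2 + tau**2) / se
        return norm.sf((z - mu) / sd) + norm.cdf((-z - mu) / sd)

With delta = 0.5 and se = 0.2, the standard calculation gives power of about .71; adding modest heterogeneity (tau = 0.2) lowers the expected power to about .65.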


A Duty to Describe: Better the Devil You Know Than the Devil You Don’t

Sacha D. Brown, David Furrow, Daniel F. Hill, Jonathon C. Gable, Liam P. Porter, and W. Jake Jacobs

Researchers often do not adequately describe the sampling decisions made in their studies, which can make it difficult for others to replicate their findings. A new tool — the Replicability and Meta-Analysis Suitability Inventory — can be used to evaluate the descriptive adequacy of published manuscripts and to help guide sampling decisions made during study construction and during the peer-review process.


Beyond Power Calculations: Assessing Type S (Sign) and Type M (Magnitude) Errors

Andrew Gelman* and John Carlin

Researchers often rejoice when they find a significant result, but what does this result actually mean? And how much should we trust it? When small sample sizes are used to study small effects, there is a surprisingly high likelihood that the significant result will be in the wrong direction and that the effect will be greatly overestimated. Two new design calculations gauge these errors, helping researchers judge the trustworthiness of their findings.

*Andrew Gelman will be speaking in an invited symposium at the 27th APS Annual Convention in New York, NY, USA.
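
The paper supplies R code for these design calculations (a retrodesign() function); a minimal Python transcription of the same idea might look like the following, where the numeric example afterward is illustrative rather than taken from the article:

    import numpy as np
    from scipy.stats import norm

    def retrodesign(true_effect, se, alpha=0.05, n_sims=100_000, seed=0):
        """Power, Type S (sign) error rate, and exaggeration ratio (Type M)
        for a hypothesized true effect and standard error, in the spirit of
        Gelman and Carlin's design calculations."""
        z = norm.ppf(1 - alpha / 2)
        p_hi = norm.sf(z - true_effect / se)    # significant and positive
        p_lo = norm.cdf(-z - true_effect / se)  # significant and negative
        power = p_hi + p_lo
        type_s = p_lo / power  # wrong-sign share of significant results
        # Exaggeration ratio by simulation: mean |estimate| among
        # significant results, relative to the true effect.
        rng = np.random.default_rng(seed)
        est = true_effect + se * rng.standard_normal(n_sims)
        sig = np.abs(est) > z * se
        exaggeration = np.abs(est[sig]).mean() / true_effect
        return power, type_s, exaggeration

For a small effect measured noisily, say retrodesign(0.1, 0.37), power is only about 6%, roughly one in five significant results has the wrong sign, and significant estimates overstate the true effect by almost a factor of nine.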


Analytic Review as a Solution to the Misreporting of Statistical Results in Psychological Science

John Sakaluk, Alexander Williams, and Monica Biernat

Research suggests that between one-third and one-half of published articles contain some type of statistical reporting error. An analytic review process could help reduce the prevalence of misreported findings: manuscripts would still be vetted for theory, methodology, and importance of contribution, but they would also be reviewed to ensure that the reported analyses and statistical values are accurate.
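
One piece of such a review can be automated: recomputing a reported p value from its test statistic and degrees of freedom. A minimal sketch for reported t tests (the function name and the rounding tolerance are illustrative choices, not part of the article):

    from scipy.stats import t as t_dist

    def check_t_report(t_value, df, reported_p, two_tailed=True, tol=0.005):
        """Recompute the p value for a reported t test and flag any
        mismatch larger than rounding to two decimals would explain."""
        p = t_dist.sf(abs(t_value), df)
        if two_tailed:
            p *= 2
        return p, abs(p - reported_p) <= tol

    # A report of "t(28) = 2.20, p = .04" recomputes to p of about .036,
    # which is consistent at rounding tolerance.
    recomputed, consistent = check_t_report(2.20, 28, 0.04)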


Community-Augmented Meta-Analyses: Toward Cumulative Data Assessment

Sho Tsuji, Christina Bergmann, and Alejandrina Cristia

Community-augmented meta-analysis (CAMA) is a new tool that combines aspects of a meta-analysis and an open-access database. Researchers collect and code relevant studies on a specific topic and deposit them in a public database; others can then use, and add to, the existing collection. CAMA can help researchers collect and integrate past research and can provide a platform for previously unpublished “file-drawer studies.”
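
The payoff of a shared, cumulative table of effect sizes is that anyone can recompute the synthesis as studies are added. As a minimal sketch, assuming a hypothetical deposited table of effect sizes and sampling variances (this is a standard fixed-effect meta-analysis, not code from the article):

    import numpy as np

    def fixed_effect_summary(effects, variances):
        """Inverse-variance-weighted (fixed-effect) meta-analytic
        estimate and its standard error."""
        effects = np.asarray(effects, dtype=float)
        weights = 1.0 / np.asarray(variances, dtype=float)
        pooled = np.sum(weights * effects) / np.sum(weights)
        se = np.sqrt(1.0 / np.sum(weights))
        return pooled, se

    # Hypothetical deposited studies: Cohen's d and its sampling variance.
    pooled_d, pooled_se = fixed_effect_summary([0.42, 0.15, 0.31],
                                               [0.04, 0.02, 0.09])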


p-Curve and Effect Size: Correcting for Publication Bias Using Only Significant Results

Uri Simonsohn, Leif D. Nelson, and Joseph P. Simmons

Because of publication bias, calculating effect sizes from published research alone can produce inflated estimates. Estimating effect size from the p-curve (the distribution of statistically significant p values across studies) can yield a better estimate than traditional techniques when nonsignificant findings have been suppressed and when p-hacking has occurred.
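
The estimation logic is compact. For a candidate true effect d, compute each significant result's probability of being at least as extreme as observed, conditional on being significant; if d is correct, those conditional probabilities are uniformly distributed, so one searches for the d that makes them closest to uniform. A sketch for two-sample t tests with equal cell sizes (simplified relative to the authors' R code, which handles more designs; the function names are illustrative):

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import kstest, nct, t as t_dist

    def pp_values(d, t_obs, n_per_cell):
        """Probability of a t at least as large as observed, conditional
        on p < .05, if the true effect is d. Inputs are assumed to be
        significant, positive t values from two-sample t tests."""
        t_obs = np.asarray(t_obs, dtype=float)
        n = np.asarray(n_per_cell, dtype=float)
        df = 2 * n - 2
        ncp = d * np.sqrt(n / 2)          # noncentrality parameter
        t_crit = t_dist.ppf(0.975, df)    # two-tailed .05 cutoff
        return nct.sf(t_obs, df, ncp) / nct.sf(t_crit, df, ncp)

    def pcurve_effect(t_obs, n_per_cell):
        """Pick the d whose implied conditional probabilities are closest
        to uniform, measured by the Kolmogorov-Smirnov distance."""
        loss = lambda d: kstest(pp_values(d, t_obs, n_per_cell),
                                "uniform").statistic
        return minimize_scalar(loss, bounds=(0.0, 2.0),
                               method="bounded").x

Because the estimator conditions on significance, it uses only the significant results, which is exactly what makes it workable when nonsignificant findings never reach print.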


What Lies Within: Superscripting References to Reveal Research Trends

Eric M. Anicich

One way to judge an article’s contribution to the field is to count how many times it has been cited; however, this metric lumps confirmatory and contradictory citations into a single number. A new method for indicating how cited research relates to the citing article’s findings could help researchers situate new findings within the broader literature.

 

