Practice

Three Cutting-Edge Approaches to Addressing Critical Issues in Meta-Analyses

The March issue of Advances in Methods and Practices in Psychological Science concludes a special focus on multilevel modeling and meta-analysis begun in the September issue, and includes three articles exploring novel approaches to enhancing the rigor of meta-analyses.

In Advancing Meta-Analysis With Knowledge-Management Platforms: Using metaBUS in Psychology, Frank Bosco (Virginia Commonwealth University), James G. Field (West Virginia University), Kai R. Larsen (University of Colorado Boulder), Yingyi Chang (Virginia Commonwealth University), and Krista Uggerslev (Northern Alberta Institute of Technology) introduce an online interactive tool that enables researchers to search more than a million research results and obtain instant meta-analytic data. MetaBUS relies on standards-based protocols in combination with human coding to organize research findings into an accessible database, offering the potential to advance research and education in psychological science, the researchers say.

In Enriching Meta-Analytic Models of Summary Data: A Thought Experiment and Case Study, Blakeley B. McShane and APS Fellow Ulf Böckenholt (Northwestern University) pose this question: What if, even when only summary data are available, meta-analysts acted as though they possessed individual-level data from each study and considered the model specifications those data might fit? This thought experiment could allow researchers to better understand the complexity of the data they are analyzing and move toward richer summary-data approaches when that complexity warrants it. The authors present cases in which the common meta-analytic approach is appropriate, such as when estimating the overall effect on a single dependent variable in a single group measured across multiple studies. They also present cases that warrant different approaches, including multilevel modeling, such as when estimating effects on multiple dependent variables, across multiple groups, or with covariates; a sketch of the contrast follows below.
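To make the contrast concrete, here is a minimal sketch in R, using the widely used metafor package; the data frame and values are hypothetical, not drawn from the article. A conventional random-effects model treats every estimate as independent, whereas a multilevel model nests estimates within studies when studies contribute multiple, dependent effect sizes:

library(metafor)

# Hypothetical toy data: 4 studies, each contributing 2 effect-size estimates
dat <- data.frame(
  study = rep(1:4, each = 2),
  yi    = c(0.30, 0.25, 0.10, 0.15, 0.40, 0.35, 0.20, 0.22),  # effect sizes
  vi    = rep(0.02, 8)                                        # sampling variances
)
dat$esid <- seq_len(nrow(dat))  # unique ID for each estimate

# Conventional random-effects model: treats all 8 estimates as independent
fit_simple <- rma(yi, vi, data = dat)

# Multilevel model: estimates nested within studies, acknowledging dependence
fit_ml <- rma.mv(yi, vi, random = ~ 1 | study/esid, data = dat)

summary(fit_ml)

The multilevel specification partitions heterogeneity into between-study and within-study components, which is the kind of richer summary-data model the thought experiment points toward.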

In Obtaining Unbiased Results in Meta-Analysis: The Importance of Correcting for Statistical Artifacts, Brenton M. Wiernik (University of South Florida) and Jeffrey A. Dahlke (Human Resources Research Organization, Alexandria, Virginia) provide R code for correcting statistical artifacts, including sampling-error variance, measurement unreliability, and range restriction, that can bias the results of individual studies and meta-analyses and lead to inaccurate conclusions about mean effect sizes and the heterogeneity of studies. The researchers also describe how to estimate the effects of these artifacts in different research designs and correct for their impact.
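As a rough illustration of the kind of corrections at issue, here is a minimal base-R sketch, not the article's published code; the function names and example values below are hypothetical, and the formulas are the classic psychometric corrections for attenuation and for direct range restriction:

# Correct an observed correlation for unreliability in both measures
# (correction for attenuation): rho = r / sqrt(rxx * ryy)
correct_attenuation <- function(r, rxx, ryy) {
  r / sqrt(rxx * ryy)
}

# Correct for direct range restriction (Thorndike Case II),
# where u = restricted SD / unrestricted SD of the predictor
correct_range_restriction <- function(r, u) {
  U <- 1 / u
  (U * r) / sqrt((U^2 - 1) * r^2 + 1)
}

# Hypothetical example: observed r = .30, reliabilities .80 and .70, u = .60
r_obs  <- 0.30
r_dis  <- correct_attenuation(r_obs, rxx = 0.80, ryy = 0.70)
r_full <- correct_range_restriction(r_dis, u = 0.60)
round(c(observed = r_obs, disattenuated = r_dis, corrected = r_full), 3)

Even this toy example shows how sharply artifacts can depress an observed effect: the corrected correlation is roughly twice the observed one, which is why uncorrected meta-analytic estimates can mislead.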

In an accompanying editorial, Frederick L. Oswald (Rice University) and APS Fellow Jennifer L. Tackett (Northwestern University) emphasize the importance of practical guidance and future-oriented thinking for the advancement of multilevel modeling and meta-analytic research. These three approaches to meta-analysis illustrate not only the need to improve researchers' approaches to complex data but also how advances in technology, analytic methods, and open-science practices might shape the future of meta-analysis.

