Do “Brain-Training” Programs Work?
Psychological Science in the Public Interest (Volume 17, Number 3)
Feel like your concentration is slipping? Want to shore up your problem-solving skills? Interested in preventing general age-related cognitive decline? If this describes you, then the brain-training industry has a solution . . . or does it?
The brain-training industry is a multibillion-dollar enterprise built on the promise that playing simple cognitive games can improve a wide variety of cognitive skills used in daily life. In the current issue of Psychological Science in the Public Interest (Volume 17, Number 3), psychological scientist Daniel J. Simons and colleagues review research that calls these claims into question.
Many brain-training programs are based on the idea that practicing simple cognitive skills in a limited context will lead to improvements on a wide range of skills in everyday life. But what does the research say? Simons and his colleagues examined the findings of studies published in peer-reviewed scientific journals, including those cited on the websites of leading brain-training companies.
Their review of the literature found that brain-training tasks seem to improve performance on the trained tasks themselves; however, there is less evidence that cognitive training improves performance on closely related – but not identical – tasks, and very little evidence that it improves performance on distantly related tasks or improves everyday cognitive performance. In reviewing the literature, the authors found that many of the studies suffered from methodological problems and did not conform to best practices for research.
So where do we go from here? Simons and colleagues provide recommendations for researchers, funding agencies, journalists, policymakers, and the public. The recommendations for researchers center on strengthening the methodology and analysis plans of brain-training studies. Preregistering studies, adequately randomizing the assignment of subjects to treatment and comparison groups, using sufficient sample sizes, correcting for multiple comparisons, and acknowledging conflicts of interest are just some of the best practices the authors recommend for the field.
Funding organizations need to demand that studies be preregistered, run using best practices, and fully and transparently reported. They must also commit to providing adequate funds for the rigorous, large-scale studies this area of research requires, while weighing opportunity costs relative to other interventions. Journalists who cover brain-training research should confirm that the claims made about studies match the evidence they provide and be critical in their coverage so that the public is not misled about the findings of such research.
Many of those who are most enticed by the claims of brain-training companies are those who are the most vulnerable: children and adults with cognitive deficits, adults experiencing cognitive decline, and those with mental health issues. Policymakers need to more critically evaluate the claims of brain-training companies and require more rigorous standards of evidence for the benefits of these programs.
Finally, consumers need to carefully weigh the costs and benefits of using brain-training programs. If a person’s goal is to improve performance on a specific trained task, then brain training may help with that goal. If a person wishes to improve a more general set of cognitive skills, then brain training is most likely not the way to do it. The public needs to be skeptical of brain-training programs and consider the quality of the science behind companies’ claims.
Brain Training Pessimism but Applied-Memory Optimism
By Jennifer A. McCabe (Goucher College), Thomas S. Redick (Purdue University), and Randall W. Engle (Georgia Institute of Technology)
In a commentary accompanying this report, Jennifer McCabe, Thomas Redick, and Randall Engle offer alternatives to popular brain-training programs, identifying low-cost strategies for improving memory – strategies backed by decades of applied memory research.
Studies have shown that people remember information better when they encode it in several different ways and make meaningful connections to the material. “Elaboration” – creating imagery, mnemonic devices, and personal connections to the material to be remembered – enhances recall. “Testing” oneself, rather than merely rereading information, also supports longer-term learning and memory. And taking breaks between study sessions (a strategy known as “spacing”), rather than cramming studying into one marathon session, improves memory for studied information.
The findings presented by the commentary authors indicate that there are effective, research-grounded methods for improving memory. Unfortunately for the many people drawn to popular brain-training programs, this review shows that those programs generally fall short of their advertised effectiveness and that people may profit more from adopting better-supported alternatives for improving cognitive performance.
Author affiliations: Department of Psychology, University of Illinois at Urbana-Champaign; Department of Psychology, Florida State University; Institute for Successful Longevity, Florida State University; Medical Research Council Cognition and Brain Sciences Unit, Cambridge, UK; School of Clinical Medicine, University of Cambridge; Department of Psychology, Union College; Geisinger Health System, Danville, PA; Department of Psychology, Michigan State University; Department of Educational Psychology, University of Illinois at Urbana-Champaign; Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign