Brain-Training Claims Not Backed by Science, Report Shows

The companies behind many popular brain-training games and apps cite a variety of scientific studies as evidence that their products improve cognition in daily life. A research report puts those claims to the test, providing a comprehensive review of the studies cited by brain-training proponents and companies. While people may improve on the specific tasks they practice, the researchers conclude that there is no compelling scientific evidence that computerized brain-training programs yield broader cognitive benefits or improve real-world outcomes for their users.

The analysis and an independent commentary on the findings appear in Psychological Science in the Public Interest, a journal of the Association for Psychological Science.

“The idea behind ‘brain training’ is that if you practice a task that taps a core component of cognitive ability, like memory, the training will improve your ability to perform other tasks that also rely on memory, not just in the lab, but also in the world,” explained psychological scientist Daniel Simons of the University of Illinois, who led the analyses.

“If you practice remembering playing cards, you’ll get really good at remembering playing cards,” Simons said. “But does that help you remember which medications to take, and when? Does it help you remember your friends’ names? Historically, there is not much evidence that practicing one task improves different tasks in other contexts, even if they seem to rely on the same ability.”

Simons and colleagues Walter Boot and Neil Charness (Florida State University), Zachary Hambrick (Michigan State University), Christopher Chabris (Union College and Geisinger Health System), Susan Gathercole (Medical Research Council, Cambridge, UK), and Elizabeth Stine-Morrow (University of Illinois) closely examined 132 journal articles cited by a large group of brain-training proponents in support of their claims. The team supplemented that list with all of the published articles cited on the websites of leading brain-training companies that were identified by SharpBrains, an independent market-research firm that follows the industry.

The review found numerous problems with the way many of the cited studies were designed and how the evidence was reported and interpreted. The problems included small sample sizes and studies in which researchers reported only a handful of significant results from the many measures collected.
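
As a rough illustration of why reporting only a handful of significant results is a problem (this sketch is not from the report; the number of measures and the significance threshold are hypothetical), the following Python simulation shows how the chance of obtaining at least one "significant" outcome purely by chance grows when many measures are tested on an intervention that has no real effect:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.05            # conventional significance threshold
n_measures = 20         # hypothetical number of outcome measures collected
n_simulated_studies = 10_000

# Simulate studies in which the intervention truly has no effect:
# each measure's p-value is then uniformly distributed on [0, 1].
p_values = rng.uniform(size=(n_simulated_studies, n_measures))

# Fraction of these null studies that still yield at least one
# "significant" measure to report.
false_positive_rate = np.mean((p_values < alpha).any(axis=1))
print(f"P(>=1 significant measure | no real effect) ~ {false_positive_rate:.2f}")
# Analytically: 1 - (1 - 0.05)**20, roughly 0.64
```

Under these hypothetical numbers, about two out of three studies of a completely ineffective program would still produce at least one significant-looking result, which is why selective reporting can make weak evidence appear strong.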

“Sometimes the effects of a single brain-training intervention are described in many separate papers without any acknowledgment that the results are from the same study,” Simons said. “That gives the misleading impression that there is more evidence than actually exists, and it makes it hard to evaluate whether the study provided any evidence at all.”

Some studies conducted with special groups (such as people diagnosed with schizophrenia, children with language delays, or older adults with dementia) were used as support for broad claims about the benefits of brain training for the general population.

One of the most glaring problems in the cited research was the use of inadequate control groups as a baseline for measuring improvements. Ideally, participants in a control group do not engage in the intervention but are otherwise matched closely with those who do, the researchers said. Not only should the control group’s demographics (age, sex, race, income, and education) match those of the intervention group as closely as possible, but control-group participants should also be equally engaged, Simons said.

“A control group should experience everything the treatment group does, except for the critical ingredient of the treatment,” he said. “They should be equally engaged and should have similar expectations for improvement, so that if the treatment group improves more than the control group, the difference must be due to the treatment itself.”

Some of the studies had no control group. Some had a passive control group, whose members took the same pre- and post-tests as the intervention group but were not engaged in any other way. Some studies had participants in a control group come into the lab and do crossword puzzles, watch educational DVDs, or just socialize with the experimenters. Such control groups differ in many ways from the intervention group, so greater improvement in the treatment group might be due to those other differences, including differences in expected improvement, rather than to the brain-training intervention itself, the researchers said.
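
As a toy illustration of that concern (again not from the report; the group size, retest gain, and expectation effect below are all hypothetical), the Python sketch that follows simulates pre- and post-test scores in which everyone improves simply from retaking the test and the trained group gets an extra boost from expecting to improve. Against a passive control the training looks effective even though it does nothing in the simulation, while an equally engaged active control with the same expectations eliminates the apparent benefit:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50                       # participants per group (hypothetical)
retest_gain = 3.0            # everyone improves just from retaking the test
expectation_gain = 2.0       # placebo-like boost from expecting to improve

def noise():
    return rng.normal(0, 5, n)

pre_treat = rng.normal(100, 15, n)
pre_ctrl = rng.normal(100, 15, n)

# Treatment group: the training itself has no effect in this simulation;
# its gain comes entirely from retest practice and expectations.
post_treat = pre_treat + retest_gain + expectation_gain + noise()

# Passive control: takes the same pre/post tests but is otherwise not engaged.
post_ctrl_passive = pre_ctrl + retest_gain + noise()

# Active control: equally engaged, with the same expectation of improvement.
post_ctrl_active = pre_ctrl + retest_gain + expectation_gain + noise()

gain = lambda post, pre: (post - pre).mean()
print("Apparent benefit vs. passive control:",
      round(gain(post_treat, pre_treat) - gain(post_ctrl_passive, pre_ctrl), 2))
print("Apparent benefit vs. active control: ",
      round(gain(post_treat, pre_treat) - gain(post_ctrl_active, pre_ctrl), 2))
```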

Most of the cited research tested for improvements on simplified, abstract laboratory tasks rather than on measures of real-world performance.

“There are relatively few studies in this literature that objectively measure improvements on the sorts of real-world tasks that users of the programs presumably want to improve – and that the programs’ marketing materials emphasize,” Simons said.

“Based on our comprehensive review of the evidence cited by brain-training proponents and companies, we found little evidence for broad transfer from brain-training tasks to other tasks,” Simons said. “We hope future studies will adopt more rigorous methods and better control groups to assess possible benefits of brain training, but there is little evidence to date of real-world benefits from brain training.”

In a commentary accompanying the main report, researchers Jennifer A. McCabe (Goucher College), Thomas S. Redick (Purdue University), and Randall W. Engle (Georgia Institute of Technology) examine evidence for other interventions that may improve cognitive functioning. Although there is little evidence that skills practiced in brain-training games transfer to other real-world tasks, other learning strategies are backed by decades of scientific research. McCabe, Redick, and Engle highlight three techniques – elaborating on material, repeated testing, and spaced studying – as examples of evidence-based procedures for improving memory.

Comments

I often see reading described as a positive way of helping with our cognition.
Is there research on that?

