Scientists Propose Upgrades to Research-Methods Education for Psychology Students 

Many undergraduate psychology courses fail to ensure that students fully understand research design and analysis. An international team of psychological scientists has recommended systemic steps to remedy that shortcoming.

Researchers from the United Kingdom and Canada outline these recommendations in an article published in Advances in Methods and Practices in Psychological Science (AMPPS). Their recommendations are based on a survey of stakeholders, including instructors, undergraduate and graduate students, and nonacademic psychologists. The scientists, led by Robert Thibault of the Meta-Research Innovation Center at Stanford University, embarked on the study to help the British Psychological Society update its standards for accrediting psychology programs. But other accrediting bodies, as well as program directors and instructors, can draw on the findings to set standards for teaching research methods, they wrote.  

“Such initiatives could foster cohorts of graduates with an established set of competencies tuned for the contemporary world,” they concluded.  

The effort to upgrade instruction standards for research methods stems from the field's rising focus on rigor and its adoption of open-science practices. Research has shown that these advances are poorly reflected in psychology curricula, which have seen few updates over the past 2–3 decades. One study, for example, found that few courses cover effect sizes, confidence intervals, or alternatives to null-hypothesis significance testing, an approach whose shortcomings many scientists blame for the replication problems in psychological science.

“Taken together, the time is ripe to modernize the teaching of quantitative and qualitative research methods in psychology programs,” the authors said.  

For the project, Thibault and his collaborators used the Delphi technique—a structured method of eliciting and aggregating opinions. They collected anonymous responses from more than 100 stakeholders to determine the level of consensus around methods instruction. The participants, including individuals from more than 50 universities in the United Kingdom, were asked their opinions about specific content to teach as well as approaches to teaching it. The aim was to address the knowledge and skills gaps that lead to irreproducible research and to ensure graduates develop data skills that are useful in nonacademic careers. 

The recommendations for methods instruction are as follows: 

  • Require a strong understanding of data and quantitative data skills. 
  • Emphasize general skills in research design. 
  • Prioritize a foundation in descriptive statistics. 
  • Provide students with a framework for critically assessing research claims. 
  • Raise the prominence of qualitative methods in accreditation standards. 
  • Require that parameter-estimation techniques, such as confidence intervals and effect sizes, be taught alongside significance testing. 
  • Prioritize the teaching of foundational skills in research methods.  
  • Promote content that shows how research-methods skills can transfer beyond academia. 
  • Focus on fewer skills in greater depth and offer optional modules for advanced methods skills. 

Thibault and his team acknowledged limitations of their work, including sparse participation by students, nonacademic psychologists, and researchers who use qualitative methods. But they noted that the Delphi technique allowed them to gain a robust understanding of participants' opinions about instruction in research methods. 


Reference 

Thibault, R. T., Bailey-Rodriguez, D., Bartlett, J. E., Blazey, P., Green, R. J., Pownall, M., & Munafò, M. R. (2024). A Delphi study to strengthen research-methods training in undergraduate psychology programs. Advances in Methods and Practices in Psychological Science, 7(1). https://doi.org/10.1177/25152459231213808
