The Lie Detector

Since the birth of scientific psychology some 130 years ago, psychologists have grappled with the best ways to collect and interpret data. And although the field has made incremental progress over the past century or so, APS Fellow & Charter Member Frank Schmidt, an industrial/organizational psychologist at the University of Iowa, believes there is much more room for improvement.

Schmidt described how data can easily mislead researchers during his James McKeen Cattell Award Address “How to Detect and Correct the Lies That Data Tell” at the APS 20th Annual Convention.

Schmidt is quite serious when he says that data lie. It’s not so much that the data are invalid as that they create an inaccurate façade that hides simple, parsimonious truths about human behavior.

“There is a naive cult of overconfident empiricism and an excessive faith in the value of data as a direct source of scientific truths,” said Schmidt, adding that, unfortunately, many fields have failed to acknowledge that their findings are being misinterpreted as more tortuous than they actually are.

He is especially perturbed about the old adage “let the data speak” that gets thrown around in laboratories. “The injunction to ‘just let the data speak’ is very naive and deceptive. Data can look you right in the eye and lie to you without blinking.”

Schmidt knows a thing or two about these lies. For some time, research in industrial/organizational psychology had produced fractured results suggesting that the predictive validity of personnel tests was idiosyncratic to person, time, and setting. But Schmidt’s work demonstrated that the conflicting findings about these measures are almost entirely due to statistical artifacts, most notably sampling and measurement error.
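
To see how sampling error alone can manufacture apparent conflict, consider a minimal simulation in Python. Every number here is invented for illustration, not drawn from Schmidt’s data: twenty small studies each estimate the same true validity of .35.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one true validity (rho = .35), estimated in
# twenty studies of 60 people each. All values are illustrative.
rho, n_studies, n = 0.35, 20, 60
cov = [[1.0, rho], [rho, 1.0]]

observed = []
for _ in range(n_studies):
    # Bivariate-normal (test score, job performance) pairs with the
    # same population correlation in every "study."
    x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    observed.append(np.corrcoef(x, y)[0, 1])

observed = np.array(observed)
print(f"true rho = {rho}")
print(f"observed r ranges from {observed.min():.2f} to {observed.max():.2f}")
# With n = 60, r must exceed roughly .25 to reach p < .05 two-tailed,
# so some of these identical studies will look like failed replications.
print(f"share reaching 'significance': {(observed > 0.25).mean():.0%}")
```

The studies are statistically identical, yet their observed correlations scatter widely enough that a reader tallying significant versus nonsignificant results would conclude the test “works” in some settings and not others.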

There are, of course, other ways that data can lie. Schmidt has helped develop statistical methods to correct for data errors (typos, coding errors, and so on), range restriction, the dichotomization of measures, and imperfect construct validity. He urges researchers who are unable to correct for some of the artifacts that potentially distort their findings to disclose this in their research reports.
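
Two of those corrections are simple enough to sketch. Below is a minimal Python illustration, with invented numbers rather than figures from Schmidt’s work, of Spearman’s correction for attenuation and Thorndike’s Case II correction for direct range restriction:

```python
import math

def disattenuate(r_xy, r_xx, r_yy):
    """Spearman's correction for attenuation: estimate the correlation
    between true scores, given the observed correlation and the two
    measures' reliabilities."""
    return r_xy / math.sqrt(r_xx * r_yy)

def correct_range_restriction(r, u):
    """Thorndike's Case II correction for direct range restriction,
    where u = restricted SD / unrestricted SD of the predictor."""
    U = 1.0 / u
    return U * r / math.sqrt((U ** 2 - 1.0) * r ** 2 + 1.0)

# Invented values, purely for illustration.
r_obs = 0.25   # observed validity in the selected (restricted) sample
u = 0.60       # incumbents' SD is 60% of the applicant pool's
r_yy = 0.52    # reliability of the supervisor performance ratings

r_unrestricted = correct_range_restriction(r_obs, u)
# Correct for criterion unreliability only (predictor reliability set
# to 1.0), as is conventional when estimating operational validity:
# the employer must use the observed test scores as they are.
r_operational = disattenuate(r_unrestricted, 1.0, r_yy)
print(f"{r_obs:.2f} -> {r_unrestricted:.2f} -> {r_operational:.2f}")
```

With these made-up inputs, an unimpressive observed validity of .25 becomes an estimated operational validity of about .55 once the two artifacts are removed.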

To avoid data pitfalls in individual studies, Schmidt advises researchers to use point estimates and confidence intervals rather than significance tests, and to apply as many of the appropriate artifact corrections as possible to individual correlations.
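
What that looks like in practice is a point estimate accompanied by an interval. Here is a short sketch for a correlation, using the standard Fisher z transformation (the r and n are invented):

```python
import math

def correlation_ci95(r, n):
    """95% confidence interval for a Pearson correlation via the
    Fisher z transformation."""
    z = math.atanh(r)                    # r -> Fisher z
    se = 1.0 / math.sqrt(n - 3)          # standard error of z
    lo, hi = z - 1.96 * se, z + 1.96 * se
    return math.tanh(lo), math.tanh(hi)  # back to the r scale

# Hypothetical single study: r = .30 with n = 60.
print(correlation_ci95(0.30, 60))  # roughly (0.05, 0.51)
```

An interval that wide says plainly what “p < .05” conceals: a single study of this size pins down the relationship only very loosely.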

When integrating results across studies, Schmidt advocates meta-analysis. However, the pervasive use of inappropriate meta-analysis models may be preventing a clear picture of research findings from emerging. In a study to be published in the British Journal of Mathematical and Statistical Psychology, Schmidt and colleagues examined 199 meta-analyses appearing in Psychological Bulletin from 1978 to 2008 and found that 79 percent used fixed-effects models rather than the more appropriate random-effects models. The combined result is that the strength of relationships is underestimated while the precision of those estimates is greatly overestimated.
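
The difference between the two models can be seen in miniature. A fixed-effects model assumes every study estimates one identical population effect and weights each study by the inverse of its sampling variance alone; a random-effects model adds an estimate of the between-study variance, yielding an honestly wider interval. A sketch with invented effect sizes, using the DerSimonian-Laird estimator for that between-study variance:

```python
import math

# Hypothetical effect sizes (e.g., correlations already converted to
# Fisher z) and their sampling variances, from five invented studies.
effects = [0.05, 0.40, 0.10, 0.55, 0.25]
variances = [0.02, 0.03, 0.02, 0.04, 0.03]

def pooled(effects, variances, tau2=0.0):
    """Inverse-variance pooled estimate and its standard error.
    tau2 = 0 gives the fixed-effects model; tau2 > 0, random effects."""
    w = [1.0 / (v + tau2) for v in variances]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return est, math.sqrt(1.0 / sum(w))

# Fixed effects: assume every study estimates one identical effect.
fe_est, fe_se = pooled(effects, variances)

# DerSimonian-Laird estimate of the between-study variance tau^2.
w = [1.0 / v for v in variances]
q = sum(wi * (e - fe_est) ** 2 for wi, e in zip(w, effects))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

re_est, re_se = pooled(effects, variances, tau2)
print(f"fixed effects:  {fe_est:.2f} +/- {1.96 * fe_se:.2f}")
print(f"random effects: {re_est:.2f} +/- {1.96 * re_se:.2f}")
```

With these heterogeneous toy studies, the random-effects interval comes out visibly wider; a fixed-effects analysis of the same data reports precisely the inflated certainty Schmidt warns against.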

“This isn’t just technical nitpicking,” said Schmidt. “These are big differences. They have important implications for our development of cumulative knowledge, our development of theories.”

Schmidt’s current research has focused on improving methods for correcting relationships for measurement error, which has led to what he describes as “surprising findings.” For example, certain constructs long thought to be conceptually distinct prove to be empirically indistinguishable; job satisfaction turns out to be the empirical equivalent of organizational commitment. Psychologists may believe there are characteristics unique to each of these variables, but “the people responding to the questionnaires don’t make that distinction,” according to Schmidt.
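
The reasoning behind that claim is the attenuation correction sketched earlier: once the observed correlation between two scales is corrected for the unreliability of both, it can approach 1.0, which is what “empirically indistinguishable” means in practice. With invented numbers:

```python
import math

# Invented illustration, not Schmidt's actual estimates: two scales
# correlate .75 as observed, with reliabilities of .80 and .75.
r_obs, r_xx, r_yy = 0.75, 0.80, 0.75

# Spearman's correction for attenuation: the estimated correlation
# between the underlying true scores.
r_true = r_obs / math.sqrt(r_xx * r_yy)
print(f"observed r = {r_obs:.2f}, corrected r = {r_true:.2f}")  # ~0.97
```

If the corrected correlation is indistinguishable from 1.0, the two questionnaires are, empirically, measuring the same thing.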

His new research is merely another step towards revealing “a picture of greater simplicity emerging from the appearance of complexity. That’s the goal in science, to find the simple, deep structure underlying the complex surface structure.”

The James McKeen Cattell Fellow Award recognizes APS Members for a lifetime of outstanding contributions to the area of applied psychological research. For more information about the award and a list of past recipients, see www.psychologicalscience.org/awards. ♦

