Convergence: Connecting Levels of Analysis in Psychological Science
In the past, our field harbored distinct, and often competing, schools of thought that tackled different problems and produced findings that often appeared to diverge. Today, investigators attack shared problems at complementary levels of analysis and produce results that converge. Studies of people in a social world; mental systems of cognition and emotion; and biological mechanisms of the genome and the nervous system interconnect and yield an integrated psychological science. The APS 23rd Annual Convention displays, and celebrates, these advances in our field.


False-Positives Are Frequent, Findable, and Fixable

Saturday, May 26, 2012, 2:30 PM - 3:50 PM

Chair: Joseph Simmons
University of Pennsylvania
Chair: Leif D. Nelson
University of California, Berkeley

Psychology has a big problem. It is too easy to “discover” and publish “evidence” for false effects. We present evidence that the problem exists (and that it’s probably worse than you think), explain its causes, offer a low-cost solution, and describe a new technique for identifying false literatures.

Measuring the Prevalence of Questionable Research Practices with Incentives for Truth-Telling
Leslie John
Harvard University
We surveyed psychologists about their involvement in questionable research practices, using an anonymous elicitation format and incentives for honest reporting. Although cases of clear scientific misconduct have received considerable press coverage, this research suggests that less flagrant transgressions may be more prevalent and, in the long run, more damaging to academia.

Co-Author: George Loewenstein, Carnegie Mellon University

Co-Author: Drazen Prelec, Massachusetts Institute of Technology

False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Anything to Be Presented as Significant
Joseph Simmons
University of Pennsylvania
Despite our field’s commitment to a low rate of false-positive findings (α = .05), flexibility in data collection, analysis, and reporting dramatically inflates the actual false-positive rate. We show how unacceptably easy it is to accumulate (and report) statistically significant evidence for a false hypothesis, and we propose a simple, low-cost solution to this problem.

Co-Author: Leif D. Nelson, University of California, Berkeley

Co-Author: Uri Simonsohn, University of Pennsylvania
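The inflation the abstract describes can be illustrated with a small simulation. This is not the authors' code; the specific "peeking" rule (test at n = 20, and if not significant, add 10 observations and test again) and all numbers are illustrative assumptions, using a simple z-test under a true null effect:

```python
import math
import random
from statistics import mean

random.seed(0)

def p_value(xs):
    """Two-sided z-test p-value for H0: mean = 0, with known sigma = 1."""
    z = mean(xs) * math.sqrt(len(xs))
    return math.erfc(abs(z) / math.sqrt(2))

def run_experiment(peek=False):
    """One experiment under the null (no true effect). Returns True if
    a 'significant' result is obtained."""
    xs = [random.gauss(0, 1) for _ in range(20)]
    if p_value(xs) < .05:
        return True
    if peek:
        # Undisclosed flexibility: collect 10 more observations
        # and test again if the first look misses significance.
        xs += [random.gauss(0, 1) for _ in range(10)]
        return p_value(xs) < .05
    return False

trials = 20000
fixed = sum(run_experiment(peek=False) for _ in range(trials)) / trials
flex = sum(run_experiment(peek=True) for _ in range(trials)) / trials
print(f"fixed-n false-positive rate:  {fixed:.3f}")  # ~ .05, as advertised
print(f"with one extra 'peek':        {flex:.3f}")   # noticeably above .05
```

Even this single, modest degree of freedom pushes the false-positive rate well above the nominal 5%; combining several such choices compounds the effect.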

The P-Curve: Uncovering False-Positive Findings in Published Research (It Is Easier Than We Thought)
Uri Simonsohn
University of Pennsylvania
We introduce a test for diagnosing whether a set of statistically significant findings is likely to consist of false positives. The test examines the distribution of the p-values underlying those findings. Among other things, we show that a well-known failure to replicate was predictable given the p-curve of its previous "statistically significant" demonstrations.

Co-Author: Leif D. Nelson, University of California, Berkeley

Co-Author: Joseph Simmons, University of Pennsylvania
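The intuition behind examining the distribution of significant p-values can be sketched with a simulation. This is not the authors' p-curve procedure, only an illustration of the underlying statistical fact it exploits; the effect sizes, sample size, and the below-.025 summary statistic are illustrative assumptions:

```python
import math
import random
from statistics import mean

random.seed(0)

def p_value(xs):
    """Two-sided z-test p-value for H0: mean = 0, with known sigma = 1."""
    z = mean(xs) * math.sqrt(len(xs))
    return math.erfc(abs(z) / math.sqrt(2))

def significant_ps(effect, n_obs=30, trials=40000):
    """Run many studies and keep only the significant p-values,
    mimicking a published literature filtered by p < .05."""
    ps = []
    for _ in range(trials):
        xs = [random.gauss(effect, 1) for _ in range(n_obs)]
        p = p_value(xs)
        if p < .05:
            ps.append(p)
    return ps

def share_below_025(ps):
    return sum(p < .025 for p in ps) / len(ps)

# With no true effect, p-values are uniform, so significant ones are
# flat on (0, .05): about half fall below .025. With a real effect,
# small p-values dominate and the curve is right-skewed.
null_share = share_below_025(significant_ps(effect=0.0))
effect_share = share_below_025(significant_ps(effect=0.5))
print(f"no true effect:  {null_share:.2f} of significant p-values below .025")
print(f"real effect:     {effect_share:.2f} of significant p-values below .025")
```

A literature of false positives thus leaves a flat p-curve as its fingerprint, whereas genuine effects leave a right-skewed one.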

Subject Area: Methodology
