Psychological Science Submission Guidelines

Updated 4/15/22

Psychological Science welcomes the submission of papers presenting original research, theory, or applications on mind, brain, or behavior. Preference is given to papers that make a new and notable contribution—an idea, a discovery, a connection—to psychological science, broadly interpreted to include emerging as well as established areas of research (e.g., neuroeconomics versus psychophysics), across specialties of psychology and related fields, and that are written to be relevant for and intelligible to a wide range of readers.

Submission of Manuscripts

Manuscripts should be submitted electronically via the Psychological Science submission site. Before submitting your manuscript, please be sure to consult the Contributor FAQ.

Read the latest editorial policies from the APS Publications Committee.

Manuscript Review Process

Two-Tier Review

Two members of the editorial team read each submitted manuscript. One reader has interests and expertise in the relevant research area and offers a specialist’s opinion, whereas the other reader, who may be less knowledgeable in the subject matter, provides a generalist’s perspective. In most cases, the specialist reader is an Associate Editor, and the generalist reader is a Senior Editor or the Editor-in-Chief. In this initial review, manuscripts are anonymized as to authors and originating institutions. To facilitate this approach, authors will be asked to upload an anonymized version of the submission.

If either reader evaluates the paper as having a reasonable likelihood of ultimately being accepted for publication in the journal, then it is sent to two or more external referees for extended review. The Associate Editor usually oversees this process and writes the subsequent decision letter (accept, reject, or revise and resubmit). Alternatively, if both readers decide the paper is unlikely to be competitive for publication, then the paper is declined on initial editorial review.

Within 2 weeks of submission, authors are notified by e-mail that their manuscript either (a) has been declined on initial editorial review or (b) has been sent to outside experts for extended review. For manuscripts afforded extended review, authors can expect a decision within 60 days of manuscript submission. Manuscripts declined after either initial or extended review will not be reconsidered unless the responsible action editor has invited resubmission following revision (see Question 16 in the Contributor FAQ).

Upon submission, authors will be asked to identify a relevant editor whom they recommend to handle their submission. Authors are required to name a minimum of two possible objective reviewers when submitting a proposal or manuscript. These recommendations should exclude former mentors and teachers, current colleagues, and collaborators. Please keep in mind that the editor will consider these recommendations but cannot guarantee that they will be honored.

Please note: Psychological Science uses StatCheck, an R program written by Sacha Epskamp and Michele B. Nuijten that is designed to detect inconsistencies between different components of inferential statistics (e.g., t value, df, and p). StatCheck is not designed to detect fraud, but rather to identify typographical errors, which occur often in psychology. StatCheck is run only on manuscripts that are sent out for extended review and not immediately rejected after extended review. Authors are encouraged to run StatCheck themselves before submitting a manuscript, and authors of accepted manuscripts are required to provide a clean StatCheck report before the manuscript enters production (see below).
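StatCheck itself is an R package, but the kind of arithmetic consistency it checks is straightforward to illustrate. The Python sketch below recomputes the two-tailed p value implied by a reported t statistic and its degrees of freedom (via the standard incomplete-beta identity for the t distribution) and flags a reported p that disagrees at the reported rounding precision. The function names here are hypothetical, and this is a simplified illustration of the idea, not StatCheck's actual algorithm (StatCheck also handles F, r, chi-square, and z statistics, as well as one-tailed tests).

```python
import math

def _betacf(a, b, x, max_iter=200, eps=3e-12, tiny=1e-30):
    """Continued fraction for the regularized incomplete beta
    function, evaluated with the modified Lentz method."""
    qab, qap, qam = a + b, a + 1.0, a - 1.0
    c = 1.0
    d = 1.0 - qab * x / qap
    d = tiny if abs(d) < tiny else d
    d = 1.0 / d
    h = d
    for m in range(1, max_iter + 1):
        m2 = 2 * m
        # Even step of the continued fraction.
        aa = m * (b - m) * x / ((qam + m2) * (a + m2))
        d = 1.0 + aa * d
        d = tiny if abs(d) < tiny else d
        c = 1.0 + aa / c
        c = tiny if abs(c) < tiny else c
        d = 1.0 / d
        h *= d * c
        # Odd step.
        aa = -(a + m) * (qab + m) * x / ((a + m2) * (qap + m2))
        d = 1.0 + aa * d
        d = tiny if abs(d) < tiny else d
        c = 1.0 + aa / c
        c = tiny if abs(c) < tiny else c
        d = 1.0 / d
        delta = d * c
        h *= delta
        if abs(delta - 1.0) < eps:
            break
    return h

def _reg_inc_beta(a, b, x):
    """Regularized incomplete beta function I_x(a, b)."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    ln_front = (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
                + a * math.log(x) + b * math.log(1.0 - x))
    front = math.exp(ln_front)
    if x < (a + 1.0) / (a + b + 2.0):
        return front * _betacf(a, b, x) / a
    return 1.0 - front * _betacf(b, a, 1.0 - x) / b

def two_tailed_p_from_t(t, df):
    """Two-tailed p for a t statistic: p = I_{df/(df+t^2)}(df/2, 1/2)."""
    return _reg_inc_beta(df / 2.0, 0.5, df / (df + t * t))

def is_consistent(t, df, reported_p, decimals=3):
    """Accept a reported p only if it matches the recomputed p
    at the reported rounding precision."""
    return round(two_tailed_p_from_t(t, df), decimals) == round(reported_p, decimals)
```

For example, a report of "t(60) = 2.00, p = .05" passes `is_consistent(2.0, 60, .05)`, whereas a typo such as "p = .02" with the same t and df would be flagged as inconsistent.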

Criteria for Acceptance for Publication in Psychological Science

The main criteria for publication in Psychological Science are general theoretical and empirical significance and methodological/statistical rigor.

  • “General” because to warrant publication in this journal a manuscript must be of general interest to psychological scientists. Research that is likely to be read and understood only by specialists in a subdomain is better suited for a more specialized journal.
  • “Theoretical and empirical significance” because research published in Psychological Science should be strongly empirically grounded and should make a difference in the way psychologists and scholars in related disciplines think about important issues. Work that is purely descriptive or that only modestly extends knowledge of firmly established phenomena can be valuable but is unlikely to meet criteria for acceptance in this highly selective journal.
  • “Methodological/statistical rigor” because replicability is a foundational value of science. Replicability is not the only consideration, but it is an important one.  Science, like the rest of life, is full of trade-offs, and the editors at Psychological Science appreciate that it is more difficult to attain high levels of precision and replicability in some important areas of psychology than others.  Nonetheless, to succeed, submissions must be as rigorous as is practically and ethically feasible, and must also be frank in addressing limits on their precision and generality (see Simons, Shoda, & Lindsay, 2017).

The journal aims to publish works that meet these three criteria in a wide range of substantive areas of psychological science. Historically, cognitive and social psychology have been dominant in this journal, and research participants often are from a restricted range of the world’s population. Moreover, the majority of articles published in the journal are authored by scientists from the United States. The editors are encouraging of submissions from a broader span of areas within psychological science, including, for example, biological psychology, cognitive and affective neuroscience, communication and language, comparative, cross-cultural, developmental, gender and sexuality, and health (and this is not intended as a comprehensive list). The editors also are encouraging of submissions of work with populations beyond the WEIRD world (Western, educated, and from industrialized, rich, and democratic countries), as well as of submissions that take psychological science into “the wild”—the natural contexts in which we live. The editors also are eager to receive submissions of work conducted by psychological scientists from around the world. Submissions centered on clinical science that meet the criteria outlined above will be considered, but many clinically oriented manuscripts are likely to be of primary interest to clinicians and hence are more appropriate for Clinical Psychological Science. Similarly, works with a primary focus on methods and research practices are generally better suited for Advances in Methods & Practices in Psychological Science, yet the editors are open to considering methodological manuscripts of extraordinary generality and importance.

Note that “theoretical significance” differs from “surprising novelty.” Indeed, surprising effects are subject to particularly careful scrutiny. A direct replication that yields compelling evidence for (or against) a theoretically important but empirically uncertain phenomenon may well meet the “theoretical significance” criterion. (For ways to assess evidence for the null hypothesis, see, for example, Masson, 2011; Harms & Lakens, 2018; and JASP.)

Transparency and Open Science

A critical component of scientific publications is to provide information sufficient for other researchers to replicate and expand upon authors’ published claims. To facilitate these activities, we strongly encourage Psychological Science authors to make preregistrations, materials, data, code/analysis scripts, and associated protocols available to readers. The preferred methods are to make these research components publicly available via a qualified third-party repository (e.g., OSF, ResearchBox, Figshare, Zenodo, etc.) or to include them as supplemental material. If these methods are not used, then upon request by readers, authors are required to make these research components available promptly and without undue qualifications. In some cases, some or all data or materials cannot be shared for legal, ethical, or other reasons. Any restrictions on the availability of materials or information should be outlined in the manuscript itself, in the Open Practices Statement. Readers who encounter refusal by the authors to comply with these policies should contact the Editor in Chief of the journal. In cases that cannot be resolved in a satisfactory manner, the journal may refer the matter to the authors’ funding institution and/or publish a formal statement of correction or expression of concern, attached online to the publication, stating that readers have been unable to obtain necessary materials to replicate the findings.

Information on the availability of and access to reported data, analytic methods/code, and materials, and whether the study (or studies) were preregistered, will be included in the Open Practices Statement.

Journals of the Association for Psychological Science (APS)

Psychological Science does not compete with other journals of APS, including Advances in Methods and Practices in Psychological Science, Clinical Psychological Science, Current Directions in Psychological Science, Perspectives on Psychological Science, and Psychological Science in the Public Interest. The journals vary in terms of domain and manuscript formats. Manuscripts rejected by another APS journal on the grounds of quality (e.g., flaws in methodology, data, or concept) are not eligible for consideration by Psychological Science.

Preparing Your Manuscript

Article Types

See also Table 1 below.

Research Article. Most of the articles published in Psychological Science are Research Articles. Research Articles make novel empirical and theoretical contributions that propel psychological science in substantial and significant ways. Research Articles whose primary contribution is a meta-analytic treatment of established phenomena are likely more appropriate for specialty journals. The descriptions and word limits of the sections of Research Articles can be found below.

Abstract and Statement of Relevance: All Research Articles must include a 150-word abstract that identifies the participant population on which the research was conducted. The abstract does not count toward the word limit. Immediately following the Abstract, authors also must include a 150-word Statement of Relevance that explains why the research reported in the submission is of interest and significance beyond the specific sub-area in which it is situated and, ideally, to the public at large. The Statement of Relevance does not count toward the word limit. The aim of the Statement of Relevance is to broaden the impact of the science reported in the journal and make it easier for interested readers to appreciate and understand our efforts. It should make clear why the questions that motivated the study and the findings that bear on them matter beyond psychology laboratories and college and university campuses. What is requested is a description of the sort that might open a conversation with a journalist, explain the work to a friend or family member, or introduce a student to the field of inquiry. In other words, a Statement of Relevance is not a technical abstract but instead, a description that makes the work accessible beyond the professional academe.

Introduction, Discussion, Footnotes, Acknowledgments, and Appendices: These sections may contain no more than 2,000 words combined. Authors are encouraged to be concise and focused in the Introduction and Discussion sections to keep them as brief as possible while also establishing the significance of the work. This word limit does not include the Abstract, Statement of Relevance, Method and Results sections (except footnotes), cover page, or reference list. In the Discussion (or General Discussion), authors should explicitly consider limits on the generalizability of their findings.

Method and Results: These sections of Research Articles do not count toward the total word limit. The aim of unrestricted length for Method and Results sections is to allow authors to provide clear, complete, self-contained descriptions of their studies. But as much as Psychological Science prizes narrative clarity and completeness, so too does it value concision. In almost all cases, an adequate account of method and results can be achieved in 2,500 or fewer words for Research Articles. Methodological minutiae and fine-grained details on the Results—the sorts of information that only “insiders” would relish and require for purposes of replication—should be placed in Supplemental Online Materials-Reviewed, not in the main text. Authors should include in their Method sections (a) justification for the sample(s) selected for the study (if the sample is of convenience, this should be explicitly noted); (b) the total number of excluded observations and the reasons for making the exclusions (if any); and (c) an explanation as to why the sample size is considered reasonable, supported by a formal power analysis, if appropriate. Authors also should include confirmation in their Method section that the research meets relevant ethical guidelines, including adherence to the legal requirements of the study country.

Many Research Articles contain two or more studies. Such submissions may include “interim” introductions and discussions that bracket the studies, in addition to an opening “general” introduction and a closing “general” discussion. Authors who opt for this sort of organization should bear in mind that the aforementioned word limits on introductory and Discussion sections include both interim and general varieties. Any combined “Results and Discussion” sections will be counted toward the word limit.

Within reasonable limits, narrative material that belongs in the Introduction or Discussion section should not be placed in the Method or Results section. Thus, for example, authors may include a few sentences to place their findings in context when they are presented in the Results section. However, excessive packing of a Method or Results section with material appropriate to the Introduction or Discussion will trigger immediate revision or rejection of the manuscript. Hybrid “Method & Results” sections are disallowed for any type of submission.

References: Authors are encouraged to cite only the sources that bear on the point directly, and to refrain from extensive parenthetical lists of related materials, keeping in mind that citations are meant to be supportive and not exhaustive. As a general rule, 40 citations should be sufficient for most Research Articles. However, this is not a hard-and-fast limit, and editors have the flexibility to allow more references if they are necessary to establish the scientific foundation for the work.

Short Report. As of May 15, 2020, Psychological Science is no longer considering Short Reports for publication. Manuscripts that previously would have been submitted in this category should now be submitted as Research Articles.

Preregistered Direct Replication. Preregistered Direct Replications (PDRs) report high-quality, preregistered, direct replications of studies published in Psychological Science. (See former Editor-in-Chief Steve Lindsay’s 2017 editorial.) Authors are not permitted to submit PDRs of articles on which they are an author or co-author. Authors of a PDR must make a convincing case that the replication will make a valuable contribution to understanding a phenomenon and/or theory of broad current interest to psychological scientists. Authors also must preregister their protocol for the replication study to be considered for publication. Preregistration may be on the Open Science Framework or another recognized repository. As with all Psychological Science submissions, the primary criterion is general theoretical and empirical significance. Direct replications should reproduce the original methods and procedures as closely as possible, with the goal of measuring the same effect under essentially (but not necessarily superficially) the same conditions as in the original study. Moreover, it is likely that the theoretical and empirical importance of the work will be increased by additional manipulations or conditions—beyond those in the original study—to further elucidate the phenomenon under investigation.

Researchers undertaking a PDR project are encouraged to consult with the author or authors of the original article and to submit a proposal to the Editor-in-Chief. They also are encouraged to consult the document Guidelines for Preregistered Direct Replications in Psychological Science. As per the procedure for Registered Reports, researchers are required to make a Stage 1 submission featuring an Introduction and Methods prior to data collection. Stage 1 submission is an essential step: PDR submissions that have not been through Stage 1 review prior to data collection will not be considered. Stage 1 submissions that are judged by the editorial board to be of sufficient quality and within journal scope will be sent for extended peer review. In considering Stage 1 submissions, reviewers will be asked to assess the importance of the research question(s); the logic, rationale, and plausibility of the proposed hypotheses; and the soundness and fidelity of the methodology and analysis plan. Assuming provisional acceptance of the PDR protocol (Stage 1 acceptance), authors are required to register their approved protocol on the Open Science Framework or another recognized repository (publicly or under a private embargo that is retained until final acceptance of the Stage 2 manuscript). Authors may use any recognized registry, but the accepted protocol can be registered easily using a tailored mechanism for Registered Reports on the Open Science Framework. Following registration, authors of PDRs will conduct the study as planned. Once the study is complete, authors prepare and resubmit their manuscript for full review (Stage 2 submission/review).

Commentary. Commentaries use new data or new analyses of existing data to respond to and/or supplement articles previously published in Psychological Science. Although they may target an article published at any time, the Commentary mechanism is most effective when used to address a contemporary publication. The major criteria for a Commentary are that it provides a new empirical perspective through new data or new analyses of existing data and that the new empirical perspective has general theoretical significance. Before submitting a Commentary, authors are required to contact the Editor in Chief with a proposal. Except under special circumstances, for any given article, Psychological Science will consider only a single Commentary. No author may contribute to more than one Commentary on the same target article. Authors are not permitted to write Commentaries on articles on which they are an author or coauthor. Commentaries are limited to 1,000 words (includes main text, notes, acknowledgments, and appendices; does not include 150-word abstract, 150-word Statement of Relevance [see Research Article], cover page, Method and Results sections, or reference list), 20 references, and one figure (no more than two panels) or one table.

References: Authors are encouraged to cite only the sources that bear on the point directly and to refrain from extensive parenthetical lists of related materials, keeping in mind that citations are meant to be supportive and not exhaustive. As a general rule, 20 citations should be sufficient for most Commentaries. However, this is not a hard-and-fast limit, and editors have the flexibility to allow more references if they are necessary to establish the scientific foundation for the work.

The Action Editor typically solicits a signed review of a submitted Commentary from the lead author of the target article, in addition to reviews by two (or more) independent experts. On acceptance of a Commentary, the Action Editor typically will invite the lead author of the target article to submit a Reply to Commentary (see below).

Reply to Commentary. Replies to Commentaries allow authors of articles that are targets of commentaries an opportunity to formally respond. Replies to Commentaries are by invitation only, and like the commentaries that initiated them, they are subject to external review (including by the author of the Commentary) and their acceptance is not assured. The major criterion for a Reply to Commentary is that it makes a unique scientific contribution that has general theoretical significance. The author of a target article who primarily wants to acknowledge a Commentary and/or make a relatively circumscribed observation about it might consider doing so through the Letters to the Editors mechanism (see below) rather than through a formal Reply to Commentary. Replies to Commentaries are limited to 1,000 words (includes main text, notes, acknowledgments, and appendices; does not include 150-word abstract, 150-word Statement of Relevance [see Research Article], cover page, Method and Results sections, or reference list), 20 references, and one figure (no more than two panels) or one table.

References: Authors are encouraged to cite only the sources that bear on the point directly and to refrain from extensive parenthetical lists of related materials, keeping in mind that citations are meant to be supportive and not exhaustive. As a general rule, 20 citations should be sufficient for most Replies. However, this is not a hard-and-fast limit, and editors have the flexibility to allow more references if they are necessary to establish the scientific foundation for the work.

Letters to the Editors. Letters to the Editors provide a less formal forum (relative to a Commentary) for authors to make a comment, express a concern, develop an alternative interpretation, report the outcome of a test for replication, or make some other contribution to scientific exchange over an article published in Psychological Science (see Editor-in-Chief Patricia Bauer’s 2021 Introductory Editorial for more information). Letters to the Editors must be in response to—and make specific reference to—an article published in Psychological Science. They typically should be submitted within 1 year of print publication of the target article, in order to keep the scientific conversation fresh. (An exception to the 1-year time frame is the case of a report of a test for replication, which is likely to require more than 1 year to accomplish.) Although there is not an a priori limit on the number of Letters to the Editors on any given target article, Letters to the Editors that make essentially the same point as another will not be considered, and the editors may limit the total number of Letters to the Editors on any given target article. Except with permission of the editors, no author may contribute (or make authored contribution to) more than one letter on the same target article. Authors are not permitted to write letters on articles on which they are an author or coauthor. Letters to the Editors are limited to 500 words and 10 references (though references are not required). Letters to the Editors will not have an Abstract, Statement of Relevance, headings, or subheadings; tables and figures are not permitted (authors may provide a link to a repository that provides additional information or data).

In a departure from the typical approach to review of submissions, Letters to the Editors will be reviewed by two members of the editorial team (Editor in Chief, Senior Editors, Associate Editors). Solicitation of external review will be the exception rather than the rule. The major review criterion will be furthering scientific exchange. Authors of articles that are the subject of a Letter to the Editors may reply with a letter of their own, which will undergo the same editorial process.

Letters to the Editors will be disseminated online only to permit more rapid publication and to keep the discussion responsive and timely. They will not be copyedited. Letters to the Editors will have DOIs, but they will not be indexed (i.e., not discoverable through PubMed, PsycInfo, etc.). Letters to the Editors will be linked with the target article to which they refer. To facilitate linkage, the title of each publication in this format will begin “Letter to the Editors of Psychological Science,” followed by a brief original title, followed by “Regarding [insert citation of the target article]” (the title will not count against the 500-word limit).

Table 1. Limits for Psychological Science Articles by Type

| Article type | Word limit | Reference limit* | Figure and/or table limit | Counts toward the word limit | Does not count toward the word limit |
|---|---|---|---|---|---|
| Research Article | 2,000 | 40 | No fixed limit | Introduction, Discussion, notes, acknowledgments, appendices | Method, Results, cover page, Abstract, Statement of Relevance, references |
| Preregistered Direct Replication | 2,000 | 40 | No fixed limit | Introduction, Discussion, notes, acknowledgments, appendices | Method, Results, cover page, Abstract, Statement of Relevance, references |
| Commentary or Reply to Commentary | 1,000 | 20 | One figure (no more than two panels) or one table | Main text, notes, acknowledgments, appendices | Method**, Results**, cover page, Abstract, Statement of Relevance, references |
| Letter to the Editors | 500 | 10 | Tables and figures not permitted | Main text only | See note below |

Letters feature main text only; notes, acknowledgments, tables, figures, and appendices are not permitted. There is no cover page, abstract, or statement of relevance.

*These are not hard-and-fast limits, and editors have the flexibility to allow more references if they are necessary to establish the scientific foundation for the work.

**For Commentaries reporting new data, Method and Results sections are not included in the word count.

Manuscript Style, Structure, and Content

Manuscripts published in Psychological Science must follow the style of the Publication Manual of the American Psychological Association, 7th edition, with respect to handling of the order of manuscript sections, headings and subheadings, references, abbreviations, and symbols. Please embed tables and figures within the main text. For initial submissions, authors may deviate from some of the style requirements (e.g., heading and subheading style, reference format, location of tables and figures). However, invited revisions and final versions of manuscripts must follow APA style. For all article types, for initial review, manuscripts must be anonymized as to authors and originating institutions. To facilitate this approach, authors will be asked to upload an anonymized version of the submission.

Further guidance can be found on our Manuscript Structure, Style, and Content Guidelines page.

File Types

You may upload your manuscript and ancillary files as Word .doc or .docx, as .rtf, as .pdf, or as .tex. If you submit a .tex file, please also submit a PDF file conversion, as the submission portal cannot render .tex files in the PDF proof.

Citation Standards

All data, program code, and other methods must be appropriately cited in accordance with the Guidelines for Transparency and Openness Promotion (TOP) in Journal Policies and Practices. Such materials should be recognized as original intellectual contributions and afforded recognition through citation.

  • All data sets and corresponding analysis code used in a publication must be cited in the text and listed in the reference section.
  • References for data sets and program code must include a digital object identifier (DOI). Persistent identifiers ensure future access to unique published digital objects, such as a text or data set. Persistent identifiers are assigned to data sets by digital archives, such as institutional repositories and partners in the Data Preservation Alliance for the Social Sciences (Data-PASS).

Data set citation examples:

Campbell, A., & Kahn, R. L. (2015). ANES 1948 Time Series Study (ICPSR 7218, Version V4) [Data set]. ICPSR.

Kidwell, M. C., Lazarevic, L., Baranski, E., Hardwicke, T. E., Piechowski, S., Falkenberg, L.-S., Kennett, C., Slowik, A., Sonnleitner, C., Hess-Holden, C. L., Errington, T. M., Fiedler, S., & Nosek, B. A. (2016, August 18). Badges to acknowledge open practices: A simple, low cost, effective method for increasing transparency. OSF.

Design and Analysis Transparency

Authors are expected to follow appropriate standards for disclosing key aspects of the research design and data-analysis plan. Authors are required to review the standards available for many research applications and use those that are relevant for the reported research. At manuscript submission, authors must confirm that they reviewed the standards, report whether any standards were relevant for the research application, and confirm that they followed those standards in the manuscript.

Open Practices

Preregistration of Studies and Design and Analysis Plan

To promote replicability and transparency, Psychological Science strongly encourages authors to preregister their studies and, when possible, to do so before data collection (see Lindsay, Simons, & Lilienfeld, 2017; Veldkamp et al., 2018). The preregistration must include the data-analysis plan. As one option, OSF provides a step-by-step workflow for preregistering a research project. During submission, and following TOP Guidelines, authors must indicate whether they have preregistered the research in an independent institutional registry. If a preregistration exists, authors must include information about how to access the time-stamped preregistration in their Open Practices Statement. Starting in 2023, Psychological Science will verify that these plans were carried out, or that changes were clearly disclosed. For more information, see Editor-in-Chief Patricia Bauer’s February 2022 editorial, “Psychological Science Stepping Up a Level.”

Data/Code/Materials Transparency

Authors are strongly encouraged to make their data, analysis/code, and materials available to reviewers during the peer review process and, should the article be accepted, publicly accessible to readers. Prospective submitters of manuscripts are encouraged to consult the resources below for information on writing transparent research reports. Specifically, authors using original data are strongly encouraged to complete the following steps:

  • Make the data available at a permanent third-party digital repository, such as OSF, Zenodo, or ResearchBox. (If all data required to reproduce the reported analyses appear in the article text, tables, and figures, then they do not also need to be posted to a repository.)
  • Include all variables, treatment conditions, and observations described in the manuscript.
  • Provide a full account of the procedures used to collect, preprocess, clean, or generate the data described in the manuscript.
  • Provide program code, scripts, codebooks, and other documentation sufficient to precisely reproduce all published results.
  • Provide research materials and description of procedures necessary to conduct an independent replication of the research.

Authors using data available from public repositories (e.g., NIMH Data Archive, UK Data Service, SHARE data archive) are encouraged to provide program code, scripts for statistical packages, and other documentation sufficient to allow an informed researcher to precisely reproduce all published results.

In some cases, some or all data or materials cannot be shared for legal, ethical, or other reasons, including the following:

  • Third-party data/materials/code over which the authors do not have control or rights to share
  • Proprietary data/materials/code not owned by the authors
  • Exercise of right of first use when some or all of the data will be used again by the authors
  • Ethical, legal, or institutional review board (IRB; or similar body) restrictions because of sensitive data and/or participant privacy concerns
  • Ethical, legal, or IRB restrictions not related to confidentiality concerns
  • Threats to scientific validity
  • Materials widely or publicly available

Authors who are unable to share their data/code and/or materials because of one of the above reasons or for another valid reason may still qualify to receive an Open Science Badge.

Open Practices Statement

Psychological Science encourages researchers to preregister their research plans and (to the extent practically feasible and ethically, legally, and scientifically appropriate) to make their data and materials available to other scientists (for rationales, see Lindsay, 2015, 2017, and Lindsay, Simons, & Lilienfeld, 2016). As part of that encouragement, each manuscript reporting new empirical work must include an Open Practices Statement that indicates whether the study (or studies) reported were preregistered and whether the data, analysis/code, and/or materials are available on a permanent third-party archive (or are included in Supplemental Materials). This statement should be a separate section, inserted at the end of the Introduction section before the Method section, and it does not count against the word limits. If data and/or materials are not available on a permanent third-party archive or included in Supplemental Materials, authors are required to inform readers how they may access study-related materials and data, or of restrictions thereon. These statements should be direct and to the point. Authors are not required to offer explanations for the restrictions, but they may do so if they choose. Commentaries that do not report new data do not need to include an Open Practices Statement.

Statement template:

“The data for [study] are/are not publicly accessible [at ADDRESS]. The code for [study] is/is not publicly accessible [at ADDRESS]. The materials are/are not publicly accessible [at ADDRESS]. There is/is not a preregistration for [study] here [ADDRESS].”

All of the following are examples of acceptable statements:

“Neither of the studies reported in this article was preregistered. The data have not been made available on a permanent third-party archive because our institutional review board ruled that we could not post the data; requests for the data can be sent to the corresponding author. The complete questionnaires are included in the Supplemental Material associated with this article.”

“Experiment 1 was not preregistered; the preregistration for Experiment 2 can be accessed at [ADDRESS]. Deidentified data for both experiments along with a codebook and the data-analysis scripts are posted at [ADDRESS]; access to the data is limited to qualified researchers. The materials used in these studies are widely available.”

“We analyzed archival data that are not under our direct control; requests to access the data should be directed to the relevant archive. Our complete analysis scripts and code book have been posted at [ADDRESS].”

Questions, concerns, and suggestions regarding this Open Practices Statement can be directed to the journal's editorial office.

Open Science Resources

Authors new to practicing open science may find the following free resources useful:



Preparation of Graphics

The journal requires that for accepted manuscripts, figures be embedded within the main document near where they are discussed in the text. A figure's caption should be placed in the text just below the figure. For initial submissions, tables and figures may be placed at the end of the manuscript.

Authors who are submitting revisions should also upload separate figure files that adhere to the APS Figure Format and Style Guidelines. Because the submission should be anonymized, files must not contain an author’s name. Submitting separate, production-quality files helps to facilitate timely publication should the manuscript ultimately be accepted.



Psychological Science recommends the use of the “new statistics”—effect sizes, confidence intervals, and meta-analysis—to avoid problems associated with null-hypothesis significance testing (NHST). Authors are encouraged to consult this Psychological Science tutorial by Geoff Cumming, which argues that estimation and meta-analysis are more informative than NHST and that they foster development of a cumulative, quantitative discipline. Cumming has also prepared a video workshop on the new statistics that can be found here.

Authors must include effect sizes for their major results and distributional information in their tables and graphs. Fine-grained graphical presentations that show how data are distributed are often the most transparent way of communicating results. Please report 95% confidence intervals instead of standard deviations or standard errors around mean dependent variables, because confidence intervals convey more useful information—another point discussed in Cumming’s tutorial.
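
As an illustration of the kind of summary statistics the journal asks for, the sketch below computes Cohen's d and a 95% confidence interval for a mean using only the Python standard library. This is our own minimal sketch, not journal tooling; the normal-based interval is an approximation that is reasonable for large samples (for small samples, a t-based critical value should be used instead).

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def cohens_d(group_a, group_b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / sqrt(pooled_var)

def ci95_mean(xs):
    """Normal-approximation 95% CI for a mean (use a t critical value for small n)."""
    z = NormalDist().inv_cdf(0.975)              # ~1.96
    half_width = z * stdev(xs) / sqrt(len(xs))
    m = mean(xs)
    return m - half_width, m + half_width
```

For example, cohens_d([1, 2, 3, 4, 5], [3, 4, 5, 6, 7]) returns roughly -1.26, a large effect in Cohen's terms.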

Reporting Statistical Results

The abstract should include information about the sample size(s) in studies reported in the manuscript. Please report test statistics with two decimal places (e.g., t(34) = 5.67) and probability values with three decimal places. In addition, exact p values should be reported for all results greater than .001; p values below this range should be described as “p < .001.” Authors should be particularly attentive to APA style when typing statistical details (e.g., Ns for chi-square tests, formatting of dfs), and if special mathematical expressions are required, they should be inserted with Word’s Equation Editor or a similar tool rather than as graphic objects.
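
The reporting rules above can be captured in a small formatting helper. This is a hypothetical convenience function of our own, not part of any journal tooling: it rounds the test statistic to two decimal places, reports exact p values to three decimal places (dropping the leading zero, per APA style), and switches to “p < .001” below that threshold.

```python
def format_result(stat_name, df, stat, p):
    """Format a test statistic and p value in APA style, e.g. 't(34) = 5.67, p < .001'."""
    if p < 0.001:
        p_text = "p < .001"
    else:
        # Three decimal places, with the leading zero dropped (APA style).
        p_text = "p = " + f"{p:.3f}".lstrip("0")
    return f"{stat_name}({df}) = {stat:.2f}, {p_text}"
```

For example, format_result("t", 34, 5.6712, 0.00004) yields "t(34) = 5.67, p < .001".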


Guidelines for Reporting fMRI Data

Studies involving fMRI or other neuroimaging methods typically entail larger numbers of measures than those found in behavioral research. Analyses of neuroimaging data have their own well-developed statistical frameworks and reporting standards. We recognize that the focus of fMRI analyses is often not on effect sizes, but rather on (a) statistical reliability and (b) replicability. We refer authors to Poldrack et al. (2008) for a useful list of reporting guidelines for such analyses.

We also recognize that adherence to the kind of complete reporting suggested by Poldrack et al. (2008) may require moving some of this material to Supplemental Online Materials (SOM). Most of the items listed by Poldrack et al. (2008) in their Appendix A can be succinctly and completely described in the main text. Exceptions that should be put into SOM are the sections of the appendix labeled “Intersubject registration” and “Statistical modeling” (if nonstandard).

For region of interest (ROI)-based analyses, the process of ROI selection should be clearly stated in the main text; in particular, it should be noted whether the ROI was selected prior to any analyses of the data. Reports of ROI analyses should include effect sizes. Any ROI-based analyses should be supplemented by whole-brain analyses. A complete table of activation coordinates, together with their statistics, should be provided; however, such tables should typically be put in SOM.

The strongest submissions will address both reliability and replicability and will report results for appropriately large samples and/or a replication study. For instance, a study concerning correlations of brain activation with behavioral measures may typically require a sample in excess of 100 subjects, although we recognize that sample size will vary widely depending on the details of the study. Power analyses are recommended (see Mumford, 2012).

Psychological Science will place emphasis on those functional neuroimaging studies that make a clear and compelling contribution to understanding psychological mechanisms, above and beyond a purely neuroanatomical contribution. Therefore, authors should carefully support reverse-inference statements in their Results and Discussion sections. Such support could come from strong prior results in the literature, as well as from separate meta-analyses (see Poldrack, 2011).


Candidate Gene Research

The editors of Behavior Genetics have established perceptive policies regarding candidate gene association and Candidate Gene × Environment interaction studies of complex traits (Hewitt, 2012). Submissions to Psychological Science that report similar candidate-gene studies are expected to follow those policies.


Research Disclosure Statements

Submitting authors must declare that they have disclosed (a) all of the dependent variables or measures collected, (b) all of the conditions/groups/predictors tested for each study reported in the submitted manuscript, and (c) any data exclusions (subjects or observations). The Disclosure Statement section looks like this:

For all studies reported in your manuscript, check the boxes below to confirm that:

  • All dependent variables or measures that were analyzed for this article’s target research question have been reported in the Method section(s)
  • All levels of all independent variables or all predictors or manipulations, whether successful or failed, have been reported in the Method section(s)
  • The total number of excluded observations and the reasons for making those exclusions (if any) have been reported in the Method section(s)

Sample Composition and Size

Submitting authors are asked to (a) identify the participant population in the abstract and (b) explain, in the Method section, the basis(es) for the composition of their samples (whether the sample was selected for specific theoretical or conceptual reasons, is a sample of convenience, etc.). In the Discussion (or General Discussion), authors are asked to consider explicitly the limits on the generalizability of their findings.

Authors also should explain the basis(es) for the sample sizes in the studies included in the submission. Bakker et al. (2016) reported evidence that many research psychologists have faulty intuitions regarding statistical power. For many years it was standard practice to conduct studies with low statistical power and submit for publication those studies that obtained statistically significant results (Cohen, 1969). Such practices lead to exaggerated estimates of effect size; indeed, when statistical power is very low, only results that exaggerate the true size of an effect can be statistically significant. Therefore, it is typically not appropriate to base sample size solely on the sample sizes and/or effect sizes reported in prior research or on the results of small pilot studies (see, e.g., Gelman & Carlin, 2014).

There is no single right answer to the question of sample size, but authors must explain in the manuscript the basis(es) for their determination that the sample size is appropriate. If an estimate of the size of an effect is given, the unit of measurement (e.g., Cohen’s d) must be specified, along with some rationale for assuming that the estimate is sound. If the study tests more than one effect, authors must make clear which of those effects their power analysis was based on.
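
As a starting point for a sample-size justification, the sketch below computes an a priori per-group sample size for a two-sided, two-sample comparison of means using the standard normal approximation. All names here are our own; the approximation slightly underestimates the t-test requirement, so treat the result as a floor rather than a final answer.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Per-group n for a two-sided, two-sample comparison of means,
    via the normal approximation: n = 2 * ((z_{alpha/2} + z_power) / d) ** 2."""
    z = NormalDist().inv_cdf
    return ceil(2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2)
```

For a medium effect (d = 0.5) at 80% power this gives 63 per group; exact t-based calculations give a slightly larger number, which is why the normal approximation should be treated as a lower bound.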

Submitters are also asked if they conducted preliminary analyses on the data and decided whether or not to collect additional data based on the outcome of those analyses. That practice, known as “optional stopping,” inflates the risk of making a Type I error (see Simmons, Nelson, & Simonsohn, 2011).
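
The inflation caused by optional stopping is easy to demonstrate by simulation. The sketch below is our own illustration (not from Simmons et al.): it draws data from a null distribution, “peeks” with a z test after every 10 observations, and stops at the first significant result. The false-positive rate climbs well above the nominal 5%.

```python
import random
from math import erfc, sqrt

def optional_stopping_fp_rate(n_sims=2000, max_n=100, peek_every=10, seed=1):
    """Proportion of null simulations declared 'significant' at any interim look."""
    rng = random.Random(seed)
    false_positives = 0
    for _ in range(n_sims):
        xs = []
        for _ in range(max_n // peek_every):
            xs.extend(rng.gauss(0, 1) for _ in range(peek_every))
            z = abs(sum(xs) / len(xs)) * sqrt(len(xs))  # sigma is known to be 1
            p = erfc(z / sqrt(2))                       # two-sided p value
            if p < 0.05:                                # stop at first 'success'
                false_positives += 1
                break
    return false_positives / n_sims
```

With these defaults (ten looks), the simulated false-positive rate typically comes out near .19 rather than the nominal .05.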


Author Contributions

Authorship implies significant participation in the research reported or in writing the manuscript, including participation in the design and/or interpretation of reported experiments or results, participation in the acquisition and/or analysis of data, and participation in the drafting and/or revising of the manuscript. All authors must agree to the order in which the authors are listed and must have read the final manuscript and approved its submission. They must also agree to take responsibility for the work in the event that its integrity or veracity is questioned.

Furthermore, as part of our commitment to ensuring an ethical, transparent, and fair peer review and publication process, APS journals have adopted the use of CRediT (Contributor Roles Taxonomy). CRediT is a high-level taxonomy, including 14 roles that can be used to represent the roles typically played by contributors to scientific scholarly output.

These roles describe the possible contributions to the published work:

Conceptualization: Ideas; formulation or evolution of overarching research goals and aims

Methodology: Development or design of methodology; creation of models

Software: Programming, software development; designing computer programs; implementation of the computer code and supporting algorithms; testing of existing code components

Validation: Verification, whether as a part of the activity or separate, of the overall replication/reproducibility of results/experiments and other research outputs

Formal Analysis: Application of statistical, mathematical, computational, or other formal techniques to analyze or synthesize study data

Investigation: Conducting a research and investigation process, specifically performing the experiments, or data/evidence collection

Resources: Provision of study materials, reagents, materials, patients, laboratory samples, animals, instrumentation, computing resources, or other analysis tools

Data Curation: Management activities to annotate (produce metadata), scrub data and maintain research data (including software code, where it is necessary for interpreting the data itself) for initial use and later reuse

Writing – Original Draft: Preparation, creation and/or presentation of the published work, specifically writing the initial draft (including substantive translation)

Writing – Review & Editing: Preparation, creation and/or presentation of the published work by those from the original research group, specifically critical review, commentary or revision, including pre- or postpublication stages

Visualization: Preparation, creation and/or presentation of the published work, specifically visualization/data presentation

Supervision: Oversight and leadership responsibility for the research activity planning and execution, including mentorship external to the core team

Project Administration: Management and coordination responsibility for the research activity planning and execution

Funding Acquisition: Acquisition of the financial support for the project leading to this publication.

The submitting author is responsible for listing the contributions of all authors at submission. All authors should agree to their individual contributions prior to submission.

In order to adhere to SAGE’s authorship criteria, authors must have been responsible for at least one of the following CRediT roles:

  • Conceptualization
  • Methodology
  • Formal Analysis
  • Investigation

AND at least one of the following:

  • Writing – Original Draft Preparation
  • Writing – Review & Editing

Contributions will be published with the final article, and they should accurately reflect all contributions to the work. Any contributors with roles that do not constitute authorship (e.g., Supervision was the sole contribution) should be listed in the Acknowledgements.

SAGE is a supporting member of ORCID, the Open Researcher and Contributor ID. We strongly encourage all authors and co-authors to use ORCID iDs during the peer-review process. If you already have an ORCID iD, please log in to your account on SAGE Track and edit the account information to link it to your ORCID iD. If you do not already have an ORCID iD, please log in to your SAGE Track account to create your unique identifier and automatically add it to your profile. PLEASE NOTE: ORCID iDs must be linked to author accounts prior to manuscript acceptance or they will not be displayed upon publication. ORCID iDs cannot be linked during the copyediting phase.


Supplemental Material

Authors are free to submit certain types of Supplemental Material (SOM) for online-only publication. If the manuscript is accepted for publication, such material will be published online on the publisher’s website via Figshare, linked to the article. SOM will not be copyedited or formatted; it will be posted online exactly as submitted.

The editorial team takes the adjective supplemental seriously. SOM should include the sort of material that enhances the reader’s understanding of an article but is not essential for understanding the article. SOM files should be uploaded during initial submission.


Contributor FAQ

Contributors are encouraged to consult the Contributor FAQ before submitting manuscripts to Psychological Science.


Accepted Manuscripts

Open Science Badges

Manuscripts accepted for publication are eligible to earn one or more Open Science Badges. Badges are awarded for promoting openness in science through sharing of data and/or materials, and making study methods, procedures, and analyses more transparent. They are neither a direct nor indirect means of asserting that publications with one or more badges are of higher quality than publications without them.

The following badges are available:

  • Open Data badge for making publicly available the digitally shareable data necessary to reproduce the reported results. This includes annotated copies of the code or syntax used for all exploratory and principal analyses. If the data contain sensitive, personal information, a PA (Protected Access) notation will be added, provided the authors post their data according to the guidelines regarding protected-access repositories (see below for more information about this notation).
  • Open Materials badge for making publicly available the digitally shareable materials/methods necessary to reproduce the reported results. Authors are also encouraged to make publicly available video recordings of their study procedures; in return, a VM (Visualized Methods) notation will be added to the authors’ Open Materials badge (see below for more information about this notation).
  • Preregistered badge for having a preregistered design and analysis plan for the reported research and reporting results according to that plan. An analysis plan includes specification of the variables and the analyses that will be conducted. Please note that “Preregistration” does not require all analyses to be confirmatory (planned in advance); it merely requires investigators to state up front which analyses are confirmatory and which are exploratory. High-quality exploratory research is more than welcome in the pages of Psychological Science, as long as authors explicitly acknowledge that the analyses are exploratory and, when relevant, acknowledge potential constraints on the replicability of these findings. If the analysis plan was registered prior to observation of outcomes, the Open Practices note will include the notation DE (Data Exist). If there were strongly justified changes to an analysis plan, the Open Practices note will include the notation TC (Transparent Changes). Authors who have additional unreported registrations or unreported analyses without strong justification (as determined by the Editor-in-Chief) will not qualify for a badge. Please see former Editor-in-Chief Steve Lindsay’s statement on preregistration for more information.

To apply for one or more of these badges acknowledging open practices, authors must provide the information requested in the Open Practices Disclosure form, which is sent to all authors of accepted manuscripts. Unless the authors decide not to apply for badges, the form will be published with the article as supplemental online material.

Badge icons will be displayed at the beginning of the article, and information related to open practices will be published in a note titled “Open Practices” that will appear at the end of the article. Badges are awarded following the disclosure method, in which authors provide public statements affirming achievement of badge criteria.

The criteria for earning badges and the process by which they are awarded, along with answers to frequently asked questions, are described in the Open Science Framework wiki. Please see former Editor-in-Chief Eric Eich’s Observer interview for more information.

More about the Protected Access notation (Open Data):

The Protected Access notation may be added to Open Data badges if sensitive, personal data are available from an approved protected access repository. These repositories grant access to such data only to qualified researchers who complete a documented process that the repository publicly describes. This notation is not available to researchers who state that they will make “data available upon request,” and it is not available if requests for data sharing are evaluated on any criteria beyond compliance with proper handling of sensitive data. Please view the Approved Protected Access Repositories list for more information.

More about the Visualized Methods notation (Open Materials):

Even with uncapped Method sections in Research Articles and Research Reports (see Table 1), there is only so much an author can convey through words; researchers who want to follow up on someone’s paper might benefit by seeing how things were actually done. Videos of study procedures could also serve as valuable teaching tools for psychology students, undergraduate and graduate alike.

Making such videos publicly available should facilitate obtaining an Open Materials badge, but would not necessarily be sufficient to earn one. For instance, in the case of a study with a computer-delivered behavioral task, the script would need to be available in order for another researcher to reproduce the procedure; a video of someone completing the computer task would not be sufficient. However, the video would make it a good deal easier for some kinds of studies to earn the Open Materials badge, such as those that entail a social interaction of some kind.

By awarding an Open Materials badge with Visualized Methods notation, Psychological Science aims to promote open behavior and to recognize that a video can be very useful for certain manuscripts; however, the journal is not trying to say that a manuscript without a video is somehow inferior to one with a video (which may not make sense in all cases).

Journal staff will contact the corresponding authors of accepted manuscripts with details on the badge-awarding process.



StatCheck

StatCheck is an R program designed to detect inconsistencies among the components of reported inferential statistics (e.g., the t value, df, and p value). StatCheck is not designed to detect fraud, but rather to catch typographical errors. Authors of accepted manuscripts must provide a StatCheck report, run on the accepted version of the manuscript, that indicates a clean (i.e., error-free) result; a web app version of StatCheck is also available online. If StatCheck does detect errors in the accepted version of the manuscript, authors should contact the action editor directly to determine the best course of action.
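
To make the idea concrete, the sketch below performs a StatCheck-style consistency check (StatCheck itself is an R package; this is our own simplified Python illustration). For a reported z statistic, it recomputes the two-sided p value and asks whether the reported p, at the reported rounding, matches.

```python
from math import erfc, sqrt

def z_p_consistent(z, reported_p, decimals=3):
    """Recompute the two-sided p value for a z statistic and compare it,
    at the reported rounding, with the p value given in the text."""
    recomputed = erfc(abs(z) / sqrt(2))  # two-sided p under the standard normal
    return round(recomputed, decimals) == round(reported_p, decimals)
```

For example, a reported “z = 2.50, p = .012” is internally consistent, whereas “z = 2.50, p = .031” would be flagged as a likely typographical error.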


OnlineFirst Publication and TWiPS

All accepted manuscripts are published online (OnlineFirst) as soon as they reach their final copyedited, typeset, and corrected form, and each accepted article appears in a monthly print issue of Psychological Science as well as in the digital This Week in Psychological Science (TWiPS), which is distributed weekly to all APS members.
