Psychological Science welcomes the submission of papers presenting original research, theory, or applications on mind, brain, or behavior. Preference is given to papers that make a new and notable contribution—an idea, a discovery, a connection—to psychological science, broadly interpreted to include emerging as well as established areas of research (e.g., neuroeconomics versus psychophysics), across specialties of psychology and related fields, and that are written to be relevant for and intelligible to a wide range of readers.
Submission of Manuscripts
Manuscripts should be submitted electronically to the Psychological Science submission site, http://mc.manuscriptcentral.com/psci. Before submitting your manuscript, please be sure to consult the Contributor FAQ.
Read the latest editorial policies from the APS Publications Committee.
- Manuscript Review Process
- Preparing Your Manuscript
- Article Types
- Manuscript Style, Structure, and Content
- Preparation of Graphics
- Guidelines for Reporting fMRI Data
- Candidate Gene Research
- Research Disclosure Statements
- Author Contributions
- Open Practices Statement
- Ethical Considerations
- Conference Proceedings
- Supplemental Material
- English Language Help
- Contributor FAQ
- Accepted Manuscripts
Two members of the editorial team read each submitted manuscript. One reader has interests and expertise in the relevant research area and offers a specialist’s opinion, whereas the other reader, who may be less knowledgeable in the subject matter, provides a generalist’s perspective. In most cases, the specialist reader is an Associate Editor, and the generalist reader is a Senior Editor or the Editor-in-Chief. In this initial review, manuscripts are masked as to authors and originating institutions. To facilitate this approach, authors will be asked to upload a masked version of the submission.
If either reader evaluates the paper as having a reasonable likelihood of ultimately being accepted for publication in the journal, then it is sent to two or more external referees for extended review. The Associate Editor usually oversees this process and writes the subsequent decision letter (accept, reject, or revise and resubmit). Alternatively, if both readers decide the paper is unlikely to be competitive for publication, then the paper is declined on initial editorial review.
Within 2 weeks of submission, authors are notified by e-mail that their manuscript either (a) has been declined on initial editorial review or (b) has been sent to outside experts for extended review. For manuscripts afforded extended review, authors can expect a decision within 60 days of manuscript submission. Manuscripts declined after either initial or extended review will not be reconsidered unless the responsible action editor has invited resubmission following revision (see Question 16 in the Contributor FAQ).
Upon submission, authors will be asked to identify a relevant Associate Editor as well as at least one member of the Editorial Board whom they recommend for handling of their submission. Authors also may submit a list of recommended (and opposed) reviewers. Authors often are more familiar with experts in their area of research, and editors appreciate the suggestions. Keep in mind that editors will consider these recommendations and requests but do not guarantee that they will be honored.
Please note: Psychological Science uses StatCheck, an R program written by Sacha Epskamp and Michele B. Nuijten that is designed to detect inconsistencies among the components of reported inferential statistics (e.g., t value, df, and p). StatCheck is not designed to detect fraud, but rather to identify typographical errors (which occur often in psychology; see https://mbnuijten.com/statcheck/). We ask authors to run StatCheck once their manuscript has been sent out for extended review and has not been rejected after that review. Authors are encouraged to run StatCheck (http://statcheck.io/) before submitting a manuscript, and authors of accepted manuscripts are required to provide a clean StatCheck report before the manuscript enters production (see below).
The main criteria for publication in Psychological Science are general theoretical and empirical significance and methodological/statistical rigor.
- “General” because to warrant publication in this journal a manuscript must be of general interest to psychological scientists. Research that is likely to be read and understood only by specialists in a subdomain is better suited for a more specialized journal.
- “Theoretical and empirical significance” because research published in Psychological Science should be strongly empirically grounded and should make a difference in the way psychologists and scholars in related disciplines think about important issues. Work that is purely descriptive or that only modestly extends knowledge of firmly established phenomena can be valuable but is unlikely to meet criteria for acceptance in this highly selective journal.
- “Methodological/statistical rigor” because replicability is a foundational value of science. Replicability is not the only consideration, but it is an important one. Science, like the rest of life, is full of trade-offs, and the editors at Psychological Science appreciate that it is more difficult to attain high levels of precision and replicability in some important areas of psychology than others. Nonetheless, to succeed, submissions must be as rigorous as is practically and ethically feasible, and must also be frank in addressing limits on their precision and generality (see Simons, Shoda, & Lindsay, 2017).
The journal aims to publish works that meet these three criteria in a wide range of substantive areas of psychological science. Historically, cognitive and social psychology have been dominant in this journal, and research participants often are drawn from a restricted range of the world’s population. Moreover, the majority of articles published in the journal are authored by scientists from the United States. The editors encourage submissions from a broader span of areas within psychological science, including, for example, biological psychology, cognitive and affective neuroscience, communication and language, comparative, cross-cultural, developmental, gender and sexuality, and health (this is not intended as a comprehensive list). The editors also encourage submissions of work with populations beyond the WEIRD world (Western, educated, and from industrialized, rich, and democratic countries), as well as submissions that take psychological science into “the wild”—the natural contexts in which we live. The editors also are eager to receive submissions of work conducted by psychological scientists from around the world. Submissions centered on clinical science that meet the criteria outlined above will be considered, but many clinically oriented manuscripts are likely to be of primary interest to clinicians and hence are more appropriate for Clinical Psychological Science. Similarly, works with a primary focus on methods and research practices are generally better suited for Advances in Methods & Practices in Psychological Science, yet the editors are open to considering methodological manuscripts of extraordinary generality and importance.
Note that “theoretical significance” differs from “surprising novelty.” Indeed, surprising effects are subject to particularly careful scrutiny. A direct replication that yields compelling evidence for (or against) a theoretically important but empirically uncertain phenomenon may well meet the “theoretical significance” criterion. (For ways to assess evidence for the null hypothesis, see, for example, Masson, 2011; Harms & Lakens, 2018; and JASP.)
To promote replicability and transparency, authors are encouraged to preregister their studies (including data-analysis plans) before conducting them (see Lindsay, Simons, & Lilienfeld, 2017; Veldkamp et al., 2018). As one option, the Open Science Framework provides a step-by-step workflow for preregistering a research project at osf.io/prereg. Researchers are also asked to make their materials, data, and analysis scripts available to reviewers (in ways that are ethically appropriate and practically feasible). Although deposit of materials, data, and analysis scripts into an open access repository is not required for publication in Psychological Science, in manuscripts accepted by the journal, authors will be required to inform readers how they may access study-related data and materials, or of restrictions thereon. Prospective submitters of manuscripts are encouraged to read former Editor-in-Chief Eric Eich’s 2014 editorial, former Editor-in-Chief Steve Lindsay’s 2015 and 2016 editorials, and an in-press chapter by Mellor, Vazire, and Lindsay on writing transparent research reports.
Psychological Science does not compete with other journals of APS, including Advances in Methods and Practices in Psychological Science, Clinical Psychological Science, Current Directions in Psychological Science, Perspectives on Psychological Science, and Psychological Science in the Public Interest. The journals vary in terms of domain and manuscript formats. Manuscripts rejected by another APS journal on the grounds of quality (e.g., flaws in methodology, data, or concept) are not eligible for consideration by Psychological Science.
See also Table 1 below.
Research Article. Most of the articles published in Psychological Science are Research Articles. Research articles make novel empirical and theoretical contributions that propel psychological science in substantial and significant ways. The description and word limits of the sections of Research Articles can be found below.
Abstract and Statement of Relevance: All Research Articles must include a 150-word abstract that identifies the participant population on which the research was conducted. The abstract does not count toward the word limit. Immediately following the Abstract, authors also must include a 150-word Statement of Relevance that explains why the research reported in the submission is of interest and significance beyond the specific sub-area in which it is situated and, ideally, to the public at large. The Statement of Relevance does not count toward the word limit. The aim of the Statement of Relevance is to broaden the impact of the science reported in the journal and make it easier for interested readers to appreciate and understand our efforts. It should make clear why the questions that motivated the study and the findings that bear on them matter beyond psychology laboratories and college and university campuses.
Introduction, Discussion, Footnotes, Acknowledgments, and Appendices: These sections may contain no more than 2,000 words combined. Authors are encouraged to be concise and focused in the Introduction and Discussion sections to keep them as brief as possible while also establishing the significance of the work. This word limit does not include the Abstract, Statement of Relevance, Method and Results sections (except footnotes), cover page, Author Contributions, or reference list. In the Discussion (or General Discussion), authors should explicitly consider limits on the generalizability of their findings.
Method and Results: These sections of Research Articles do not count toward the total word limit. The aim of unrestricted length for Method and Results sections is to allow authors to provide clear, complete, self-contained descriptions of their studies. But as much as Psychological Science prizes narrative clarity and completeness, so too does it value concision. In almost all cases, an adequate account of method and results can be achieved in 2,500 or fewer words for Research Articles. Methodological minutiae and fine-grained details on the Results—the sorts of information that only “insiders” would relish and require for purposes of replication—should be placed in Supplemental Online Materials-Reviewed, not in the main text. Authors should include in their Method sections (a) justification for the sample(s) selected for the study (if the sample is of convenience, this should be explicitly noted); (b) the total number of excluded observations and the reasons for making the exclusions (if any); and (c) an explanation as to why the sample size is considered reasonable, supported by a formal power analysis, if appropriate. Authors also should include confirmation in their Method section that the research meets relevant ethical guidelines, including adherence to the legal requirements of the study country.
Many Research Articles contain two or more studies. Such submissions may include “interim” introductions and discussions that bracket the studies, in addition to an opening “general” introduction and a closing “general” discussion. Authors who opt for this sort of organization should bear in mind that the aforementioned word limits on introductory and Discussion sections include both interim and general varieties. Any combined “Results and Discussion” sections will be counted toward the word limit.
Narrative material that belongs in the Introduction or Discussion section should, within reasonable limits, not be placed in the Method or Results section. Authors may, for example, include a few sentences in the Results section to place their findings in context. However, excessive packing of a Method or Results section with material appropriate to the Introduction or Discussion will trigger immediate revision or rejection of the manuscript. Hybrid “Method & Results” sections are disallowed for all types of submission.
References: Authors are encouraged to cite only sources that bear directly on the point at hand, and to refrain from extensive parenthetical lists of related materials, keeping in mind that citations are meant to be supportive, not exhaustive. As a general rule, 40 citations should be sufficient for most Research Articles. However, this is not a hard-and-fast limit, and editors have the flexibility to allow more references if they are necessary to establish the scientific foundation for the work.
Short Report. As of May 15, Psychological Science is no longer considering Short Reports for publication. Manuscripts that previously would have been submitted in this category should now be submitted as Research Articles.
Commentary. Commentaries respond to and/or supplement articles previously published in Psychological Science. Although they may target an article published at any time, the Commentary mechanism is most effective when used to address a contemporary publication. Before submitting a Commentary, authors are required to contact the Editor-in-Chief with a proposal. Except under special circumstances, for any given article, Psychological Science will consider only a single Commentary. No author may contribute to more than one Commentary on the same target article. Commentaries are limited to 1,000 words (includes main text, notes, acknowledgments, and appendices; does not include 150-word abstract, 150-word Statement of Relevance (see Research Article), cover page, Author Contributions, or reference list) and 1 figure (no more than 2 panels) or 1 table.
References: Authors are encouraged to cite only sources that bear directly on the point at hand, and to refrain from extensive parenthetical lists of related materials, keeping in mind that citations are meant to be supportive, not exhaustive. As a general rule, 20 citations should be sufficient for most Commentaries. However, this is not a hard-and-fast limit, and editors have the flexibility to allow more references if they are necessary to establish the scientific foundation for the work.
Note: For Commentaries or Replies to Commentaries reporting new data, the Method and Results sections are not included in the word count.
The action editor typically solicits feedback on a submitted Commentary from the lead author of the target article, in addition to reviews by two independent experts. Upon acceptance of a Commentary that is critical of a paper previously published in Psychological Science, the action editor will invite the lead author of the target article to submit a Reply to Commentary (see below).
Reply to Commentary. Replies to Commentaries allow authors of articles that are targets of critical commentaries an opportunity to formally respond. Replies to Commentaries are by invitation only, and like the critical commentaries that initiated them, they are subject to external review (including by the author of the Commentary) and their acceptance is not assured. Replies to Commentaries are limited to 1,000 words (includes main text, notes, acknowledgments, and appendices; does not include 150-word abstract, 150-word Statement of Relevance (see Research Article), cover page, Author Contributions, or reference list) and 1 figure (no more than 2 panels) or 1 table.
References: Authors are encouraged to cite only sources that bear directly on the point at hand, and to refrain from extensive parenthetical lists of related materials, keeping in mind that citations are meant to be supportive, not exhaustive. As a general rule, 20 citations should be sufficient for most Replies. However, this is not a hard-and-fast limit, and editors have the flexibility to allow more references if they are necessary to establish the scientific foundation for the work.
The criterion for Commentaries and Replies to Commentaries is general theoretical significance.
Preregistered Direct Replication. Preregistered Direct Replications (PDRs) report high-quality, preregistered, direct replications of studies published in Psychological Science. (See former Editor-in-Chief Steve Lindsay’s 2017 editorial.) Authors of a PDR must make a convincing case that the replication will make a valuable contribution to understanding a phenomenon and/or theory of broad current interest to psychological scientists. Authors also must preregister their protocol for the replication study to be considered for publication. Preregistration may be on the Open Science Framework (https://osf.io/) or other recognized repository. As with all Psychological Science submissions, the primary criterion is general theoretical and empirical significance. Direct replications should reproduce the original methods and procedures as closely as possible, with the goal of measuring the same effect under essentially (but not necessarily superficially) the same conditions as in the original study.
Researchers undertaking a PDR project are encouraged to consult with the author or authors of the original article and to submit a proposal to the Editor-in-Chief. As per the procedure for Registered Reports described at https://cos.io/rr/, researchers are required to make a Stage 1 submission featuring an Introduction and Method section prior to data collection. Stage 1 submission is an essential step: PDR submissions that have not been through Stage 1 review prior to data collection will not be considered. Stage 1 submissions that are judged by the editorial board to be of sufficient quality and within journal scope will be sent for extended peer review. In considering Stage 1 submissions, reviewers will be asked to assess the importance of the research question(s); the logic, rationale, and plausibility of the proposed hypotheses; and the soundness and fidelity of the methodology and analysis plan. Assuming provisional acceptance of the PDR protocol (Stage 1 acceptance), authors are required to register their approved protocol on the Open Science Framework (https://osf.io/) or other recognized repository (publicly or under a private embargo that is retained until final acceptance of the Stage 2 manuscript). Authors may use any recognized registry, but the accepted protocol can be easily registered using a tailored mechanism for Registered Reports on the Open Science Framework (https://osf.io/rr/). Following registration, authors of PDRs will conduct the study as planned. Once the study is complete, authors prepare and resubmit their manuscript for full review (Stage 2 submission/review).
Table 1. Word limits, figure and/or table limits, and what counts toward the word limit, by article type

Research Article
- Counts toward the 2,000-word limit: Introduction & Discussion; Notes, Acknowledgments, Appendices
- Does not count toward the word limit: Method & Results; Cover Page, Abstract, Statement of Relevance, Author Contributions, References*

Commentary or Reply to Commentary
- Word limit: 1,000; figure and/or table limit: 1 figure (no more than 2 panels) or 1 table
- All main text, including notes, acknowledgments, and appendices, counts toward the word limit**

Preregistered Direct Replication
- See the Preregistered Direct Replication section above.

*These are not hard-and-fast limits, and editors have the flexibility to allow more references if they are necessary to establish the scientific foundation for the work.
**For Commentaries reporting new data, Method and Results sections are not included in the word count.
Manuscripts published in Psychological Science must follow the style of the Publication Manual of the American Psychological Association, 6th edition, with respect to handling of the order of manuscript sections, headings and subheadings, references, abbreviations, and symbols. Please embed tables and figures within the main text. For initial submissions, authors may deviate from some of the style requirements (e.g., heading and subheading style, reference format, location of tables and figures). However, invited revisions and final versions of manuscripts must follow APA style. For all article types, for initial review, manuscripts must be masked as to authors and originating institutions. To facilitate this approach, authors will be asked to upload a masked version of the submission.
Authors are encouraged to consult the APA Guide for New Authors for more information on the manuscript-submission and peer-review process.
Further guidance can be found on our Manuscript Structure, Style, and Content Guidelines page.
You may upload your manuscript and ancillary files as Word .doc or .docx, as .rtf, as .pdf, or as .tex. If you submit a .tex file, please also submit a PDF file conversion, as the submission portal cannot render .tex files in the PDF proof.
Consistent with the Guidelines for Transparency and Openness Promotion (TOP) in Journal Policies and Practices, all data, program code, and other methods should be appropriately cited. Such materials should be recognized as original intellectual contributions and afforded recognition through citation.
- All data sets and program code used in a publication should be cited in the text and listed in the reference section.
- References for data sets and program code should include a persistent identifier, such as a Digital Object Identifier (DOI). Persistent identifiers ensure future access to unique published digital objects, such as a text or data set. Persistent identifiers are assigned to data sets by digital archives, such as institutional repositories and partners in the Data Preservation Alliance for the Social Sciences (Data-PASS).
- Data set citation example:
Campbell, Angus, and Robert L. Kahn. American National Election Study, 1948. ICPSR07218-v3. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 1999. http://doi.org/10.3886/ICPSR07218.v3
The journal requires that for accepted manuscripts, figures be embedded within the main document near where they are discussed in the text. A figure’s caption should be placed in the text just below the figure. For initial submissions, tables and figures may be placed at the end of the manuscript.
Authors who are submitting revisions should also upload separate figure files that adhere to the APS Figure Format and Style Guidelines. Submitting separate, production-quality files helps to facilitate timely publication should the manuscript ultimately be accepted.
Psychological Science recommends the use of the “new statistics”—effect sizes, confidence intervals, and meta-analysis—to avoid problems associated with null-hypothesis significance testing (NHST). Authors are encouraged to consult this Psychological Science tutorial by Geoff Cumming, which argues that estimation and meta-analysis are more informative than NHST and that they foster development of a cumulative, quantitative discipline. Cumming has also prepared a video workshop on the new statistics that can be found here.
Authors must include effect sizes for their major results and distributional information in their tables and graphs. Fine-grained graphical presentations that show how data are distributed are often the most transparent way of communicating results. Please report 95% confidence intervals instead of standard deviations or standard errors around mean dependent variables, because confidence intervals convey more useful information—another point discussed in Cumming’s tutorial.
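For illustration only (this sketch is not part of the journal’s guidelines, and the function names are hypothetical), the effect sizes and confidence intervals recommended above can be computed with Python’s standard library. Note that the interval here uses the normal critical value; for small samples, a t critical value should be substituted:

```python
import math
from statistics import NormalDist, mean, stdev

def cohens_d(group1, group2):
    """Cohen's d for two independent groups, using the pooled SD."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (mean(group1) - mean(group2)) / pooled_sd

def ci95_mean(sample):
    """Approximate 95% CI for a sample mean.

    Uses the normal critical value (~1.96); for small samples,
    replace it with the t critical value for n - 1 df.
    """
    z = NormalDist().inv_cdf(0.975)
    half_width = z * stdev(sample) / math.sqrt(len(sample))
    m = mean(sample)
    return (m - half_width, m + half_width)
```

Reporting the interval itself (e.g., “M = 6.50, 95% CI = [5.23, 7.77]”) conveys both the estimate and its precision, which is the point of Cumming’s tutorial.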
Reporting Statistical Results
The abstract should include information about the sample size(s) of the studies reported in the manuscript. Please report test statistics to two decimal places (e.g., t(34) = 5.67) and probability values to three decimal places. In addition, exact p values should be reported for all results greater than .001; p values below this threshold should be reported as “p < .001.” Authors should be particularly attentive to APA style when typing statistical details (e.g., Ns for chi-square tests, formatting of dfs). If special mathematical expressions are required, they should be inserted with Word’s Equation Editor or similar rather than as graphic objects.
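As one way to apply these formatting rules consistently (hypothetical helper functions, not journal-supplied tools), a small sketch in Python:

```python
def format_p(p):
    """Format a p value per the convention above: exact to three
    decimal places when greater than .001, else 'p < .001'.
    APA style drops the leading zero for values that cannot exceed 1."""
    if p < 0.001:
        return "p < .001"
    return f"p = {p:.3f}".replace("0.", ".")

def format_t(t, df):
    """Format a t statistic to two decimal places, e.g. 't(34) = 5.67'."""
    return f"t({df}) = {t:.2f}"
```

For example, format_p(0.0234) yields “p = .023” and format_p(0.0004) yields “p < .001”.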
Studies involving fMRI or other neuroimaging methods typically entail larger numbers of measures than those found in behavioral research. Analyses of neuroimaging data have their own well-developed statistical frameworks and reporting standards. We recognize that the focus of fMRI analyses is often not on effect sizes, but rather on (a) statistical reliability and (b) replicability. We refer authors to Poldrack et al. (2008) for a useful list of reporting guidelines for such analyses.
We also recognize that adherence to the kind of complete reporting suggested by Poldrack et al. (2008) may require relegating some of this material to Supplemental Online Materials (SOM). Most of the items listed by Poldrack et al. (2008) in their Appendix A can be succinctly and completely described in the main text. Exceptions that should be put into SOM are the sections of the appendix labeled “Intersubject registration” and “Statistical modeling” (if nonstandard).
For region of interest (ROI)-based analyses, the process of ROI selection should be clearly stated in the main text; in particular, it should be noted whether the ROI was selected prior to any analyses of the data. Reports of ROI analyses should include effect sizes. Any ROI-based analyses should be supplemented by whole-brain analyses. A complete table of activation coordinates, together with their statistics, should be provided; however, such tables should typically be put in SOM.
The strongest submissions will address both reliability and replicability and will report results for appropriately large samples and/or a replication study. For instance, a study concerning correlations of brain activation with behavioral measures may typically require a sample in excess of 100 subjects, although we recognize that sample size will vary widely depending on the details of the study. Power analyses are recommended (see Mumford, 2012).
Psychological Science will place emphasis on those functional neuroimaging studies that make a clear and compelling contribution to understanding psychological mechanisms, above and beyond a purely neuroanatomical contribution. Therefore, authors should carefully support reverse-inference statements in their Results and Discussion sections. Such support could come from strong prior results in the literature, as well as from separate meta-analyses (see Poldrack, 2011).
The editors of Behavior Genetics have established perceptive policies regarding candidate gene association and Candidate Gene × Environment interaction studies of complex traits (Hewitt, 2012). Submissions to Psychological Science that report similar candidate-gene studies are expected to follow those policies.
Submitting authors must declare that they have disclosed (a) all of the dependent variables or measures collected, (b) all of the conditions/groups/predictors tested for each study reported in the submitted manuscript, and (c) any data exclusions (subjects or observations). The Disclosure Statement section looks like this:
For all studies reported in your manuscript, check the boxes below to confirm that:
- All dependent variables or measures that were analyzed for this article’s target research question have been reported in the Method section(s)
- All levels of all independent variables or all predictors or manipulations, whether successful or failed, have been reported in the Method section(s)
- The total number of excluded observations and the reasons for making those exclusions (if any) have been reported in the Method section(s)
Sample Composition and Size
Submitting authors are asked to (a) identify the participant population in the abstract and (b) explain, in the Method section, the basis(es) for the composition of their samples (whether the sample was selected for specific theoretical or conceptual reasons, is a sample of convenience, etc.). In the Discussion (or General Discussion), authors are asked to explicitly consider the limits on the generalizability of their findings. Authors also should explain the basis(es) for the sample sizes in the studies included in the submission. Bakker et al. (2016) reported evidence that many research psychologists have faulty intuitions regarding statistical power. For many years it was standard practice to conduct studies with low statistical power and to submit for publication those studies that obtained statistically significant results (Cohen, 1969). Such practices lead to exaggerated estimates of effect size. Indeed, when statistical power is very low, only results that exaggerate the true size of an effect can be statistically significant. Therefore, it is typically not appropriate to base sample size solely on the sample sizes and/or effect sizes reported in prior research or on the results of small pilot studies (see, e.g., Gelman & Carlin, 2014). There is no single right way to justify a sample size, but authors must explain in the manuscript the basis(es) for their determination that the sample size is appropriate. If an estimate of the size of an effect is given, the unit of measurement (e.g., Cohen’s d) must be specified, and some rationale for the assumption that the estimate is sound must be provided. If the study tests more than one effect, authors must make clear which of those effects their power analysis was based on.
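As one sketch of the kind of formal power analysis discussed above (a normal approximation, not the journal’s prescribed method; it slightly underestimates n, so dedicated tools such as G*Power give more exact t-based answers), the per-group sample size for a two-tailed, two-sample t test can be estimated with Python’s standard library:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-tailed, two-sample t test,
    via the normal approximation: n = 2 * ((z_alpha + z_beta) / d)^2.
    Slightly underestimates; add a participant or two, or use
    t-based software, for a final answer."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # ~1.96 for alpha = .05, two-tailed
    z_beta = z(power)           # ~0.84 for 80% power
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)
```

For example, detecting a medium effect (Cohen’s d = 0.5) with 80% power requires roughly 63 participants per group under this approximation (about 64 with the exact t-based calculation).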
Submitters are also asked if they conducted preliminary analyses on the data and decided whether or not to collect additional data based on the outcome of those analyses. That practice, known as “optional stopping,” inflates the risk of making a Type I error (see Simmons, Nelson, & Simonsohn, 2011).
Authorship implies significant participation in the research reported or in writing the manuscript, including participation in the design and/or interpretation of reported experiments or results, participation in the acquisition and/or analysis of data, and participation in the drafting and/or revising of the manuscript. All authors must agree to the order in which the authors are listed and must have read the final manuscript and approved its submission. They must also agree to take responsibility for the work in the event that its integrity or veracity is questioned.
Each published manuscript must include a paragraph (not included in the word count), after the body of the main text and before any acknowledgments, that states each author’s contribution. For purposes of review, this paragraph should be uploaded as a separate (supplemental) file. Here are examples of author contributions paragraphs:
“D. P. Wu developed the study concept. All authors contributed to the study design. Testing and data collection were performed by D. P. Wu. D. P. Wu and A. C. Brown performed the data analysis and interpretation under the supervision of H. L. Andreas. D. P. Wu drafted the manuscript, and A. C. Brown and H. L. Andreas provided critical revisions. All authors approved the final version of the manuscript for submission.”
“D. P. Patel is the sole author of this article and is responsible for its content.”
Psychological Science encourages researchers to preregister their research plans and (to the extent practically feasible and ethically appropriate) to make their data and materials available to other scientists (for rationales, see Lindsay, 2015, 2017, and Lindsay, Simons, & Lilienfeld, 2016). As part of that encouragement, each manuscript reporting new empirical work must include an Open Practices Statement in which the authors state whether the studies reported were preregistered and whether the data and/or materials are available on a third-party permanent archive. If data and/or materials are not available on such an archive, authors are required to inform readers how they may access study-related materials and data, or to describe any restrictions on access. These statements should be direct and to the point. Authors are not required to offer explanations, but they may do so if they choose. Commentaries and Replies that do not report new data do not need to include an Open Practices Statement. The Open Practices Statement has no bearing on the peer review process.
All of the following are examples of acceptable statements:
“Neither of the experiments reported in this article was formally preregistered. Neither the data nor the materials have been made available on a permanent third-party archive; requests for the data or materials can be sent via email to the lead author at [email].”
“Neither of the studies reported in this article was formally preregistered. The data have not been made available on a permanent third-party archive because our Institutional Review Board ruled that we could not post the data; requests for the data can be sent via email to the lead author. The complete questionnaires are included in the Supplemental Online Material associated with this article at [url].”
“Experiment 1 was not formally preregistered; the preregistration for Experiment 2 can be accessed at [url]. De-identified data for both experiments along with a code-book and the data analysis scripts are posted at [url]; access to the data is limited to qualified researchers. The materials used in these studies are widely available.”
“We analyzed archival data that are not under our direct control; requests to access the data should be directed to the relevant archive. Our complete analysis scripts and code book have been posted at [url].”
Questions, concerns, and suggestions regarding this Open Practices Statement can be directed to firstname.lastname@example.org.
Authors reporting research involving human subjects should indicate whether the protocol was approved by an institutional review board or similar committee and whether it was carried out in accordance with the provisions of the World Medical Association Declaration of Helsinki. Authors reporting research involving nonhuman animal subjects should indicate whether institutional and national guidelines for the care and use of laboratory animals were followed.
Identifying information of participants will not be published unless the information is necessary and written, informed consent is obtained.
Any potential conflicts of interest should be reported in the online submission process and in the manuscript. If any authors have been remunerated for advancing a particular perspective related to the research reported in the manuscript (e.g., regarding the efficacy of an intervention) that should be declared as a potential conflict when the manuscript is submitted. The Declaration of Conflicting Interests section that appears in every article will state any reported conflicts or will read “The author(s) declared no conflicts of interest with respect to the authorship or the publication of this article.”
Manuscripts should conform to the Recommendations for the Conduct, Reporting, Editing and Publication of Scholarly Work in Medical Journals, which can be found in full at www.icmje.org. In particular, authors should reference the following sections:
- II.A. Defining the Role of Authors and Contributors
- II.B. Author Responsibilities—Conflicts of Interest
- II.E. Protection of Research Participants
- III.B. Scientific Misconduct, Expressions of Concern, and Retraction
- III.K. Clinical Trial Registration (if applicable)
The APS journals follow the code of conduct of the Committee on Publication Ethics (COPE) and follow COPE guidelines when misconduct is suspected or alleged.
Authors who wish to submit to Psychological Science a manuscript that contains research previously presented at a conference must do so in accordance with the following guidelines:
If the proceedings published only the abstracts of the conference presentations, or if the conference proceedings were provided only to attendees (i.e., not made available to the public or to members of the press), authors may submit their manuscript to the journal. If the proceedings were more widely distributed or made available to the press or public, an explanation should be provided in an additional file, and the Editor-in-Chief will determine if the manuscript is eligible for consideration by the journal. If the proceedings material contained reports more substantive than just abstracts, the journal submission must be significantly different from the proceedings material in order to be considered. As a general rule, the Psychological Science submission must be at least twice the word length of the proceedings material, to provide authors the opportunity to clearly distinguish between the two reports in terms of their scientific scope and significance.
If any version or part of the submitted manuscript has been published in conference proceedings, the authors must disclose this in their submission and provide a complete reference to the publication in their manuscript. Authors must also upload the published proceedings material as supplementary material with their submission.
Authors are free to submit certain types of Supplemental Material for online-only publication. If the manuscript is accepted for publication, such material will be published online on the publisher’s Web site, linked to the article.
Psychological Science allows for the online publication of two types of supplemental online material, reviewed (SOM-R) and unreviewed (SOM-U). SOM-R includes material that has undergone both an initial review (by two members of the editorial team) and an extended review (by two or more external referees). SOM-U includes unreviewed material, or information that has not been vetted by either the editors or the external referees. Neither type of supplemental material will be copyedited or formatted; it will be posted online exactly as submitted.
The editorial team takes the adjective supplemental seriously. Both SOM-R and SOM-U should include the sort of material that enhances the reader’s understanding of an article but is not essential for understanding the article.
More about SOM-R and SOM-U:
One intuitive way to understand the SOM-R/SOM-U distinction is that SOM-R is the kind of information that you might write in a rebuttal letter to reviewers who want to see more explanation of methods or supplemental analyses, whereas SOM-U is the kind of information that you might post on your own lab’s Web site to make available background information or provide stimuli.
In SOM-R, authors may wish to provide more details on their methods and procedures—details of particular interest to specialists in the area (e.g., event-related potential measurement: Picton et al., 2000; functional MRI: Poldrack et al., 2008; structural equation modeling: Raykov, Tomer, & Nesselroade, 1991); to readers concerned with the reliability, generality, and robustness of the results (e.g., Simmons, Nelson, & Simonsohn, 2011); or to researchers who might endeavor to replicate the results for themselves. If authors have carried out conceptual or methodological replications of their own, they may wish to summarize such complementary studies in SOM-R. Given that Psychological Science places a premium on innovation and discovery, evidence that attests to the replicability of the principal results is highly valued by editors, reviewers, and readers alike. SOM-R is generally limited to 1,000 words (including text, notes, and captions for tables or figures), 10 references, and three tables or figures (combined); requests to exceed these limits must be approved in advance by the Editor-in-Chief.
Common examples of SOM-U include research stimuli, audio or video recordings, and ancillary citations; for example, authors who have reached the allowable limit of references for their type of publication (General Article, Short Report, etc.) may wish to cite additional sources as Recommended Readings within the SOM-U. References to classic as well as contemporary contributions in the relevant research area are especially desirable, as they provide an easy and effective way for authors to stitch their work into the broader tapestry of the field.
To reiterate a key point made earlier (see Article Types), “SOM-R is the preferred location for methodological minutiae and fine-grained details on the Results—the sorts of information that only ‘insiders’ would relish and require for purposes of replication.” Authors are strongly advised to take full advantage of SOM-R for this purpose, as it allows the editors to make efficient use of their limited annual allotment of printed pages.
If you intend to upload SOM-R or SOM-U, please read the Guidelines for Publication of Supplemental Online Material, which describes conventions for naming files and for citing Supplemental Materials in the manuscript. SOM-R and SOM-U files should be uploaded when the manuscript proper is submitted.
English Language Help
Authors who would like to refine the use of English in their manuscripts might consider using the services of a professional English-language editing company. A listing of some of these companies follows. Please be aware that the journal makes no endorsement of any of these companies. An author’s use of these services in no way guarantees that his or her submission will ultimately be accepted. Any arrangement an author enters into will be exclusively between the author and the particular company, and any costs incurred are the sole responsibility of the author.
- Academic Language Experts
- American Journal Experts
- ATECS – Text Editing
- Charlesworth Group
- Clark Scientific Editing
- Dragonfly Freelance Writing and Editing Services
- SAGE Language Services
- SPI Global Professional Editing Services
- Cambridge Proofreading
Contributors are encouraged to consult the Contributor FAQ before submitting manuscripts to Psychological Science.
Manuscripts accepted for publication are eligible to earn one or more Open Science Badges. Badges are awarded for promoting openness in science by sharing data and/or materials and by making study methods, procedures, and analyses more transparent. They are not a means, direct or indirect, of asserting that publications with one or more badges are of higher quality than publications without them.
The following badges are available:
- Open Data badge for making publicly available the digitally shareable data necessary to reproduce the reported results. This includes annotated copies of the code or syntax used for all exploratory and principal analyses. If the data contain sensitive, personal information, a Protected Access notation will be added to the badge if the authors post their data according to the guidelines regarding protected-access repositories (see below for more information about this notation).
- Open Materials badge for making publicly available the digitally shareable materials/methods necessary to reproduce the reported results. Authors are also encouraged to make publicly available video recordings of their study procedures; in return, a Visualized Methods notation will be added to the authors’ Open Materials badge (see below for more information about this notation).
- Preregistered badge for having a preregistered design and analysis plan for the reported research and reporting results according to that plan. An analysis plan includes specification of the variables and the analyses that will be conducted. Please note that “Preregistration” does not require all analyses to be confirmatory (planned in advance); it merely requires investigators to state up front which analyses are confirmatory and which are exploratory. High-quality exploratory research is more than welcome in the pages of Psychological Science, as long as authors explicitly acknowledge that the analyses are exploratory and, when relevant, acknowledge potential constraints on the replicability of these findings. If the analysis plan was registered prior to observation of outcomes, the Open Practices note will include the notation DE (Data Exist). If there were strongly justified changes to an analysis plan, the Open Practices note will include the notation TC (Transparent Changes). Authors who have additional unreported registrations or unreported analyses without strong justification (as determined by the Editor-in-Chief) will not qualify for a badge. Please see former Editor-in-Chief Steve Lindsay’s statement on preregistration for more information.
To apply for one or more of these badges acknowledging open practices, authors must provide the information requested in the Open Practices Disclosure form, which is sent to all authors of accepted manuscripts. Unless the authors decide not to apply for badges, the form will be published with the article as supplemental online material.
Badge icons will be displayed at the beginning of the article, and information related to open practices will be published in a note titled “Open Practices” that will appear at the end of the article. Badges are awarded following the disclosure method, in which authors provide public statements affirming achievement of badge criteria.
The criteria for earning badges and the process by which they are awarded, along with answers to frequently asked questions, are described in the Open Science Framework wiki. Please see former Editor-in-Chief Eric Eich’s Observer interview for more information.
More about the Protected Access notation (Open Data):
The Protected Access notation may be added to Open Data badges if sensitive, personal data are available from an approved protected-access repository. Such repositories manage access to the data, granting it only to qualified researchers who complete a documented process that the repository publicly describes. This notation is not available to researchers who state only that they will make “data available upon request,” and it is not available if requests for data are evaluated on any criteria beyond compliance with proper handling of sensitive data. Please view the Approved Protected Access Repositories list for more information.
More about the Visualized Methods notation (Open Materials):
Even with uncapped Method sections in Research Articles and Research Reports (see Table 1), there is only so much an author can convey through words; researchers who want to follow up on someone’s paper might benefit by seeing how things were actually done. Videos of study procedures could also serve as valuable teaching tools for psychology students, undergraduate and graduate alike.
Making such videos publicly available should facilitate obtaining an Open Materials badge, but would not necessarily be sufficient to earn one. For instance, in the case of a study with a computer-delivered behavioral task, the script would need to be available in order for another researcher to reproduce the procedure; a video of someone completing the computer task would not be sufficient. However, the video would make it a good deal easier for some kinds of studies to earn the Open Materials badge, such as those that entail a social interaction of some kind.
By awarding an Open Materials badge with Visualized Methods notation, Psychological Science aims to promote open behavior and to recognize that a video can be very useful for certain manuscripts. The journal does not mean to imply, however, that a manuscript without a video is somehow inferior to one with a video; indeed, a video may not make sense in all cases.
Journal staff will contact the corresponding authors of accepted manuscripts with details on the badge-awarding process.
Embargo Policy and Media Relations
Psychological Science does not impose media embargoes. In accordance with our mission of sharing the science with the public, APS may in some cases publicly disseminate information about the content of accepted articles before they are actually published in the journal. Authors are free to disseminate to colleagues and media outlets information about a forthcoming article that they have contributed to Psychological Science as soon as the manuscript has been accepted and they have completed the Contributor Publishing Agreement form. Media or press-office inquiries should be directed to Leah Thayer, Senior Director of Communications, at email@example.com.
Contributor Publishing Agreement
In order for SAGE to proceed with publication of your article, you must complete a Contributor Publishing Agreement online. You can find this form in your Author Center at http://mc.manuscriptcentral.com/psci. Within your Author Dashboard is the “Manuscripts with Decisions” queue, where you will be able to access the “Contributor Form” link within the “Action” column for your accepted manuscript. Please note that without a completed agreement, we are unable to proceed with publication of your article.
If your accepted manuscript contains third-party material requiring permission, please forward all permission agreements to the editorial office (firstname.lastname@example.org) within 5 days of signing the Contributor Publishing Agreement.
If a figure or video includes an image of a person, the authors must obtain a signed Audio/Visual Likeness Release Form from each person appearing in the figure or video before the article can be published. This is also true for photographs or video of celebrities. Please contact the editorial office (email@example.com) if you have any questions.
StatCheck is an R program that is designed to detect inconsistencies between different components of inferential statistics (e.g., t value, df, and p). StatCheck is not designed to detect fraud, but rather to catch typographical errors (see https://mbnuijten.com/statcheck/ for more about StatCheck). Authors of accepted manuscripts must also provide a StatCheck report run on the accepted version of the manuscript that indicates a clean (i.e., error-free) result. A web app version of StatCheck can be accessed at http://statcheck.io/. If StatCheck does detect errors in the accepted version of the manuscript, authors should contact the action editor directly to determine the best course of action.
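The core of such a check is straightforward to illustrate: recompute the p value from the reported test statistic and degrees of freedom, and compare it with the reported p. The sketch below is a simplified illustration of the idea for a two-tailed t test, not the StatCheck implementation; the tolerance and the example values are assumptions for demonstration:

```python
# Simplified illustration of a StatCheck-style consistency check:
# recompute the two-tailed p value from a reported t value and df,
# and compare it with the reported p, allowing for rounding.
# This is NOT the StatCheck implementation, just the underlying idea.
from scipy.stats import t as t_dist

def t_test_p_consistent(t_value, df, reported_p, tol=0.001):
    """Return True if the reported p matches the recomputed p
    within a small tolerance (reported p assumed rounded)."""
    recomputed = 2 * t_dist.sf(abs(t_value), df)
    return abs(recomputed - reported_p) <= tol

# Hypothetical reported statistics, e.g., "t(28) = 2.20, p = .036":
print(t_test_p_consistent(2.20, 28, 0.036))  # consistent report
print(t_test_p_consistent(2.20, 28, 0.010))  # inconsistent report
```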
Authors of accepted manuscripts must be prepared to provide production-quality figure files to editorial office staff. This typically means high-resolution (at least 300 dots per inch, or DPI) JPEG files for image elements and editable files for graphs or other line drawings. Please see our Figure Format and Style Guidelines for more information.
Please note that the official acceptance date will reflect the day our editorial office has all the files necessary to begin the production process—including the Contributor Agreement, any permissions documentation, and production-quality figure files—rather than the date the acceptance letter was sent to authors.
A member of APS’s production team will contact you regarding copyediting of your manuscript. Please note that copyeditors edit accepted articles—often extensively—so that they will be clear and accessible to all readers of Psychological Science.
Funder Mandates/Open Access
The APS journals offer both green and gold open-access options that enable authors to comply with mandates from funders such as the National Institutes of Health, Wellcome Trust, and RCUK.
APS and SAGE can help fulfill many funders’ mandates to archive your accepted manuscript by making your article open access and depositing your manuscript files in PubMed Central. NIH-funded manuscripts submitted to Psychological Science after September 1, 2013, will be deposited into PubMed Central upon acceptance for publication as long as the authors indicate the funding during the submission process. Authors who wish to pay to make an article/manuscript publicly available immediately upon publication in order to comply with NIH or similar requirements may use the SAGE Choice option (gold open access).
Note that authors who do not choose to participate in SAGE Choice must choose a 12-month embargo for manuscripts submitted to PubMed Central.
For more information on open-access options and compliance at SAGE, including author self-archiving deposits (green open access) or SAGE Choice (gold open access), visit SAGE Publishing Policies on the Journal Author Gateway.
As an author, you may do the following without seeking permission:
- Distribute photocopies of the published article for teaching purposes or to research colleagues on a noncommercial basis.
- Circulate or post the original manuscript submission (i.e., the pre-peer-review version) or an abstract of the article on any repository or Web site.
- Post the accepted (post-peer-review) version of the manuscript on your own personal Web site, your department’s Web site, or the repository of your institution.
At any time after publication, you may use the final published version of the article in a book you write or edit without seeking permission.
One year after publication, you may also
- Post the accepted version of the article in any repository or Web site not listed above.
You may not post the final published article on a Web site or in a repository without permission from SAGE.
When posting or reusing the article, please provide a link to the appropriate DOI for the published version.
Please note that the SAGE-created PDF of the final published article may not be posted elsewhere at any time.
For any use not detailed above, please contact SAGE at firstname.lastname@example.org. Please forward to SAGE all inquiries and requests received from third parties for permissions, reprint rights, subsidiary rights licenses, and all other use and licensing of the article.
If you discover an error in your published article, please email email@example.com immediately. The journal’s managing editor will work with you and the Editor-in-Chief to determine whether a correction should be made and what form it should take. An erratum corrects an error made by APS or the publisher; a corrigendum corrects an error made by the author(s).
A correction notice will be published if an error affects the publication record, the scientific integrity of the article, or the reputation of the authors or the journal. In general, Psychological Science will not publish a formal correction for spelling or grammatical errors or for errors that do not significantly affect an article’s findings or conclusions or a reader’s understanding.
If a correction notice is published, a new, corrected version of the article will be posted online unless there is no obvious corrected version to replace the original. For example, if the purpose of the correction is to acknowledge work that was not cited but perhaps should have been, there will be a correction notice but no new online version.
OnlineFirst Publication and TWiPS
All accepted manuscripts are published online (OnlineFirst) as soon as they reach their final copyedited, typeset, and corrected form. Each accepted article also appears in a monthly print issue of Psychological Science and in the digital This Week in Psychological Science (TWiPS), which is distributed weekly to all APS members.
After you have submitted your Contributor Publishing Agreement, you may be contacted by an editorial assistant requesting the proper files needed for production. (If you do not receive a message from the editorial office after you submit the Contributor Publishing Agreement, it means you have already provided all the files we need for production – thanks!) Once your article has entered production, you can expect to hear from a copyeditor within approximately 4 weeks. Every manuscript receives a thorough, substantive edit, and the manuscript is returned to the corresponding author for review before it is typeset. After the copyediting process is complete, your article should be published online within approximately 3 weeks, depending on how promptly proof corrections are returned.