APS: Leading the Way in Replication and Open Science
The Association for Psychological Science (APS) promotes replication and open science practices as part of a broader effort to strengthen research methods and practices across all areas of psychological science. The Association’s efforts date back a number of years, to a time when few other organizations were addressing these issues, and APS pioneered several innovations that have since been widely adopted. Below are just some of APS’s initiatives in these areas.
Journal Policies and Programs
- APS Open Practice Badge Program. Recognizing that research transparency is a core component of improving reproducibility, APS awards graphical “badges” on published articles to authors who make their data or materials openly accessible or who preregister their studies (i.e., prespecify the study method before the study is conducted). Scholars have since noted that the introduction of the Open Practice Badges has been linked with increased rates of data and materials sharing (see Kidwell et al., 2016; Giofrè, Cumming, Fresc, Boedker, & Tressoldi, 2017).
- Advances in Methods and Practices in Psychological Science (AMPPS). APS’s newest journal, AMPPS, provides a home for reporting innovative developments in research methods, practices, and conduct, and will also house other articles related to replicability. The first issue of AMPPS is slated for early 2018.
- Registered Replication Reports (RRRs). APS publishes RRRs, which are highly powered, multilab attempts to replicate central findings in psychology. These projects bring the original study authors together with replication teams to obtain precise estimates of the effect sizes of phenomena of interest.
- Preregistered Direct Replications. APS publishes Preregistered Direct Replications, which are replications of studies published previously in Psychological Science. These aim to follow the same methods and procedures as the original study.
- Psychological Science Statistical Advisers. The editor of our journal Psychological Science, D. Stephen Lindsay, has six Statistical Advisers on his editorial team who provide additional expertise in reviewing manuscripts that employ sophisticated statistical or methodological techniques.
- StatCheck program for accepted articles. Psychological Science editorial staff use the R program statcheck—similar to word processor spellcheck software—to help catch errors in statistical reporting in accepted manuscripts; a sketch of the kind of consistency check involved appears after this list.
- Transparency and Openness Promotion (TOP) Guidelines. APS is an original signatory to the TOP Guidelines, a document for scientific publishers and journals that encourages consideration of factors believed to relate to replicability, including citation standards; data, materials, and code availability; and preregistration of studies and analysis plans.
- San Francisco Declaration on Research Assessment (DORA). APS is an early signatory of this worldwide initiative involving editors and publishers of scholarly journals. DORA, initiated by the American Society for Cell Biology in 2012, recognizes the need to improve the ways in which scientific research is evaluated by funding agencies, academic institutions, and other entities.
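For readers curious about what this kind of automated check involves, below is a minimal Python sketch. It is not statcheck itself (statcheck is an R package); it only illustrates the general idea of recomputing a p-value from a reported test statistic and degrees of freedom and comparing it with the reported value. The function name, parsing pattern, and tolerance are illustrative assumptions, not part of any APS tool.

```python
# Illustrative sketch only: NOT statcheck (an R package), just the general idea
# of checking whether a reported p-value is consistent with its test statistic.
import re
from scipy import stats

def check_t_report(report: str, tolerance: float = 0.0005) -> bool:
    """Return True if a report like 't(28) = 2.20, p = .036' has a p-value
    consistent (two-tailed) with its t statistic and degrees of freedom."""
    match = re.search(r"t\((\d+)\)\s*=\s*(-?\d+\.?\d*),\s*p\s*=\s*(\.\d+)", report)
    if not match:
        raise ValueError("Could not parse a t-test report from the text.")
    df = int(match.group(1))
    t_value = float(match.group(2))
    reported_p = float(match.group(3))
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)  # two-tailed p from |t| and df
    return abs(recomputed_p - reported_p) < tolerance

print(check_t_report("t(28) = 2.20, p = .036"))  # True: reported p matches the statistic
print(check_t_report("t(28) = 2.20, p = .010"))  # False: inconsistent reporting
```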
Journal Editorials & Other Articles
- “False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant.” This popular Psychological Science paper, a citation classic, showed that common scientific practices potentially increase the likelihood of false positives in research.
- “Business Not as Usual.” In this editorial, past Psychological Science Editor Eric Eich discusses introducing a series of initiatives dedicated to strengthening replicability. These initiatives include stronger requirements for the reporting of methods in Psychological Science and expanded use of statistical techniques to assess effect sizes and confidence intervals and to apply meta-analysis in papers.
- “Replication in Psychological Science.” In this editorial, Psychological Science Editor D. Stephen Lindsay highlights four issues that are important to improving replicability in psychological science, including low statistical power, practices that inflate the false positive rate, and incorrect interpretation of correlations.
- “Sharing Data and Materials in Psychological Science.” In this editorial, Psychological Science Editor D. Stephen Lindsay discusses important changes at the journal designed to increase the frequency and ease with which editors and reviewers of submissions to Psychological Science can access data and materials.
- Special sections on replication in Perspectives on Psychological Science. This journal often features special sections on what researchers can do to improve replicability in their own work. Similar sections will soon be found in AMPPS.
Magazine Articles and Editorials
- “Report Demonstrates Need for Improved Reproducibility in Psychological Science.” APS covers the results of the Reproducibility Project: Psychology, which showed that the replicability of some psychological science findings was lower than anticipated.
- “Robust Science Depends on Understanding the Science of Humans.” Howard Nusbaum, Director of the National Science Foundation’s Division of Behavioral and Cognitive Sciences, discusses replicability from his vantage point at the agency.
- “APS and Open Science: Music to Our Ears.” APS Executive Director Emeritus Alan Kraut’s column in our Observer membership magazine shares the narrative of APS’s central involvement in discussions about improving replicability since 2003.
- “Preregistration, Replication, and Nonexperimental Studies.” APS Past President Susan Goldin-Meadow addresses scientists’ concerns about the risks of marginalizing studies that don’t fit well with preregistration protocols.
- “Seven Selfish Reasons for Preregistration.” An illustrated guide to the career benefits of submitting research plans before beginning data collection.
- “Research Preregistration 101.” APS journal editors explain the rationale for and benefits of preregistration.
- “Powerful Tools for Designing Powerful Studies.” Psychological scientists offer open-source tools to help researchers ensure their studies are adequately powered.
Convention Programs & Workshops
- “The New Statistics: Estimation and Research Integrity.” This six-part video series, recorded at the 2014 APS Annual Convention, features quantitative psychologist Geoff Cumming discussing the value of assessing effect sizes and confidence intervals in the analysis process.
- “Improving the Reproducibility of Our Research Practices.” This six-part video series, recorded at the 2016 APS Annual Convention, features Brian Nosek and Courtney Soderberg discussing laboratory and personal research practices that improve the reproducibility of research.
- Symposia and Workshops at Annual and International Conventions. APS frequently hosts presentations on replicability at its academic meetings. Please visit our Conventions page to learn more.
- APS Convention Presentation Repository. Share your Convention talk or poster materials with other scientists and the public through the Open Science Framework (OSF).
Congressional and Federal Activities
APS has worked with Congress to increase federal agency support for replication and reproducibility. Thanks to the efforts of APS and other organizations, comments on replicability have appeared in the report language of many Congressional appropriations reports, including the following selected passages:
- House FY 2017 Labor-HHS Appropriations Report (House Rpt. 114-600): “Reproducibility of Scientific Methods.—The Committee requests an update on the progress made and the plan for additional activities in the fiscal year 2018 budget request.”
- House FY 2016 Labor-HHS Appropriations Report (House Rpt. 114-195): Reproducibility of Scientific Methods.—The Committee notes that the gold standard of good science is the ability of a lab to reproduce a method and finding and therefore continues to be concerned with reports that some published biomedical research cannot be easily reproduced. The Committee expects NIH to continue to stress the importance of experimental rigor and transparency of reporting of research findings in order to enhance the ability of others to replicate them.
- Senate FY 2015 Commerce-Justice-Science Report (House Rpt. 113-448): Replication of scientific research.—The Committee concurs in the view that the gold standard of good science is the ability of a research lab to reproduce a method and finding, and shares the growing concern that a significant amount of recent research cannot be easily reproduced. The Committee is therefore pleased that NSF recently convened a comprehensive workshop on “Robust Research,”[1] which included representatives of NSF, NIH, OSTP and non-governmental scientific organizations and individual experts, to discuss the magnitude of the issue of replicability and to explore solutions to promote rigor and transparency in research.
- House FY 2015 Omnibus (House Rpt. 113-483): Reproducibility of Research Results.—The agreement expects NIH to stress the importance of experimental rigor and transparency of reporting of research findings in order to enhance the ability of others to replicate them. The agreement concurs in the view that the gold standard of good science is the ability of a lab to reproduce a method and finding and is therefore concerned with reports that so much published biomedical research cannot be easily reproduced. The agreement expects that NIH will develop incentives for scientists to undertake confirmation studies, best practice guidelines that would facilitate the conduct of replicable research and guidelines to encourage research transparency in the reporting of methods and findings. In addition, the agreement expects an NIH-wide policy and trans-NIH oversight to address the replication concerns. The agreement requests an update in the fiscal year 2016 budget request on the activities NIH has on-going toward this effort, the annual measure and amount of resources spent or estimated each year toward this effort.
- Senate FY 2013 Labor-HHS Appropriations Report (Senate Rpt. 112-176): False Positives and Replications.—The Committee supports NIH’s effort to develop a consensus on the issues of false-positive research results. This effort will encourage policies on the publishing of replications (and nonreplications) of previous research and advance scientific knowledge.
[1] The report referenced is “Social, Behavioral, and Economic Sciences Perspectives on Robust and Reliable Science.” APS members Kenneth Bollen and John Cacioppo, along with others, led this pivotal report to the National Science Foundation (NSF), which helped set an agenda for ways in which NSF can help improve replicability.
Comments
I think the obsession with preregistering studies and conducting huge replication projects of single studies is seriously misguided. Science is basically inductive, and forcing deductive hypothesizing simply shortcuts the process. Replicating a single study 100 times is not useful. Replication with variation, based on many studies, is needed to see whether a single study is idiosyncratic to its initial design.