Optimizing Psychological Science’s Impact on Public Health
Researchers can spend decades building evidence to support behavioral treatments without ever seeing their work produce a noticeable benefit to public health. The Institute of Medicine refers to this gap as the “Quality Chasm.” The field of dissemination and implementation (D&I) science emerged to bridge this chasm, with the express purpose of translating discoveries from lab to practice. According to the National Institutes of Health (NIH), dissemination research is the study of the targeted distribution and packaging of intervention materials, whereas implementation research is the study of strategies used to integrate evidence-based practices into community settings to improve patient outcomes. Although D&I has existed as a field for fewer than 20 years, it has a dedicated journal (Implementation Science), a standing cross-cutting NIH study section (Dissemination & Implementation Research in Health), a twice-renewed NIH program announcement (R01, R34, and R21 mechanisms) with support from 18 institutes, and at least 42 initiatives devoted to advancing the field at the regional, national, and international levels.
The Society for Implementation Research Collaboration (SIRC) is one such initiative. SIRC grew out of a biennial conference series that began in 2010 with $150,000 from the National Institute of Mental Health (NIMH). Recognizing that D&I studies were proliferating in silos, SIRC’s founders aimed to bring scientists and practitioners together to develop a research agenda centered on common measures, methods, and research principles. In doing so, they hoped to improve both the frequency and quality of evaluations of evidence-based practice implementation. SIRC became a formal society in 2015 and now has more than 480 members from the United States, Canada, Great Britain, Ireland, Australia, Austria, Portugal, South Africa, Zimbabwe, Chile, Denmark, Norway, India, Jamaica, Kenya, the Netherlands, Pakistan, Sweden, and Singapore. The work being done by D&I researchers today has the potential to improve the lives of many, but, like any new field, D&I faces particular challenges that must be tackled.
Improving Measurement
Psychological scientists have long evaluated intervention fidelity, but the field has devoted less attention to other implementation outcomes such as acceptability, feasibility, appropriateness (i.e., compatibility with the given setting, stakeholders, or problem), adoption (i.e., a commitment or willingness to use a new practice), penetration (i.e., the extent to which a practice is integrated within a service setting), cost, and sustainability. The result is a shortage of adequate measurement tools: several outcomes have no measures at all, others have only unvalidated tools, and the measures that do exist are difficult to locate. With NIMH funding, several SIRC officers created a repository of more than 400 measures of constructs relevant to implementation, as well as three new measures of implementation outcomes. Spurred by the belief that measures ought to inform rigorous research evaluation and guide practical implementation, this initiative also generated the Psychometric and Pragmatic Evidence Rating Scale (PAPERS), which helps users assess the quality of existing measures and develop new ones.
Growing the Workforce
To sustain our burgeoning field, we need to expand the roster of scientists with D&I training. A review of D&I training opportunities revealed that only 26 psychological scientists working in clinical psychology programs are potentially contributing to D&I training at the predoctoral level. And a recent social network analysis identified 20 D&I leaders who serve as sources of advice or who connect researchers in the field; only five are psychological scientists, and only one works in a psychology department (and thus can train future psychological scientists). The other leaders hold primary appointments in psychiatry, public health, or family medicine, or at research institutes affiliated with large health systems.
In response to this shortage, the Delaware Project was established in 2011 to help integrate D&I within a broader stage model, receiving joint sponsorship from the Academy of Psychological Clinical Science, NIMH, the National Institute on Drug Abuse, and the Office of Behavioral and Social Sciences Research. Given that many psychology departments do not house a D&I scientist, the Delaware Project works to accumulate resources, such as syllabi and lectures, and make them publicly accessible. To foster training across institutions, SIRC also offers a mentoring program composed of three tiers: students, new investigators, and established investigators. Each of the lower tiers receives one-on-one mentoring from the tier above, and the program matches mentors and mentees according to their research interests. The mentoring relationships focus on each mentee’s specific needs, whether it’s support with career development, grant writing, manuscript development, or other activities.
Balancing Study Design
In designing D&I studies, researchers must decide how to test evidence-based practices in real-world settings that present a variety of limitations. These decisions can be challenging, requiring researchers to balance concerns about internal and external validity. To facilitate the process, SIRC supports researchers and practitioners through conference-based and online workshops. These structured workshops provide a forum for presenters to pitch projects in development, such as grant proposals or implementation practice projects, and receive feedback from members of SIRC’s network of experts. A mixed-methods evaluation of past workshops indicated that participants were highly satisfied and that presenters often went on to secure external funding, frequently from NIH (35.3% of projects were funded, 41.2% were not, and 23.5% planned to resubmit).
Establishing Mechanisms
Although at least 61 models are available to guide D&I studies, the field sorely needs theory to guide evaluation. Two systematic reviews examining implementation mechanisms identified 31 studies across seven countries, none of which empirically established a mechanism of change. Without theory, implementation targets and mechanisms have largely gone unarticulated, leading to a proliferation of multifaceted strategies that seem to take a “kitchen sink” approach. As a result, implementation strategies become increasingly complex and costly but not necessarily more effective with respect to the outcomes of interest. Focusing on components known to operate through established mechanisms would allow implementers to streamline their strategies. SIRC continues to promote the study of implementation mechanisms to expedite progress in this critical area.
Connecting Stakeholders
The D&I work being done today cannot come to fruition if the stakeholders who study, carry out, and are affected by program implementation are not talking to one another. The activities mentioned above purposefully include all potential stakeholders with the aim of reducing the gaps between them. At the 2017 conference, SIRC invited policy makers, intermediaries, and practitioners to help guide the society in addressing this growing divide. Going forward, SIRC’s conference will be cochaired by a researcher and a practitioner to ensure that the practical implications of research are clearly articulated. And we will continue to identify strategies for communicating the latest in implementation science to on-the-ground practitioners, to ensure that research informs practice and practice informs research.
SIRC is also developing a new interdisciplinary journal focused on behavioral health implementation. Tentatively titled Behavioral Health Implementation Research, the journal will invite manuscripts that feature a setting, outcome, or practice relevant to behavioral health. For instance, we welcome behavioral health implementation research across a wide spectrum of clinical and service settings, including specialty mental health, medicine, criminal justice, education, integrated care, and social services. We view behavioral health outcomes as including, but not limited to, mental health, substance use disorders, and social and role functioning, as well as comorbid chronic diseases. We are interested in behavioral health practices, which are typically complex, multicomponent psychosocial interventions. The Society’s journal steering committee is dedicated to ensuring that the journal is governed by, contributed to, and consumed by both researchers and practice partners.
Bringing Psychological Science to the Table
SIRC’s rapid growth parallels that of the larger D&I field, driven by the need to achieve a substantial return on taxpayer-funded research. The demand for D&I research from federal and foundation funders is strong and unlikely to go away. Given that the focus of this work is behavior change, psychological science is foundational to much of D&I’s knowledge and methods. Yet psychological science arguably does not have a proportional seat at the table despite an open invitation. Treatment developers continue to build efficacious interventions that could never “live” in their intended settings because the developers fail to consider constraints at various levels, including those of the patient, provider, organization, system, and policy. Without D&I on the radar of psychological scientists, we will likely continue to see a poor return on investment.
We invite you to join the conversation.
References and Further Reading
Atkins, M. S., Strauman, T. J., Cyranowski, J. M., & Kolden, G. G. (2014). Reconceptualizing internship training within the evolving clinical science training model. Clinical Psychological Science, 2, 46–57.
Bower, J. E., Crosswell, A. D., & Slavich, G. M. (2014). Childhood adversity and cumulative life stress: Risk factors for cancer-related fatigue. Clinical Psychological Science, 2, 108–115.
Darnell, D., Dorsey, C. N., Melvin, A., Chi, J., Lyon, A. R., & Lewis, C. C. (2017). A content analysis of dissemination and implementation science resource initiatives: What types of resources do they offer to advance the field? Implementation Science, 12, 137.
Institute of Medicine (IOM). (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academy Press.
Kazdin, A. E. (2014a). Clinical psychological science editorial: Elaboration of the publication domain and priorities. Clinical Psychological Science, 2, 3–5.
Kazdin, A. E. (2014b). Special series introduction: Reenvisioning clinical science training. Clinical Psychological Science, 2, 6–7.
Levenson, R. W. (2014). The future of clinical science training: New challenges and opportunities. Clinical Psychological Science, 2, 35–45.
Lewis, C. C., Boyd, M. R., Walsh-Bailey, C., Lyon, A. R., Beidas, R. S., Mittman, B., et al. (in preparation). A systematic review of empirical studies examining mechanisms of dissemination and implementation in health.
Lewis, C. C., Darnell, D., Kerns, S., Monroe-DeVita, M., Landes, S. J., Lyon, A. R., … Dorsey, C. (2016). Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: Advancing efficient methodologies through community partnerships and team science. Implementation Science, 11, 85.
Lewis, C. C., Mettert, K., Dorsey, C., Martinez, R., Weiner, B. J., Nolen, E., et al. (under review). An updated protocol for a systematic review of implementation-related measures. Systematic Reviews.
Lewis, C. C., Stanick, C. F., Martinez, R. G., Weiner, B. J., Kim, M., Barwick, M., & Comtois, K. A. (2015). The Society for Implementation Research Collaboration Instrument Review Project: A methodology to promote rigorous evaluation. Implementation Science, 10, 2.
Lewis, C. C., Weiner, B. J., Stanick, C., & Fischer, S. M. (2015). Advancing implementation science through measure development and evaluation: A study protocol. Implementation Science, 10, 102.
Lobb, R., & Colditz, G. A. (2013). Implementation science and its application to population health. Annual Review of Public Health, 34, 235–251.
Lyon, A. R., Comtois, K. A., Kerns, S. E. U., Landes, S. J., & Lewis, C. C. (in press). Closing the science–practice gap in implementation before it widens. In A. Shlonsky, R. Mildon, & B. Albers (Eds.), The Science of Implementation. New York, NY: Springer.
Lyon, A. R., & Koerner, K. (2016). User-centered design for psychosocial intervention development and implementation. Clinical Psychology: Science and Practice, 23, 180–200.
Marriott, B. R., Rodriguez, A. L., Landes, S. J., Lewis, C. C., & Comtois, K. A. (2016). A methodology for enhancing implementation science proposals: Comparison of face-to-face versus virtual workshops. Implementation Science, 11, 62.
National Institutes of Health. (2016). Dissemination and Implementation Research in Health (R01) PAR-16-238. Retrieved January 4, 2017, from https://grants.nih.gov/grants/guide/pa-files/PAR-16-238.html
National Institutes of Health, Academy of Psychological Clinical Science, SAGE, & University of Delaware. (2012). The Delaware Project: 10/11 conference. Retrieved December 11, 2017, from http://www.delawareproject.org/1011-conference/
Norton, W. E., Lungeanu, A., Chambers, D. A., & Contractor, N. (2017). Mapping the growing discipline of dissemination and implementation science in health. Scientometrics, 112, 1367–1390.
Onken, L. S., Carroll, K. M., Shoham, V., Cuthbert, B. N., & Riddle, M. (2014). Reenvisioning clinical science: Unifying the discipline to improve the public health. Clinical Psychological Science, 2, 22–34.
Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith, J. L., Matthieu, M. M., et al. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10, 21.
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., … Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38, 65–76.
Shoham, V., Rohrbaugh, M. J., Onken, L. S., Cuthbert, B. N., Beveridge, R. M., & Fowles, T. R. (2014). Redefining clinical science training: Purpose and products of the Delaware Project. Clinical Psychological Science, 2, 8–21.
Tabak, R. G., Khoong, E. C., Chambers, D. A., & Brownson, R. C. (2012). Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine, 43, 337–350.
Weiner, B. J., Lewis, C. C., Stanick, C., Powell, B. J., Dorsey, C. N., Clary, A. S., … Halko, H. (2017). Psychometric assessment of three newly developed implementation outcome measures. Implementation Science, 12, 108.
Weisz, J. R., Ng, M. Y., & Bearman, S. K. (2014). Odd couple? Reenvisioning the relation between science and practice in the dissemination-implementation era. Clinical Psychological Science, 2, 58–74.
Westfall, J. M., Mold, J., & Fagnan, L. (2007). Practice-based research—“blue highways” on the NIH roadmap. JAMA, 297, 403–406.
Williams, N. J. (2016). Multilevel mechanisms of implementation strategies in mental health: Integrating theory, research, and practice. Administration and Policy in Mental Health and Mental Health Services Research, 43, 783–798.