From Bench to Trench: NCI Building Science-at-the-Ready Solutions
One hundred fifty participants, evenly distributed among researchers, health care practitioners, and representatives of public and private funders and nonprofit public policy organizations, gathered last fall in Washington, D.C., for a “Designing for Dissemination” conference sponsored by the National Cancer Institute (NCI), the Center for the Advancement of Health, and the Robert Wood Johnson Foundation. The following is excerpted from the conference report:
“The findings and recommendations from this conference should be viewed as the first steps in a journey that will require the continued commitment of all involved to take immediate and long-term actions based on the conceptual framework presented. …

“(NCI) has recognized that closing the gap between research discovery and program delivery is both a complex challenge and an absolute necessity if we are to ensure that all populations benefit from the nation’s investments in new scientific discoveries. …

“It is increasingly clear that the continued investment in new discoveries in health promotion and cancer prevention and control, while absolutely necessary, is not sufficient to guarantee the adoption and implementation of evidence-based interventions to reduce the burden of cancer. This is of particular concern with respect to low-income, ethnically diverse, and otherwise under-served populations who, while bearing an unequal burden of cancer, often are slow to benefit from research discoveries. Because of this, our failure as a nation to ensure the rapid dissemination and quick implementation of evidence-based interventions has contributed to health disparities observed in cancer risk factors and cancer outcomes. …

“The dissemination of evidence-based interventions may be hindered by reliance on randomized clinical trial designs as a key determinant of eligibility for inclusion in meta-analyses. At times this has created a disconnect between practitioners, who point to interventions they ‘know’ are effective based on clinical practice, and researchers, who demand evidence that may not be forthcoming. Also, interventions demonstrated to be effective in controlled clinical trials may not be as effective in real world settings. …

“There must be science/technology push to prove or improve interventions for application. Interventions alone, however, are not sufficient without delivery capacity and market pull/demand. NCI has paid considerable attention to science/technology push but less attention to delivery capacity and market push. … Dissemination approaches that are likely to promote the wide adoption of individual health behavior and system changes … have yet to be systematically evaluated.”

Action steps that the participating researchers committed to:
Actions participants recommended that NCI take:

“The theme most commonly suggested for NCI was to join with other intermediaries and take responsibility for supporting a nationwide permanent, community-based infrastructure for supporting the implementation of research findings. …

“It was widely suggested that funding support is needed to foster dissemination. For example, NCI should issue RFAs on dissemination research, require research dissemination and diffusion in all applicable RFPs, and allocate resources for this component; ensure that funding is available to focus on dissemination, and not only the research aspects of it; and require and fund the dissemination of effective interventions in existing intervention studies. …

“In addition, NCI should continue and expand the dissemination supplement programs and fund supplements to small practitioner groups to subcontract with universities or consultants to provide data collection and instruction needed by researchers. Funds are also needed to train community-based practitioners for capacity building and learning tools. …

“The most critical and challenging suggestion for NCI was to train/educate NCI/NIH study review sections regarding how to evaluate dissemination research using criteria other than those used for randomized controlled trials. Training and support also should be provided to researchers and practitioners regarding how to disseminate and evaluate their research. …

“NCI was urged to provide a clear vision and a specific action plan for necessary stakeholder collaboration. Furthermore, NCI should provide more opportunities to develop a broader group of practitioners, researchers, and intermediaries exposed to this dissemination information. For example, involve practitioners and community partners in the research design stage, and promote researcher/practitioner partnership.”
A two-pack-a-day smoker walks into a clinic and complains that, despite repeated tries, he just can’t seem to quit. An aide collects his medical history and other information, then enters the data on an Internet site. Minutes later, when the patient goes in for his consultation, a printout of the best scientifically tested intervention, tailored to his case, is on his physician’s desk.
That science-at-the-ready scenario is, more or less, what a New Jersey health practitioner put to the National Cancer Institute’s Jon Kerner not long ago when Kerner explained his vision for NCI’s translational research efforts. The practitioner asked: “Is this what you are going to do?”
“We’re not there yet,” Kerner conceded, “but you’ve just described our goal.”
Kerner is Assistant Deputy Director for Research Dissemination and Diffusion (government shorthand: ADD for RDD) in the NCI’s Division of Cancer Control and Population Sciences (CCPS). In that role he developed and coordinates a program called TRIO – Translating Research into Improved Outcomes.
NCI has a long history of trying to accomplish just that, translating scientific findings into better real-world outcomes. The history is summed up in a report on a conference last year that tackled the issue head-on:
“Over the years, NCI and its relevant cancer control divisions have undertaken efforts to accelerate the dissemination of proven interventions. These have included the Community Based Cancer Control Programs in the 1970s, the Community Clinical Oncology Programs (CCOPs) in the 1980s, and Prescribe for Health in the 1990s. While we have learned modestly from each of these efforts and others, none has achieved all that was hoped.”
Not that NCI doesn’t fund studies that test real-world interventions on real-world patients. It supports a large infrastructure that includes 61 cancer centers that serve as research test-beds, treating patients within a research context, and a grants program called SPOREs, for Specialized Programs of Research Excellence. But moving those interventions from test-beds to practice in community cancer control clinics and other health-care settings is a different matter.
A BASIC DIFFERENCE
Along the continuum that extends from the laboratory to real-world treatments, NCI draws a sharp distinction between translating basic research into evidence-based interventions and translating – “disseminating” is the more accurate term here – that intervention research into everyday practice. The report of last year’s meeting described the semantic confusion this way: “One person’s cancer prevention and control intervention is another’s dissemination intervention and, like beauty, dissemination sometimes appears to be in the eye of the beholder.”
“Some people who are describing interventions to get people to change behavior think they are talking about dissemination, but in our framework, that is ‘intervention’ research,” explains CCPS Director Robert Croyle. “Dissemination research is determining how you get an effective intervention that is shown to change behavior adopted into the systems that pay for or directly deliver interventions.”
Construction of the translation bridge from basic to applied research is much further along than that between applied research and actual practice, but all agree that even in the former, much more translating needs to be done.
For example, Michael Stefanek, chief of the Basic Biobehavioral Research Branch within CCPS, says translating what we know about basic psychoneuroimmunology (PNI) mechanisms into knowledge about how PNI affects health has only just begun, largely because PNI “has been seen as somewhat on the fringe of ‘real’ medical science historically. The connections among the brain, emotions and the immune system have been viewed with some skepticism by biomedical scientists, likely due to the inclusion of ’emotions’ in this mix.”
He says some progress is evident, however. “Clearly, work over the past several years has been quite rigorous, and with the more recent exciting work involving the role of cytokines in health and morbidity, along with work linking mood to cytokine function, the door has opened a bit. We are excited about a new initiative we began last year.”
That started with a meeting, “Biological Mechanisms of Psychosocial Effects on Disease,” that assembled a multidisciplinary group to discuss cutting edge research and begin discussion of the relevance of PNI to cancer control. Several other NIH Institutes and Centers were involved: the National Institute of Mental Health, the National Institute of Arthritis and Musculoskeletal and Skin Diseases, the Office of Behavioral and Social Sciences Research, and the Office of Cancer Complementary and Alternative Medicine.
A second meeting is likely later this year and a website is being developed that will host a bibliography of all relevant human and animal PNI research. “We are committed to including immunologists, psychologists, oncologists, and other relevant disciplines in these meetings to advance this science,” says Stefanek.
Translational psychological research is needed in other areas as well, he says. “As one example, the research of those working in the realm of decision-making as it applies to cancer prevention, screening and treatment needs to be better informed by researchers in the basic psychological processes of decision-making. There is far too little ‘cross talk’ between investigators in these two often discrete research universes, and this is a case where work in the applied world can also inform what questions are asked at the laboratory level to ultimately impact patients and health care providers making very difficult clinical decisions.” CCPS has begun efforts with other NCI divisions to foster work in this area, he says.
Part of the problem in trying to close the gap between psychological and biomedical scientists, he says, is that behavioral scientists have not always been “willing and able to provide convincing data to biomedical scientists and practitioners that our work makes a difference in the real world setting.”
Making that difference is Kerner’s target. His mission includes building the “dissemination” bridge, the one between intervention research and adopted practice. He was recruited to NCI in 2000 after 13 years at Memorial Sloan-Kettering Cancer Center and seven at the Lombardi Cancer Center at Georgetown University. At both institutions he had focused his cancer control and prevention research on under-served populations – in Harlem, the South Bronx, Brooklyn and, later, Washington, D.C. “He is a key person and national leader in this effort,” says Croyle.
“We can keep pouring money into translational research until we’re blue in the face,” Kerner says, “but if it doesn’t get us better outcomes, it’s not being adopted in the world outside of research.”
“In the behavioral domain,” Croyle says, “we find it helpful to distinguish between, one, diffusion and dissemination of research evidence, and two, research on the diffusion and dissemination process. We do more of the former than the latter, but are trying to grow the latter.”
TRIO plays a big role in advancing that growth. As Kerner describes it, “We go from the bench to the trench. How do we take our surveillance data from public health surveys, for example, and use them to identify populations that are specifically in need or that are conducting the highest risk factor behaviors? Then, how do we track progress? And how do we use that data to motivate action, by populations but also by service delivery systems that impact those populations?”
CONSIDERABLE BARRIERS
That trek from bench to trench often seems more like an obstacle course than a bridge, because the barriers along the way are considerable. For one thing, no one seems quite sure how to do the disseminating.
“Many efforts are not informed by research evidence about how best to do this,” says Croyle. “Yes, there are many efforts at continuing education and so on, but many are ineffective and the evidence base about how best to do this is insufficient. One of the main mistakes that psychologists make is that they underestimate the importance of studying and understanding the systems within which their research is applied, such as health care systems, school systems, health maintenance organizations, and public health systems.”
“We don’t have a lot of dissemination research funded at the National Institutes of Health,” agrees Kerner. For the most part, he says, health care deliverers simply are “taking a lot of evidence and making up how to get it ‘out there.’ We did an evidence review of all the dissemination research that had been done on tobacco, diet, breast and cervical cancer screening, and cancer pain management. The bottom line was we found more systematic reviews of intervention evidence than original reports of dissemination studies. When that’s the case, you know you have a problem.”
The review was done in preparation for the “Designing for Dissemination” conference last September that brought together researchers, health practitioners and intermediaries such as public and private funders and nonprofit policy organizations. According to the conference report, “There was insufficient evidence to recommend any of the dissemination approaches reviewed. Further, evidence suggested that certain dissemination approaches showed little or no effectiveness. Thus, while much is known about cancer control approaches that work, very little is known about how best to disseminate these approaches so they may be widely implemented.”
Another barrier is the compartmentalization of scientific endeavor. “Historically, so little behavioral science has been incorporated into biomedical teachings,” Stefanek points out. “Behavioral scientists must be willing and able to provide convincing data to biomedical scientists and practitioners that our work makes a difference in the real world setting.”
In part it’s a matter of making interdisciplinary training more widely available, he says. “There does seem to be a strong push for more interdisciplinary training, and NCI and other institutes are clearly recognizing that behavioral science can indeed contribute to survival, morbidity and quality of life outcomes, and that the behavioral and cognitive sciences can be real players in answering critical questions regarding patient care.”
Perhaps the most fundamental gap to be bridged is the historic one between basic and applied research. “Many researchers may not consider the potential applications of their work as something they need to be concerned with,” says Croyle. “In psychology, there continues to be a gap between the basic and the applied, but I think APS is making good efforts to address this. Our division of NCI sits right in the middle of the divide between the basic and the applied, so we’re also making a great effort to address this.”
A related problem is finding cross-over scientists, those with knowledge and skills that can move back and forth across the boundary between basic and applied research. “The number of people who conduct research that cuts across the boundary is small,” says Croyle. “We fund many public health researchers who do applied work that is not informed by current behavioral science theory and methods. Similarly, it is a struggle to encourage more psychologists who are strong in theory and methods to consider doing research that has greater relevance and applicability to public health problems.”
Overlaying these stumbling blocks is practitioners’ skepticism about the value of the research. Says Kerner, “If those in the practice community don’t believe the research has relevance, and they may or may not be right, just the fact that they don’t believe it means we have a problem.” That problem, he says, must be resolved by opening channels of communication and collaboration.
“We need to bring researchers and practitioners together more often and more intensively so that they can inform each other’s work,” Croyle says. “Clinical practice can and should be evidence-based, but research needs to address the relevant, high-priority questions that clinicians have.”
CONCEPT MAPPING
That is why NCI joined with the Center for the Advancement of Health and the Robert Wood Johnson Foundation to host last year’s conference. (See report excerpts above.) It produced an array of recommendations for action, but in the process the participants also discovered how different were the worlds in which they live.
They used “Concept Mapping” to display those differences graphically. In advance of the conference, those invited to attend were asked to identify “one thing that should be done to accelerate the adoption of cancer control research discoveries.” Their more than 200 answers were aggregated into 98 ideas, which were then grouped into 12 categories. Each participant then ranked the ideas by importance and feasibility, and the 12 categories were plotted on a graph – a dozen islands of different sizes and shapes.
The differences were stark. “Correlation between researchers and practitioners was .05,” says Kerner. “Almost zero!” That, at least, offered a starting point. “When you know what the issues are, you have a basis on which you can try to find common ground, because you’ve identified how they see the world differently.”
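How a single number like that .05 can summarize the gap is easy to illustrate. The Python sketch below is a hypothetical reconstruction, not NCI’s actual analysis: it assumes, for illustration, that each group scores every idea category on a 1-to-5 importance scale and that the two groups’ average rating profiles are then correlated; the group sizes and scores are invented.

```python
# A minimal sketch (not NCI's actual analysis) of comparing two stakeholder
# groups' concept-mapping priorities. Assumes each rater scores every idea
# category on a 1-5 importance scale; all data are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_categories = 12        # idea categories from the conference
n_researchers = 50       # hypothetical number of raters per group
n_practitioners = 50

# One row per rater, one column per category, scores from 1 to 5.
researcher_ratings = rng.integers(1, 6, size=(n_researchers, n_categories))
practitioner_ratings = rng.integers(1, 6, size=(n_practitioners, n_categories))

# Average importance each group assigns to each category.
researcher_profile = researcher_ratings.mean(axis=0)
practitioner_profile = practitioner_ratings.mean(axis=0)

# Pearson correlation between the two group profiles; a value near zero,
# like the .05 Kerner cites, means the groups prioritize different things.
r = np.corrcoef(researcher_profile, practitioner_profile)[0, 1]
print(f"Correlation between group importance profiles: {r:.2f}")
```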
Finding common ground is precisely what happened. As the conference report summarizes: “Researchers were the least likely to believe that translation and dissemination of research findings were their responsibility. … Similarly, practitioners … generally assigned responsibility for the synthesis and dissemination of research elsewhere … (and intermediaries) were adamant that researchers and practitioners must play important partnership roles in the process. After much discussion, participants agreed that responsibility for dissemination must be shared.”
“If you don’t know what your audience thinks and what it wants,” says Kerner, “making up dissemination approaches would be like trying to sell Coca-Cola without anybody ever doing taste testing. It’s amazing to me how much effort we put into putting things out there without asking these fundamental questions.”
The “taste testing” that NCI supports comes in many forms – usability tests, focus groups, to name but two – but one strategy that Kerner believes holds particular promise is “participatory research.”
“What you want is to get the ultimate users of the research, the ones who are familiar with the phenomena at the ground level, involved in helping to formulate the research questions, reviewing the design, interpreting the results, and helping figure out how to disseminate the findings. The reason that’s so important is, if they are at the table when the questions first come up, there will be a dialog back and forth between the scientist and the clinicians.”
Each step of the way “they are coming from different places,” he says. “Scientists will look at the results from previous studies, clinicians will look at the question from their own clinical experience. The dialog may identify not just statistically significant findings but clinically significant findings as well. And finally, who better to tell you how to get it out there as a product than a clinician who is committed to using it in their practice?”
The concept mapping that charted the differences among last year’s conference participants is yet another approach Kerner is developing to gather practitioner input on how best to produce research results that matter and get them adopted into practice. “We keep a growing list of receptor sites, the people out there who care about what we’re disseminating, so we can go back to them and ask what they want and how they think we’re doing. Concept mapping gives us input from hundreds of people.”
Think of it as a super-sized focus group. NCI sends letters and e-mails to its growing list, inviting recipients to participate in brainstorming an issue – either on a web site or, if they aren’t comfortable using the Internet, by fax. Then a core group sorts the ideas into clusters, which are taken back to the larger group for assessment and ranking.
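The sorting step can also be made concrete. The sketch below is a generic illustration of how pile sorts are commonly aggregated in concept mapping into a similarity matrix that later clustering and mapping steps can use; the idea labels, sorters, and piles are all hypothetical, and this is not a description of NCI’s specific software or procedure.

```python
# A minimal, hypothetical sketch of aggregating pile sorts into a similarity
# matrix, a standard intermediate step in concept mapping. All ideas and
# sorts below are invented for illustration.
import numpy as np

ideas = [
    "fund dissemination research",
    "train practitioners",
    "build community infrastructure",
    "revise review criteria",
]

# Each sorter groups the idea indices into piles of related ideas.
sorts = [
    [[0, 3], [1, 2]],    # sorter 1
    [[0], [1, 2, 3]],    # sorter 2
    [[0, 1], [2], [3]],  # sorter 3
]

n = len(ideas)
similarity = np.zeros((n, n), dtype=int)

# Count how often each pair of ideas lands in the same pile.
for piles in sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                if i != j:
                    similarity[i, j] += 1

# Higher counts mean more sorters saw the two ideas as related; clustering
# this matrix yields the "islands" on the concept map.
print(similarity)
```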
Still another barrier stands on the road to implementation, however: the “signal to noise ratio,” as Kerner puts it. “One of the problems we have in the system today is that every investigator – or, more accurately, the public relations department of the parent institution – wants to get its science into the news, so lots of individual studies are promoted as newsworthy.
“It’s one of the barriers to the dissemination of evidence-based interventions, because if you don’t synthesize across studies and instead shotgun the results of lots of individual studies, people get confused. They start asking: ‘What’s going on? What are you saying?’ The efforts to disseminate what is known across studies have to compete with the ‘noise’ of individual studies. When a synthesis report is disseminated, it may have 200 studies in it, but the individual studies, and the credit for them, get lost.”
Lack of funding for translational work is another obstacle, says Croyle. “One organization can’t do it alone.” That’s one reason collaborations with other organizations and funders are key, and why NCI is involved in such partnerships for a variety of projects, such as its Transdisciplinary Tobacco Use Research Centers (TTURCS), a joint effort of NCI, the National Institute on Drug Abuse (NIDA) and the Robert Wood Johnson Foundation. The two Institutes are funding development of new interventions to prevent people from starting to smoke and to help them quit if they do, while the Foundation is funding studies of how best to get the research findings adopted into public policy and practice.
Another way NCI addresses the funding gap is by adding set-asides for translational research to Requests for Applications, for example with TTURCS, the Centers of Excellence in Cancer Communication Research (now under review), and the Centers for Population Health and Health Disparities.
Kerner also uses “dissemination supplements” as incentives for scientists to focus at least some attention on the need to have their research findings adopted in real world settings. At any point during the final two years of an intervention research grant, an investigator who has enough efficacy data to present to a review committee and has a plan to disseminate the findings can apply for an additional year’s funding – up to $100,000 in direct costs – to test that plan.
Proposals are short and so is the turnaround time, he says, and even if the supplemental application doesn’t make the cut the first time around, the investigator can try again by attaching it to a follow-up study that, for example, extends the original study to a larger population. During FY 2002, the program’s first year, Kerner’s office funded seven of the 22 supplement applications it received.
This, he says, was “a first step to see if we could get investigators to take their research closer to application, to begin to do preliminary evaluation of the barriers and facilitators of adoption of their interventions. We want to build a case that there are scientists interested in this and there are valid questions to fund.” He also wants to “build a case for longer-term diffusion and dissemination studies.”
CANCER CONTROL PLANET
The ultimate tool in Kerner’s kit, one he hopes will leapfrog the barriers and put research at practitioners’ fingertips, is an Internet portal now under development in collaboration with other agencies – the Centers for Disease Control and Prevention, the Agency for Healthcare Research and Quality, and the American Cancer Society. Called Cancer Control PLANET (for Plan, Link, Act, Network with Evidence-based Tools), when completed it will offer access to a network of web sites that “tie all the pieces of the puzzle” together for program staff, planners and researchers:
- State cancer profile information on both risk factors and cancer statistics.
- Regional and, eventually, community-based cancer control programs – “all the folks you’d love to partner with,” says Kerner.
- Systematic reviews of all the evidence on what interventions are effective.
- RTIPs (Research-Tested Intervention Programs), a Consumer Reports-like summary of intervention programs and their effectiveness, based on a rating system developed by Columbia University and the Substance Abuse and Mental Health Services Administration (SAMHSA).
- Comprehensive cancer control planning documents.
Kerner wants to post on RTIPs all evidence-based interventions so they can be searched and downloaded for real-world use. The interventions will be rated, he says, but authors will be allowed to review their ratings and, if they disagree, they can opt out of the listing. It’s his hope that the PLANET he is developing today will put the best evidence-based interventions on the desk of that health care provider in New Jersey in time to help patients tomorrow.