Federal Agents of Change: Behavioral Insights Power Evidence-Based Efforts to Improve Government
Antipsychotics such as quetiapine are often prescribed for reasons not supported by clinical evidence, inflating health care costs and potentially exposing patients to harm. Inappropriate quetiapine prescriptions are especially problematic in nursing homes and other residential facilities for older adults, costing Medicare Part D plans some $158 million in 2015 alone, according to the U.S. Government Accountability Office. Could an intervention targeting high prescribers reduce overprescribing and improve quality of care?
It did. In 2018, a partnership among the Office of Evaluation Sciences (OES), the Center for Program Integrity (CPI) at the Centers for Medicare and Medicaid Services, and academic researchers resulted in an 11.1% decrease in the average days of quetiapine supplied by the top 5% of general-care prescribers, with no corresponding evidence of patient harm. The decrease was driven by a simple series of “peer-comparison” letters from CPI; prescribers were randomly assigned to receive either these letters or a control letter. The peer-comparison letters stated exactly how much higher each provider’s quetiapine prescribing was relative to within-state peers and noted that the provider had been flagged for review.
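The article doesn’t include the study’s analysis code, but the logic of a randomized letter evaluation is easy to sketch. The Python snippet below uses entirely simulated data with made-up distributional assumptions (it is not the CMS analysis); it shows only why random assignment lets a simple difference in means estimate the letters’ effect on days supplied.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of high prescribers (the real study targeted
# the top 5% of general-care quetiapine prescribers).
n = 5_000
letter = rng.integers(0, 2, size=n)  # 1 = peer-comparison letter, 0 = control letter

# Simulated outcome: days of quetiapine supplied over follow-up.
# The assumed ~11% reduction in the treated group is illustrative only.
baseline_days = rng.gamma(shape=2.0, scale=600.0, size=n)
days_supplied = baseline_days * np.where(letter == 1, 0.889, 1.0)

# Because letters were randomly assigned, a simple difference in means
# is an unbiased estimate of the average treatment effect.
treated = days_supplied[letter == 1]
control = days_supplied[letter == 0]
effect = treated.mean() - control.mean()
print(f"Estimated effect: {effect:.1f} days "
      f"({100 * effect / control.mean():.1f}% vs. control)")
```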
It’s projects like this—low-cost, potentially high-impact collaborations based on large-scale data sets and tested under real-world conditions—that are the hallmarks of OES. Based at the U.S. General Services Administration, this small team of interdisciplinary experts (mostly from the social sciences, and many trained as psychologists) works across government to apply behavioral insights, make concrete recommendations for how to improve government, and evaluate impact using administrative data. In a free APS webinar on July 29, three such experts outlined how OES works and how psychological scientists can help apply research insights to government programs and policies.
- OES collaborations, evaluation policy, templates, and more: oes.gsa.gov
- APS’s Federal Research, Funding, and Policy initiatives: psychologicalscience.org/policy
Rigor and Transparency
OES’s broad portfolio reflects its mission to “deliver a better government for the public by enabling agencies to build and use evidence to continually learn what works.” Since 2015, it has completed more than 70 randomized evaluations, with results including higher enrollment in retirement savings plans, lower costs for government operations, expanded educational opportunities, and, as the quetiapine project demonstrates, improved public health.
“We generally work at the touchpoint between individuals and government programs,” said Russ Burnett, a cognitive psychologist and OES methods lead. “We try to reduce friction at that touchpoint by applying what research has revealed about how people make decisions and the things that influence how they act.”
While there’s no typical OES project, all adhere to the five core principles of OES evaluation policy:
- Rigor: OES designs evaluations to generate the strongest possible evidence to answer priority questions and support decisions. Every analysis also undergoes an internal replication process to ensure the work is reproducible. The vetting process covers areas including strength of intervention design, size of target population and experimental sample, statistical power (see the sketch after this list), and anticipated limitations.
- Relevance: OES chooses projects based on policy priorities and potential impact.
- Transparency: OES publicly commits to its analysis plans before working with data, posting a detailed plan on its website that specifies, where possible, the statistical models and tests to be used, the cases to be excluded from analysis, and how missing data will be handled. OES publishes all results, whether favorable, unfavorable, or neutral.
- Independence: OES retains control over project selection and the publication of results.
- Ethical practice: OES safeguards the dignity, rights, safety, and privacy of participants and stakeholders in evaluations. Evaluations comply with the spirit and letter of regulations and other relevant requirements.
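To make the statistical-power part of that vetting concrete, here is a minimal power calculation in Python. The specific inputs (effect size, alpha, target power) are illustrative assumptions, not OES’s actual thresholds; the statsmodels routine itself is standard.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical vetting question: with two equal-sized arms, how many
# participants per arm are needed to detect a small standardized effect
# (Cohen's d = 0.1) at alpha = .05 with 80% power?
analysis = TTestIndPower()
n_per_arm = analysis.solve_power(effect_size=0.1, alpha=0.05, power=0.8,
                                 alternative="two-sided")
print(f"Required sample size per arm: {n_per_arm:.0f}")  # ~1,571
```

A sample-size requirement in the thousands for small effects is exactly why drawing on an agency’s full administrative population, as described next, matters so much.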
The statistical power afforded by large administrative data sets brings many advantages in OES’s randomized trials. “We’re often drawing from an agency’s existing population—for example, existing residents of public housing, VA facilities, or potential beneficiaries,” explained Rebecca Johnson, an OES academic affiliate and assistant professor in quantitative social science at Dartmouth College. “So we often know a fair amount about their demographics and/or past behavior.”
There are challenges too. “Compared to lab experiments, our data sets often look messy because we rely on administrative data,” said Burnett. “There are more possible outcomes, more possible covariates to include in our analyses.”
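As an illustration of the covariate point, the sketch below fits a covariate-adjusted model to simulated “administrative” records with missing values. Every variable name and number is hypothetical; the point is only that a pre-specified regression can absorb messy covariates while randomization protects the treatment estimate.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2_000

# Simulated administrative records: a randomized treatment indicator
# plus pre-existing covariates of the kind agencies already hold.
df = pd.DataFrame({
    "treated": rng.integers(0, 2, size=n),
    "age": rng.normal(70, 8, size=n),
    "prior_claims": rng.poisson(4, size=n).astype(float),
})
df["outcome"] = (50 - 4 * df["treated"] + 0.2 * df["age"]
                 + 1.5 * df["prior_claims"] + rng.normal(0, 10, size=n))
df.loc[rng.random(n) < 0.05, "prior_claims"] = np.nan  # typical admin-data gaps

# Covariate adjustment tightens the estimate without biasing it, because
# treatment was randomized. A pre-analysis plan would specify up front
# which covariates enter the model and how missing values are handled
# (here: listwise deletion, statsmodels' default for formula-based OLS).
fit = smf.ols("outcome ~ treated + age + prior_claims", data=df).fit(cov_type="HC2")
print(f"Treatment effect: {fit.params['treated']:.2f} "
      f"(SE {fit.bse['treated']:.2f})")
```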
The webinar concluded with a look at recent federal legislation creating demand for the skills of behavioral scientists in the federal government. The Open Government Data Act expands access to and facilitates use of federal data assets for evidence building, said Michael Hand, an economist and OES evidence lead. The Foundations for Evidence-Based Policymaking Act (the “Evidence Act”) elevates policy-relevant research by requiring agencies to produce learning agendas, evaluation plans, and assessments of their evidence-building capacity.
Collectively, this means that “agencies can bring in more researchers with deep subject matter expertise” to help develop studies and prioritize data needs, for example. At OES specifically, opportunities include a one-year fellowship program, in which individuals are “on loan” from other institutions, and positions as subject-matter-specific academic affiliates. Learn about fellowships and other opportunities at oes.gsa.gov/opps; those interested in becoming academic affiliates are invited to share their interests by emailing [email protected].