A century ago, cervical cancer was a leading cause of death in the United States, killing 40,000 women a year. Then a test called the Pap smear made it easy to detect anomalous cells. Today, robust screening programs have greatly reduced cervical cancer deaths in this country.
The story sounds simple: a public-health triumph. But preventing cervical cancer is far from straightforward. It requires a chain of events made of links like "recommend test," "call patient with results," and "make follow-up appointment." A single broken link, whether caused by an unreliable bus schedule or clunky software, can interrupt care.
Many good interventions ultimately fail to benefit public health. That's why Susan Dwight Bliss Professor of Biostatistics Donna Spiegelman, ScD, founded the Yale Center for Methods in Implementation and Prevention Science (CMIPS). Implementation science studies the barriers that keep evidence-based interventions from going mainstream, as well as the factors that speed up that transition.
In fact, she adds, only 14% of biomedical research results are translated into practice--and that process takes an average of 17 years.
"There are very complex, multi-level reasons for why that happens," she said, "and addressing that shortfall is exactly what implementation science is all about."
When it comes to implementing research results in the real world, a host of factors on several levels can make a difference.
For example, providers may or may not have the time, training, and supplies to offer patients evidence-based care. Patients may or may not follow through on recommended care depending on costs, conflicting obligations, wait times, or cultural beliefs about certain diagnoses. Systems-level factors matter, too. Is there enough political will to fund health systems? Are there funds for large-scale prevention campaigns? Buses so patients can reach their appointments? Policies that let them take time off work?
Tackling one factor, or focusing only on one level, may not be enough to ensure an intervention takes effect. So, Spiegelman explains, implementation scientists develop multi-component strategies that are cost-effective.
Doing that is surprisingly complex, but the science is underway. CMIPS has made it a priority to develop new statistical methods and equip the field with stronger tools.
For example, suppose you have resources to improve cervical cancer outcomes in Mexico, where the chain is fragile and many women still die from the disease. Which interventions will work best? Should you offer nurses more training? Step up efforts to reach patients? Open more clinics? Early on, it's hard to say.
So CMIPS researchers pioneered a trial technique called the learn-as-you-go, or LAGO, design. This type of trial begins with a bundle of interventions. Then, at specified stages, researchers check which have been most effective. With this knowledge, they adapt the interventions, keeping what works and discarding what doesn't. They repeat this as new trial data arrive, continuing to optimize the bundle.
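The learn-as-you-go idea can be illustrated with a toy simulation. This is not the actual LAGO methodology, which involves formal statistical estimation; it is a minimal sketch in which the intervention names, effect sizes, stage count, and decision threshold are all hypothetical, chosen only to show the loop of "run a stage, estimate each component's contribution, keep what helps."

```python
import random

random.seed(0)

# Hypothetical intervention components and their (unknown to the
# researcher) true effects on a binary outcome, e.g. the probability
# that a patient completes cervical cancer screening.
TRUE_EFFECT = {"nurse_training": 0.15, "patient_outreach": 0.10, "extra_clinic_hours": 0.01}
BASELINE = 0.40  # completion probability with no intervention

def run_stage(components, n=500):
    """Simulate one trial stage: n patients under the current bundle."""
    p = min(BASELINE + sum(TRUE_EFFECT[c] for c in components), 0.99)
    successes = sum(random.random() < p for _ in range(n))
    return successes / n

def learn_as_you_go(stages=3, threshold=0.03):
    """Adapt the bundle at each stage, keeping apparently helpful parts."""
    bundle = list(TRUE_EFFECT)
    for stage in range(1, stages + 1):
        rate = run_stage(bundle)
        kept = []
        for c in bundle:
            # Compare against a stage run without this component to
            # estimate its added benefit (a crude leave-one-out check).
            without = run_stage([x for x in bundle if x != c])
            if rate - without >= threshold:
                kept.append(c)
        print(f"stage {stage}: completion {rate:.2f}, keep {kept}")
        bundle = kept or bundle  # never empty the bundle entirely
    return bundle

final = learn_as_you_go()
```

In this sketch, weak components (like the tiny `extra_clinic_hours` effect) tend to be dropped across stages, while strong ones survive; the real design does this with far more statistical care, including optimizing how intensely each component is delivered.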
Another CMIPS tool in development involves examining how public-health interventions affect people who are not their intended targets. For example, though a workplace fitness program may be aimed only at employees, a worker who brings home a new exercise habit might draw family members into fitness activities, too.
"The current paradigm for public health research doesn't typically estimate these spillover effects," Spiegelman said. "It only estimates the effect of interventions on those who are directly receiving them. Our emphasis on developing new methods is unique to CMIPS."
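The contrast between direct and spillover effects can be made concrete with a small Monte Carlo sketch. This is not a CMIPS estimator; the population sizes, roles, and effect sizes below are invented for illustration, mirroring the workplace-fitness example: employees receive the program directly, while household members are affected only indirectly.

```python
import random

random.seed(1)

# Hypothetical effects: the program raises an employee's exercise
# probability by DIRECT, and a household member's by SPILLOVER.
N = 10_000
BASE, DIRECT, SPILLOVER = 0.30, 0.25, 0.10

def exercises(role, treated_household):
    """Simulate whether one person exercises regularly."""
    p = BASE
    if treated_household:
        p += DIRECT if role == "employee" else SPILLOVER
    return random.random() < p

def mean_rate(role, treated):
    """Average exercise rate over N simulated people."""
    return sum(exercises(role, treated) for _ in range(N)) / N

# Direct effect: treated vs. untreated employees.
direct_effect = mean_rate("employee", True) - mean_rate("employee", False)
# Spillover effect: family members in treated vs. untreated households.
spillover_effect = mean_rate("family", True) - mean_rate("family", False)
print(f"direct ~ {direct_effect:.2f}, spillover ~ {spillover_effect:.2f}")
```

A study that compares only treated and untreated employees recovers the direct effect but misses the spillover term entirely, which is the gap Spiegelman describes in the current paradigm.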
The idea, she explained, is to enable "better and more efficient studies, to learn more from the results of those studies, and to get things done faster. Then people can advance the science, whether it's in cancer, or increasing uptake of post-exposure prophylaxis for HIV, or increasing adherence to medication regimens or vaccination. This is about making the studies more powerful."
Spiegelman holds a rare joint doctorate in biostatistics and epidemiology. In 2014, she was the first biostatistician to receive the $5 million Pioneer Award from the NIH, given to highly productive scientists to help them tackle problems the NIH considers "high-risk, high-reward." She joined Yale and founded CMIPS in July 2018 after nearly three decades as a professor at the Harvard T.H. Chan School of Public Health.
CMIPS is devoted to educational efforts to prepare next-generation researchers in implementation science. It currently offers five federally funded postdoctoral fellowships and provides funds to train biostatistics PhD students, who can learn implementation science through a special pathway and a one-of-a-kind course in advanced methods in implementation and prevention science.
As the field continues to mature, Spiegelman hopes it will help stakeholders tackle the kinds of public-health challenges we both do and don't know how to solve--problems like cancer inequities, low vaccination rates, or high maternal mortality.
"We don't need new discoveries in obstetrics to get maternal mortality in low- and middle-income countries down to the levels that we have in high-income societies," she said. "It's all about implementation--addressing the barriers, leveraging the facilitators."
Featured in this article
- Donna Spiegelman, ScD: Susan Dwight Bliss Professor of Biostatistics; Affiliated Faculty, Yale Institute for Global Health; Director, Center for Methods in Implementation and Prevention Science (CMIPS); Director, Interdisciplinary Research Methods Core, Center for Interdisciplinary Research on AIDS; Assistant Cancer Center Director, Global Oncology, Yale Cancer Center