
Addressing Heterogeneity and Adaptability in Multi-Component Implementation and HIV/AIDS Interventions: Emerging Frameworks for Research on Complex Health Interventions

February 14, 2024
  • 00:00<v ->Thank you for joining our CMIPS seminar.</v>
  • 00:03It's really a pleasure to have you all
  • 00:05and most importantly,
  • 00:06a pleasure to have Dr. Brian Mittman,
  • 00:09who we've been talking about bringing over here
  • 00:12to CMIPS and Yale for quite some time.
  • 00:16Dr. Mittman is a distinguished
  • 00:19longstanding implementation scientist,
  • 00:22I might even say one of the founders
  • 00:25of implementation science as a formal discipline.
  • 00:30He is a research scientist
  • 00:33in the Department of Research and Evaluation
  • 00:36with additional affiliations
  • 00:37at the US Department of Veterans Affairs,
  • 00:40which is another place where a huge amount
  • 00:44of some of the best implementation science,
  • 00:47thinking and research has emanated over the years,
  • 00:50the University of Southern California
  • 00:53and the University of California Los Angeles,
  • 00:56where he co-leads
  • 00:56the UCLA Clinical and Translational Science Institute
  • 01:01Implementation and Improvement Science Initiative.
  • 01:05And I find that very interesting
  • 01:07in that the implementation and improvement science
  • 01:10are linked in the same name,
  • 01:11which is also something we at CMIPS are very interested
  • 01:15in the kind of continuum between implementation science
  • 01:20and improvement science, quality improvement,
  • 01:23and what are the commonalities and differences
  • 01:25and where does one end and the other begin.
  • 01:28So I don't know if that's something
  • 01:29Dr. Mittman is gonna touch upon in his talk today.
  • 01:33He chaired the planning committee
  • 01:35that launched the journal, "Implementation Science,"
  • 01:38which now has a sort of spinoff journal,
  • 01:40I forget its name, but now there's two of them,
  • 01:43and served as co-editor in chief of that journal
  • 01:46from 2005 to 2012.
  • 01:50He was a founding member
  • 01:51of the US Institute of Medicine Forum
  • 01:53on the science of quality improvement and implementation
  • 01:58and chair at the National Institutes of Health
  • 02:00Special Emphasis Panel
  • 02:03on Dissemination and Implementation Research in Health
  • 02:05in 2007 and 2010.
  • 02:08And for those of us in the audience
  • 02:11who are thinking about NIH grants,
  • 02:14what I've very recently learned is now that
  • 02:17the D&I panel, as we call it,
  • 02:20has been renamed.
  • 02:21Maybe Dr. Mittman knows the name,
  • 02:23I don't remember the name,
  • 02:25and maybe even there's multiple ones of it now.
  • 02:29But if you're interested,
  • 02:30maybe write to me later and we can figure that out
  • 02:32because it's very important
  • 02:34for our implementation dissemination science applications
  • 02:38to NIH here at Yale.
  • 02:40Dr. Mittman directed
  • 02:41the VA's Quality Enhancement Research Initiative
  • 02:44from 2002 to 2004.
  • 02:47His research examines innovative approaches
  • 02:50to healthcare delivery and improvement
  • 02:53and efforts to strengthen learning healthcare systems,
  • 02:56another area in which we're very interested in CMIPS
  • 02:59and many others at Yale are as well.
  • 03:02So today, Dr. Mittman is gonna talk to us
  • 03:05about Addressing Heterogeneity and Adaptability
  • 03:08in Multi-Component Implementation
  • 03:10and HIV Interventions:
  • 03:12Emerging Frameworks for Research
  • 03:15on Complex Health Interventions.
  • 03:16And actually, I wanted to say one thing
  • 03:19before I turn it over to him.
  • 03:20He also serves as a consultant
  • 03:24for our R3EDI Hub,
  • 03:25which is a technical support hub
  • 03:29that supports seven projects
  • 03:31devoted to ending the AIDS epidemic
  • 03:34under a general coordinating center based in Illinois.
  • 03:38And it's been a pleasure to have Brian
  • 03:40as a part of our R3EDI Hub team as well.
  • 03:43So without any further ado now,
  • 03:45I will turn things over to Dr. Mittman.
  • 03:48<v ->Great, thank you, Donna,</v>
  • 03:50both for the kind introduction
  • 03:51as well as more importantly the opportunity to present
  • 03:55and to meet with some of your colleagues today and tomorrow.
  • 03:59As we were saying before we started,
  • 04:02my hope is to have the opportunity
  • 04:04to join you in person at some point down the line,
  • 04:07but I know we all share that hope
  • 04:08for lots of in-person gatherings.
  • 04:11You touched on several of my favorite topics,
  • 04:13including implementation science, improvement science.
  • 04:17I'll mention that I think very briefly
  • 04:19as well as other topics.
  • 04:21And I'm glad to schedule follow-up talks
  • 04:24to speak about them.
  • 04:26One quick comment on some of your kind remarks.
  • 04:30I always counsel junior colleagues
  • 04:32to pick a very small field that's likely to grow
  • 04:36and get in on the ground floor
  • 04:37because it makes you look important.
  • 04:39That's sort of the big fish, small pond kind of idea.
  • 04:43But also, the fact that I spend much more of my time
  • 04:46advocating and helping to develop and expand
  • 04:50fields that I'm interested in sometimes
  • 04:54rather than actually doing the research,
  • 04:55although I do have a research portfolio.
  • 04:58So implementation science is a field
  • 05:00that I was able to, again, get in on the ground floor
  • 05:05and help to wave the flag,
  • 05:07promote interest and advocate at NIH,
  • 05:10at PCORI and many other places.
  • 05:13And used to spend a lot of time
  • 05:15on the freeways in Los Angeles
  • 05:17traveling between different facilities and institutions
  • 05:20as well as in planes trying to,
  • 05:22again, advocate and promote interest
  • 05:24in implementation science.
  • 05:26But the implementation science field,
  • 05:28in my view, is well established.
  • 05:30There are many of us who are interested
  • 05:33and have active research portfolios.
  • 05:35I don't know that I have that much to offer at this point
  • 05:37as far as new ideas,
  • 05:39but I have a different view about the field
  • 05:41of complex health interventions
  • 05:43where I think there is a need to continue to think hard
  • 05:47and promote some of the newer emerging frameworks
  • 05:50and point out that our task as researchers
  • 05:54in studying complex health interventions
  • 05:56is a bit different from our task
  • 05:58in studying other kinds of interventions.
  • 06:03I think my most important focus lately in my research
  • 06:06is trying to help, again, advocate
  • 06:09and share information on some of these emerging frameworks
  • 06:12and encourage more development.
  • 06:15One more comment in terms of the truth in advertising.
  • 06:18I actually won't spend much time at all
  • 06:21talking about specific implementation science
  • 06:23or HIV/AIDS intervention examples,
  • 06:26but I think it'll be very clear
  • 06:28as to how and why the comments that I will make
  • 06:31are directly relevant to both of those bodies of activity.
  • 06:35So let me move on
  • 06:37and I try to remember which button allows me to advance.
  • 06:41So let me start with a very high-level question,
  • 06:44and that is ask us all to think a little bit
  • 06:47about what we as researchers do
  • 06:50in addition to producing scientific generalizable knowledge,
  • 06:54what we do to support policy decision makers
  • 06:57and practice decision makers questions.
  • 06:59And much of the research that's conducted in medical schools
  • 07:03and in other health-related institutions
  • 07:05pursues these questions.
  • 07:07Does it work or is it effective?
  • 07:09The FDA of course would like to know
  • 07:11if a new drug should be approved.
  • 07:13CMS and others would like to know
  • 07:14if it should be funded and promoted.
  • 07:17Should it even be mandated?
  • 07:19Within health systems,
  • 07:20P&T committees have decisions to make
  • 07:22about inclusion of new drugs in a formulary.
  • 07:26And frontline practicing clinicians need to know
  • 07:29whether they should use a new drug
  • 07:31or another intervention.
  • 07:33So much of the questions here
  • 07:36in the guidance that we endeavor to provide
  • 07:40to our policy and practice decision maker colleagues
  • 07:43is a set of answers to these questions.
  • 07:46Does it work?
  • 07:47Is it effective?
  • 07:48Or in the case of comparative effectiveness research,
  • 07:51is intervention A better than B?
  • 07:54And of course, we focus on outcomes and impacts
  • 07:57when we try to answer this yes/no question.
  • 08:01We often have the sample size and the funding
  • 08:03and the ability to examine heterogeneity
  • 08:06and subgroup effects and so on,
  • 08:09and whether contextual factors
  • 08:10influence the effects and outcomes.
  • 08:13And of course, our gold standard research method of RCTs
  • 08:18and similar experimental methods
  • 08:20where we randomize and measure outcome differences,
  • 08:22that's how we go about conducting this research.
  • 08:26But again, the focus is on impact and outcomes.
  • 08:29Are the outcomes better for those in the intervention group
  • 08:33versus the control group?
  • 08:34And if the answer is yes,
  • 08:36then the intervention is effective,
  • 08:38it's approved by the FDA,
  • 08:39it's promoted, it's reimbursed and it's used.
  • 08:44And there are in fact many examples of magic bullets
  • 08:48or very strong robustly effective drugs
  • 08:51for which we can produce a clear answer to that question.
  • 08:54Yes, this drug is very effective.
  • 08:57Precision medicine, of course,
  • 08:58is leading us down the path for drugs and interventions
  • 09:02for which there isn't a clear answer
  • 09:04where there are high levels of heterogeneity.
  • 09:07And we do need to tailor the interventions
  • 09:11and that's really the theme of this talk.
  • 09:14When we think about complex interventions
  • 09:16or complex health interventions,
  • 09:18and I'll define them more formally in a minute,
  • 09:20but health promotion programs, HIV/AIDS prevention,
  • 09:25treatment programs, implementation strategies,
  • 09:29there are some examples of highly robust,
  • 09:32highly effective complex health interventions
  • 09:34for which we can produce a strong answer.
  • 09:38Yes, this intervention tends to be effective
  • 09:41across multiple settings
  • 09:43and in multiple sets of circumstances.
  • 09:46But by and large, for most complex health interventions,
  • 09:49when we ask the question, is it effective?
  • 09:51The answer that comes out of our research
  • 09:53is sometimes or it depends.
  • 09:55The heterogeneity,
  • 09:57I'll sometimes use the term extreme heterogeneity,
  • 09:59is just so great that the impacts vary considerably
  • 10:05and it's impossible to produce a simple answer,
  • 10:07yes or no, it is effective or it's not effective.
  • 10:11So there is no formal established definition
  • 10:14of complex health interventions at this point,
  • 10:17but here are some of the key features
  • 10:19that tend to be mentioned in most discussions
  • 10:22of complex health interventions.
  • 10:24The fact that there are multiple components
  • 10:26and those components interact.
  • 10:29The intervention, the multi-component intervention
  • 10:32tends to target multiple levels,
  • 10:35not always, but certainly multiple entities.
  • 10:37So we have interventions that target patients
  • 10:39and family caregivers and other peers
  • 10:43as well as clinicians and other health system staff,
  • 10:47as well as in many cases,
  • 10:48communities and even regulatory levels.
  • 10:52These interventions tend to be highly adaptable.
  • 10:56They're not fixed.
  • 10:57So unlike a drug that comes from the factory
  • 11:00in a very consistent chemical formulation
  • 11:03with a high degree of consistency and homogeneity,
  • 11:07these interventions adapt.
  • 11:09And that's the case even when we try to achieve fidelity
  • 11:12to the manualized intervention
  • 11:14and prevent adaptations and modifications,
  • 11:18and that's another theme I'll come back to.
  • 11:21Because of all of these features,
  • 11:22the interventions tend to achieve their effects
  • 11:25through multiple pathways
  • 11:27and they tend to be mediated.
  • 11:29So it's not a drug that has a direct impact
  • 11:31on a physiologic process,
  • 11:33but instead an intervention
  • 11:35that changes attitudes or beliefs,
  • 11:38those changes in attitudes and beliefs
  • 11:40lead to changes in knowledge and intentions.
  • 11:43Those changes in knowledge and intentions
  • 11:45eventually lead to changes in behavior.
  • 11:47But those behaviors are influenced by multiple factors.
  • 11:50So it's not only the patient's own beliefs
  • 11:53and knowledge and attitudes,
  • 11:54but peer influence, clinician influence,
  • 11:58social influence from key opinion leaders
  • 12:00and a number of others.
  • 12:01So the causal pathways tend to be very complex
  • 12:05and I'll illustrate that in a few minutes.
  • 12:07So when we think about a comparison
  • 12:09between simple interventions like drugs
  • 12:12versus complex interventions,
  • 12:14these are some of the key dimensions
  • 12:16where there are differences.
  • 12:18The difference between a single fixed
  • 12:21and highly stable and homogeneous drug
  • 12:24that targets a single stable physiologic process
  • 12:28to achieve a simple goal such as reducing blood pressure
  • 12:32in patients that are not always homogeneous.
  • 12:36There are differences,
  • 12:37but the argument is that patients,
  • 12:40despite genetic profile differences
  • 12:42and other physiologic,
  • 12:44as well as clearly socioeconomic status
  • 12:47in neighborhood and contextual differences,
  • 12:52those differences tend to be somewhat smaller
  • 12:54than the differences we see
  • 12:55across communities and organizations.
  • 12:59And again, we can argue that point,
  • 13:00but these are the key distinctions
  • 13:03between these two categories of interventions.
  • 13:06And the consequences, of course,
  • 13:07or the implications for research are that,
  • 13:11when we study drugs, oftentimes,
  • 13:13not always, but oftentimes we do see
  • 13:15a relatively high level of homogeneity
  • 13:18with very consistent and often strong,
  • 13:21easily detected main effects.
  • 13:23Whereas again, with complex interventions,
  • 13:25we get the answer along the lines
  • 13:27of it depends or sometimes.
  • 13:29We see lots of complexity, instability and heterogeneity.
  • 13:33And the average effects,
  • 13:34because of the heterogeneity, tend to be very weak.
  • 13:38We have many subjects or targets in the intervention
  • 13:40that do very well, others that do very poorly,
  • 13:43but on average, effect size estimates
  • 13:46that are close to zero.
  • 13:48One key point, and that is,
  • 13:50this is not a dichotomy, but instead a continuum.
  • 13:53There are elements of complexity in all interventions.
  • 13:56The key question is,
  • 13:58when is an intervention sufficiently complex
  • 14:01that we can't study it through an RCT
  • 14:04with a focus on average effect sizes,
  • 14:07but instead need to use the more complex kinds of approaches
  • 14:10that I'll talk about over the next several minutes.
  • 14:14So getting back to this question,
  • 14:16does it work, is it effective?
  • 14:17And the answer being sometimes or it depends,
  • 14:20that answer, of course,
  • 14:21is not at all useful for decision makers.
  • 14:23So we need to think about a different way
  • 14:25of designing, conducting our studies
  • 14:28and a different type of evidence
  • 14:30or a set of insights and findings
  • 14:33that we need to produce for science,
  • 14:36but also for policy and practice.
  • 14:38So let me back up and illustrate
  • 14:40some of the challenges that we face
  • 14:42when we deal with complex health interventions.
  • 14:44So this is a pattern of results from a hypothetical study
  • 14:47that could be a guideline implementation study.
  • 14:50We are attempting to improve adherence
  • 14:53to evidence-based clinical practice guidelines.
  • 14:56In the blue sample,
  • 15:00the blue bars in this histogram
  • 15:02show that all of the sites
  • 15:05in the intervention group did very well.
  • 15:08Our intervention managed
  • 15:09to significantly improve rates of adherence
  • 15:12among all the intervention physicians
  • 15:14or clinics or hospitals,
  • 15:16whereas the sites in the yellow or light green
  • 15:19are all scattered around zero.
  • 15:23So on average, we saw no change in adherence levels
  • 15:26among the usual care comparison sites,
  • 15:30although some of course did better and some did worse.
  • 15:32It's just because of random variation.
  • 15:35I don't know that we've ever seen findings
  • 15:37from any implementation study
  • 15:39that resembled this kind of pattern
  • 15:42or anything close to it.
  • 15:44This clearly would be "New England Journal"
  • 15:46or "Lancet" caliber work
  • 15:49if we had a strong finding of this sort,
  • 15:51but that's what we would hope to see with our interventions,
  • 15:54that we would find or design an intervention
  • 15:57and see very robust, very significant effects.
  • 16:01This is what we tend to see more often
  • 16:03when we study complex health interventions.
  • 16:05There's almost complete overlap
  • 16:07between the blue and the light green yellow sites.
  • 16:11If you are an intervention site,
  • 16:13you are almost as likely to show
  • 16:16reduced rates of adherence
  • 16:18as you are increases.
  • 16:19And similarly, the usual care sites,
  • 16:22many of them did show rates of improvement
  • 16:24that are comparable to those in the intervention site.
  • 16:28So when you have a pattern of results like this,
  • 16:32you can't say to decision makers,
  • 16:34my complex health intervention,
  • 16:36my HIV/AIDS prevention program
  • 16:38or my implementation strategy
  • 16:40or quality improvement program is highly effective,
  • 16:43I would advise you to use it.
  • 16:46As a decision maker,
  • 16:47if I know I'm almost likely
  • 16:49to end up spending a lot of money
  • 16:50and staff time and disruption
  • 16:54and end up with decreased performance,
  • 16:56obviously, I'm not going to be interested in this program.
  • 16:59So what is our goal then as a researcher?
  • 17:02Our goal, of course, is to understand
  • 17:03who ended up on the right hand side of this distribution,
  • 17:06what the factors were that led to those improvements
  • 17:09for both intervention as well as control sites,
  • 17:11and what can we do to counsel decision makers
  • 17:14to allow them to end up on the right hand side
  • 17:18rather than the left hand side of the distribution.
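The pattern just described can be made concrete with a small simulation. The sketch below is purely illustrative, with hypothetical numbers not drawn from any study: when site-level effects scatter widely around a small mean, the average effect is weak even though many individual sites improve substantially, which is why the honest answer becomes "sometimes" or "it depends."

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical site-level changes in guideline adherence (percentage points).
# Intervention sites: extremely heterogeneous effects with a small average.
intervention = rng.normal(loc=2.0, scale=15.0, size=40)
# Usual-care sites: random variation around no change.
usual_care = rng.normal(loc=0.0, scale=15.0, size=40)

avg_effect = intervention.mean() - usual_care.mean()
improved = (intervention > 5).mean()   # share of intervention sites with sizable gains
worsened = (intervention < -5).mean()  # share of intervention sites that got worse

print(f"Average effect: {avg_effect:+.1f} points")
print(f"Intervention sites improving by >5 points: {improved:.0%}")
print(f"Intervention sites worsening by >5 points: {worsened:.0%}")
```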
  • 17:22So when we think about finding
  • 17:25or designing and developing complex health interventions
  • 17:28that are effective,
  • 17:30one position that we could take
  • 17:31is our goal as researchers is to develop
  • 17:35and generate the evidence showing
  • 17:37that our interventions are highly effective,
  • 17:41but that assumes that those interventions exist.
  • 17:44Hang on one second, I will be right back.
  • 17:53<v ->While he's out,</v>
  • 17:54I can say if people have questions or comments,
  • 17:57why don't you put them in the chat as we go along
  • 18:00and then at the end of Brian's talk,
  • 18:03I'll pose some of the questions and comments to him.
  • 18:06Go ahead, Brian. <v ->Thank you.</v>
  • 18:08Yeah, so my apologies.
  • 18:09For those who joined earlier,
  • 18:10we were talking about the renovations underway.
  • 18:14My wife was stuck outside
  • 18:15because I forgot to open the door
  • 18:17for the second pathway into the kitchen
  • 18:20because the main path is covered with paint paraphernalia.
  • 18:25So again, because complex health interventions
  • 18:29tend not to be robust
  • 18:30and we tend not to have the ability
  • 18:32to find or develop the needle in the haystack
  • 18:37or because they don't exist at all
  • 18:39and a robust complex health intervention
  • 18:41is a mythical beast,
  • 18:44we need to take a different strategy
  • 18:46and a different approach
  • 18:47in designing and conducting research
  • 18:50and supporting health decision makers.
  • 18:54So rather than pursuing questions
  • 18:56such as is it effective or does it work
  • 18:58or which is more effective,
  • 19:00we need to be thinking about deriving
  • 19:02and developing insights and guidance for practice,
  • 19:06such as how does it work, why does it work?
  • 19:09Where, when and for whom,
  • 19:10the realistic evaluation key questions,
  • 19:13but also how can we enhance its effectiveness?
  • 19:15Which again gets back to this issue of adaptability.
  • 19:19We'd have very few degrees of freedom
  • 19:21to enhance the effectiveness of a drug.
  • 19:24We can obviously titrate the dose
  • 19:26and we can prescribe supportive interventions,
  • 19:30but we can't modify the chemical formulation of the drug.
  • 19:33We can modify the so-called chemical formulation
  • 19:37of a complex health intervention.
  • 19:38So our goal and our task as researchers
  • 19:41is to guide that tailoring and that adaptation.
  • 19:45So we should strive to support decision makers
  • 19:50as they try to answer these questions.
  • 19:52How do I choose an appropriate complex health intervention?
  • 19:56How do I implement or deploy that program
  • 19:58and tailor it to increase its effectiveness?
  • 20:02But also how do I modify or manage the organization?
  • 20:05Oftentimes just as we can improve health outcomes
  • 20:09by changing diet and exercise
  • 20:12and changing the social surroundings of our patients,
  • 20:16we can certainly improve outcomes
  • 20:18for complex health interventions
  • 20:20by modifying the organization.
  • 20:22So again, another task for researchers.
  • 20:25But back to the key questions.
  • 20:27We need to understand and develop insights
  • 20:30and provide guidance regarding how, when, why
  • 20:33and where do these interventions work
  • 20:35and how can we modify them to make them work.
  • 20:39So the focus here instead of on impact in simply asking,
  • 20:44does intervention A produce
  • 20:46a greater impact or outcome than intervention B,
  • 20:49we need to instead focus on the black box.
  • 20:52We need to understand the mediators and the moderators,
  • 20:55the mechanisms of effect.
  • 20:57We need to explicitly study adaptation
  • 21:00and we need to study context
  • 21:02and how to manage context.
  • 21:04So again, another point related to the key theme
  • 21:07of different types of research,
  • 21:10not a focus on measuring impact,
  • 21:12but instead to focus on understanding and studying process.
  • 21:18So again, rather than thinking about evidence-based practice
  • 21:22and generating or producing an estimate of effect sizes,
  • 21:27to me, research on complex health interventions
  • 21:30should focus on deriving
  • 21:32or developing insights and guidance.
  • 21:34So it's insights and guidance rather than evidence
  • 21:38in the way that we typically think of evidence.
  • 21:42So, getting back to the features
  • 21:44or the characteristics of complex health interventions
  • 21:47and why they tend to have such weak average effect sizes
  • 21:51and such extreme heterogeneity,
  • 21:53we know as I argued that,
  • 21:55or we believe or I would assert
  • 21:57that the intervention targets and settings
  • 21:59are much more heterogeneous.
  • 22:01Communities differ, individuals differ,
  • 22:04and the same behavioral approach that we use
  • 22:07or the same implementation strategy for one hospital
  • 22:11is not likely to be effective
  • 22:13or to work in the same way as in another hospital.
  • 22:16Differences in hospital leadership and culture
  • 22:20and staffing patterns and resources and so on
  • 22:23all mediate and moderate the effects of the intervention.
  • 22:27If we think about health psychology
  • 22:29and patient behavior change,
  • 22:31and one of the topics that we're studying
  • 22:33in Kaiser Southern California, which is HPV vaccination,
  • 22:36we know that clinician brief interventions
  • 22:38are likely to be effective for some patients and parents
  • 22:42who retain respect for their physicians
  • 22:44and will follow their advice.
  • 22:47But for other patients,
  • 22:48that physician brief intervention
  • 22:50can in fact be counterproductive
  • 22:52because it reinforces a patient's
  • 22:53or parent's a priori belief
  • 22:55that these vaccines are poison
  • 22:58and my physician is sort of an agent
  • 23:01of the drug company trying to enhance profits.
  • 23:05So again, lots of heterogeneity
  • 23:07in the targets in the settings.
  • 23:09We also know that the underlying pathologies,
  • 23:12their etiology, their root causes differ.
  • 23:14And again, the vaccine example is a good one.
  • 23:17When we're dealing with low vaccination rates
  • 23:20in a set of clinics or hospitals
  • 23:23where patients tend to be respectful
  • 23:26and responsive to brief interventions,
  • 23:29we can suspect that the reasons for low rates of adherence
  • 23:33don't relate to patient resistance,
  • 23:35but instead to physicians and staff or the systems
  • 23:38not necessarily optimizing their activities.
  • 23:42Whereas in other parts of Kaiser Southern California,
  • 23:46we know that the hospitals and the clinics
  • 23:48and the organizational policies
  • 23:50and the clinicians are doing everything in their power
  • 23:53to improve vaccination rates.
  • 23:55The reason for low vaccination rates
  • 23:57is patient and parent resistance
  • 23:59that is tied to their own beliefs.
  • 24:02So understanding differences
  • 24:04in the root causes of low adherence rates
  • 24:09or quality or outcomes
  • 24:11or poor patient behavior
  • 24:13and recognizing the heterogeneity,
  • 24:15again, is important.
  • 24:16And that's one of the reasons
  • 24:18for the highly variable effects of interventions
  • 24:22because they sometimes address
  • 24:24the root causes and solve the problem,
  • 24:25but other times the same intervention does not.
  • 24:28And then finally, as I've said,
  • 24:30the interventions themselves tend to be highly variable
  • 24:33and irrespective of our efforts
  • 24:35to achieve adherence to a manualized intervention
  • 24:39and achieve high rates of fidelity,
  • 24:41we know that we won't always see that intervention
  • 24:45be delivered the same way across sites.
  • 24:48There's drift over time,
  • 24:49there are local adaptations,
  • 24:51but again, more importantly,
  • 24:52we shouldn't try to achieve fidelity
  • 24:55because one version of an intervention
  • 24:57that matches local circumstances in one setting
  • 25:00is not likely to be effective
  • 25:02or match local circumstances elsewhere.
  • 25:04So the adaptability of interventions,
  • 25:08their heterogeneity across place,
  • 25:11but also across time is a challenge.
  • 25:13But we should view it as a strength
  • 25:15that we need to embrace and use to our advantage.
  • 25:19So some of you who are in my generation or have kids
  • 25:22because I believe this game is still sold,
  • 25:24will recognize the image in the upper right hand corner.
  • 25:28And this is the way that I often think
  • 25:29about complex health interventions,
  • 25:31that if we were to watch the very beginning
  • 25:34of the mouse trap contraption
  • 25:36where we drop the marble
  • 25:38and then focus only on the very end
  • 25:41and whether the trap falls or not,
  • 25:44sometimes it will, sometimes it won't.
  • 25:46But that set of empirical observations
  • 25:49doesn't help us at all
  • 25:50in improving the performance of this mouse trap.
  • 25:54We need to follow every step in the causal chain
  • 25:57and understand which part of the contraption
  • 25:59was not built correctly
  • 26:01or where things are going wrong.
  • 26:03So again, the question is not,
  • 26:06is it effective, but how does it work?
  • 26:08And we need to shine our spotlight,
  • 26:11our flashlight and our research attention
  • 26:13in terms of data collection analysis
  • 26:16on the mechanisms of effect.
  • 26:18As I've said, we need to,
  • 26:21rather than try to ignore adaptations or suppress them,
  • 26:25we need to embrace them.
  • 26:26We need to study and guide those adaptations.
  • 26:30The concept of a manualized intervention,
  • 26:32I think for a complex health intervention
  • 26:34requires rethinking.
  • 26:35My favorite example here is a story
  • 26:37that I believe is accurate
  • 26:39of one of the sites
  • 26:42in one of the patient self-management studies
  • 26:47where the patient self-management program
  • 26:51had a highly detailed manualized intervention,
  • 26:55including a very clear script
  • 26:58for the leader of a patient self-management education group
  • 27:02to use in educating members of the group.
  • 27:06And the story is that members of the study team
  • 27:09were observing a leader
  • 27:11deliver the patient self-management program
  • 27:14in an African American church in Baltimore.
  • 27:16And the leader of that program was not following the script.
  • 27:21She was making up the comments
  • 27:25and the educational content as she went along
  • 27:27and the research assistants who were observing
  • 27:30came up to her afterwards
  • 27:31and congratulated her on a successful session,
  • 27:34but said,
  • 27:35"I noticed that you were deviating from the script.
  • 27:38Why is that?
  • 27:39Don't you know that this is an evidence-based intervention?
  • 27:41And if you follow the manual
  • 27:43and follow the script to the letter,
  • 27:45you're guaranteed to see positive outcomes,
  • 27:47but if you deviate from it,
  • 27:49we don't know what sort of outcomes you will observe."
  • 27:52And the leader of the church group said,
  • 27:54"Well, as you know, your manual and your script
  • 27:57was written in Stanford English.
  • 27:59We don't speak Stanford English here.
  • 28:01So I was using language and concepts
  • 28:04and ideas and examples
  • 28:05that I felt were more suitable for my local circumstances."
  • 28:09So that's a somewhat extreme example, of course,
  • 28:12but it does point out that a manualized intervention
  • 28:15typically was developed from a study
  • 28:18at a specific point in time in a specific region,
  • 28:21in a specific set of settings.
  • 28:23And the details of that intervention
  • 28:26might in fact be highly optimal for that particular setting,
  • 28:31but are not likely to be feasible
  • 28:34and certainly not optimal for other settings.
  • 28:37So again, we have to rethink the concept
  • 28:39of manualized interventions.
  • 28:40Similarly, we have to rethink the concept of core components
  • 28:44and the term core components
  • 28:45is used relatively broadly,
  • 28:48but oftentimes it refers to the intervention activities,
  • 28:52the scripts, the tools, the protocols, the procedures.
  • 28:56And again, those tend to be highly idiosyncratic
  • 28:59and often optimized and developed
  • 29:02by and for a specific set of settings
  • 29:07and target audiences.
  • 29:08So the alternative to the concept of core components
  • 29:13and that way of thinking about complex health interventions
  • 29:16is to specify a set of core functions and a menu of forms.
  • 29:20And I'll talk through that in a few minutes.
  • 29:23But let me just briefly point out
  • 29:25that in the implementation field,
  • 29:28and again using guideline adherence as an example,
  • 29:31as I said, we often have very complex,
  • 29:33multi-path, mediated
  • 29:36and highly moderated sorts of causal pathways.
  • 29:40And a typical multi-component guideline adherence program
  • 29:45targeting physicians has to worry
  • 29:47about the physician's attitudes and norms
  • 29:49and try to address them
  • 29:50as well as their knowledge and skill,
  • 29:52as well as their motivation and their activation.
  • 29:55And many of these are influenced
  • 29:59by, again, multiple mediated pathways,
  • 30:03but also some of those causal effects are highly moderated.
  • 30:06We know that a financial incentive to follow the guideline
  • 30:11that consists of a $20,000 bonus
  • 30:14is likely to be highly effective
  • 30:16for a junior family physician
  • 30:19with an income in the $150,000 range.
  • 30:22But for the senior surgeon
  • 30:25with a multimillion dollar income
  • 30:28who knows how to practice and doesn't need the guidelines,
  • 30:33that bonus is not likely to have much effect.
  • 30:35So again, highly heterogeneous effects
  • 30:39in complex causal pathways that we need to understand.
  • 30:44So let me again, as an aside,
  • 30:46briefly present the PCORI method standards
  • 30:49for complex health interventions.
  • 30:51I actually won't talk about these,
  • 30:53but there is both a PCORI methodology report
  • 30:57that provides some supportive detail
  • 30:59as well as an article that came out in JGIM
  • 31:02several months ago
  • 31:03that discusses each of these in more detail.
  • 31:06But it's the issue of core functions
  • 31:08that I wanted to talk about for a bit.
  • 31:12And again, the underlying motivation
  • 31:13is the fact that complex interventions can be adapted.
  • 31:17They will be adaptive irrespective of our efforts
  • 31:20to achieve fidelity,
  • 31:22but more importantly they should be adapted.
  • 31:25Now I'll often say adaptation happens.
  • 31:27We should embrace it and study it and ultimately guide it.
  • 31:30We should not be trying to ignore or suppress it.
  • 31:33So the concepts of core functions and forms
  • 31:37were introduced by Penelope Hawe
  • 31:38a good 15 years ago
  • 31:41without a whole lot of attention
  • 31:43and follow-up activity in the intervening years
  • 31:46until relatively recently
  • 31:48where researchers who study complex health interventions,
  • 31:51implementation strategies,
  • 31:53health promotion programs began to realize
  • 31:55that they have a lot of relevance and value.
  • 31:58And this is a short list of publications.
  • 32:03There actually are many more
  • 32:04just within the last year or two
  • 32:07that have applied concepts of core functions and forms.
  • 32:11So forms are the specific detailed activities.
  • 32:16So if we think about physical activity as a broad category,
  • 32:19walking, running, swimming
  • 32:20are all examples of physical activity.
  • 32:23And the argument is that our manualized intervention
  • 32:27should not specify 20 minutes of walking,
  • 32:31but instead should specify physical activity.
  • 32:34In the case of patient education,
  • 32:36the underlying core function again
  • 32:38is to educate patients and their parents.
  • 32:40The different forms we can use are listed here.
  • 32:44And again, selecting a form
  • 32:46that matches the particular features of the target audience
  • 32:50is important in increasing fidelity.
  • 32:53So we should not be providing a script,
  • 32:55a strict script,
  • 32:57but instead laying out the goals of the education
  • 33:01and providing a menu of different strategies
  • 33:03for achieving those goals through different kinds of forms.
  • 33:09So I won't spend a lot of time on this,
  • 33:12but encourage those of you interested
  • 33:14to both look at the articles
  • 33:15as well as these slides in more detail.
  • 33:17But the general approach that we advocate
  • 33:21is to think about, again, a set of core functions
  • 33:26and think through all the different kinds of forms
  • 33:29that might be available
  • 33:31to operationalize those core functions.
  • 33:34And then, also think about how to decide
  • 33:36which form to select,
  • 33:38and that's the purpose of research.
  • 33:40In addition to identifying
  • 33:42and describing the core functions,
  • 33:44also to provide guidance for the local tailoring,
  • 33:48which item from the menu is optimal for a particular setting.
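As a purely hypothetical illustration of that approach (the function names, forms, and selection rule below are invented for this sketch, not taken from the talk or from any published intervention), a "core functions plus menu of forms" specification can be represented as a lookup that pairs each function with its candidate forms and a tailoring rule for choosing among them:

```python
# Hypothetical sketch of a "core functions, menu of forms" specification.
CORE_FUNCTIONS = {
    "educate patients and parents": {
        "forms": ["group class", "one-on-one counseling",
                  "printed materials", "text-message series"],
        # Illustrative tailoring rule: match the form to local staffing context.
        "select": lambda ctx: "text-message series" if ctx.get("low_staffing")
                  else "group class",
    },
    "convey professional norms": {
        "forms": ["audit and feedback in a departmental meeting",
                  "individual feedback memo"],
        # Illustrative rule: prefer the group setting, which carries the norm function.
        "select": lambda ctx: "audit and feedback in a departmental meeting",
    },
}

def tailor(site_context):
    """Pick one form per core function for a given (hypothetical) site context."""
    return {fn: spec["select"](site_context) for fn, spec in CORE_FUNCTIONS.items()}

print(tailor({"low_staffing": True}))
```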
  • 33:53Now we know that if we think only
  • 33:55about core components or activities,
  • 33:58we often can go down the route
  • 34:02of modifying those forms or components
  • 34:04in what appears to be a very minor way,
  • 34:06but in fact completely eliminate
  • 34:09achievement in one of the core functions.
  • 34:11So if you think about drug detailing or academic detailing,
  • 34:16that academic detailing interaction
  • 34:18conveys information and education and knowledge,
  • 34:22but also conveys professional norms.
  • 34:24So that's an activity or a form
  • 34:26that actually operationalizes two core functions.
  • 34:30Audit and feedback is another example.
  • 34:32Audit and feedback conveys information,
  • 34:35but it also conveys professional norms
  • 34:37and leadership expectations.
  • 34:39And if you're focused only on the information function,
  • 34:43you could easily decide,
  • 34:45rather than convey the audit and feedback information
  • 34:48in a departmental meeting,
  • 34:50to convey that information via memo.
  • 34:53If you do that, you weaken the professional norm function.
  • 34:57One of the advantages of an audit and feedback session
  • 34:59that's conducted in the departmental meeting
  • 35:02is the physicians have an opportunity
  • 35:04to talk about the guideline,
  • 35:06the performance metrics or benchmarks
  • 35:08and the variation of performance,
  • 35:11and ideally help convince each other
  • 35:13that maybe there is room for improvement.
  • 35:15Whereas if you receive your own performance via a memo,
  • 35:19you're not likely to be influenced in quite the same way.
  • 35:23One final example,
  • 35:24and that is quality improvement collaboratives
  • 35:27where we know that having a multidisciplinary team
  • 35:31and ensuring that the collaborative that's focusing,
  • 35:33for example, on high contamination rates
  • 35:40in the OR,
  • 35:42that the QI team needs not only surgeons and nurses,
  • 35:45but also members of the housekeeping staff
  • 35:49because they collectively will develop
  • 35:52a better understanding of the root causes
  • 35:55of those infection rates of contamination
  • 35:58than the physicians alone.
  • 36:00But the other core function of that multidisciplinary team
  • 36:04is the focus on acceptability
  • 36:06of the findings and the recommendations.
  • 36:09If a QI team consisting only of surgeons comes out
  • 36:12and says that the high infection rates
  • 36:14are due to the fact that the housekeeping staff
  • 36:16are not wiping down the walls properly,
  • 36:18you can be sure that the housekeeping staff
  • 36:20are going to discount that
  • 36:23because they know that they do their job properly
  • 36:25and in their minds the problem
  • 36:27is that the hand washing practices
  • 36:29of the surgeons are deficient.
  • 36:32So again, it's a single component
  • 36:35or feature of intervention
  • 36:37that operationalizes two different core functions.
  • 36:40And understanding those core functions
  • 36:43allows us to avoid making mistakes
  • 36:45when we modify the intervention activity
  • 36:48in a way that we think may be minor,
  • 36:50but again can completely eliminate its ability
  • 36:54to successfully carry out one of the core functions.
  • 36:59So I think I've covered each of these already,
  • 37:01but let me walk through them quickly.
  • 37:02Again, a manualized intervention,
  • 37:04in the highly detailed way that we typically think of it,
  • 37:07is in fact in many cases more likely to do harm than add value.
  • 37:14Core components should be replaced by core functions.
  • 37:17There are many implications of that rethinking,
  • 37:19one of which of course is that a measurement of fidelity
  • 37:22is not a measurement of whether you followed the script,
  • 37:25but instead whether you successfully operationalized
  • 37:28or carried out the core function.
  • 37:30I've already talked about the fact
  • 37:32that main effect estimates are not very helpful
  • 37:35as evidence in the way we typically think of it.
  • 37:37And again, it gets back to the point
  • 37:39I've made a couple of times
  • 37:40on the need to rethink the purpose of our research.
  • 37:44So let me wrap up with just a few more slides
  • 37:48that list some of the kinds of analytic approaches
  • 37:51and research approaches that we need to be leveraging
  • 37:55in order to, again, shine our flashlight
  • 37:58on the processes and the mechanisms of effect
  • 38:01rather than outcomes.
  • 38:03These are some of the quantitative methods,
  • 38:06qualitative comparative analysis is becoming more popular.
  • 38:10There are also, of course,
  • 38:11a number of qualitative methods as well.
  • 38:14Process evaluation, theory-based evaluation
  • 38:17and the continued emergence
  • 38:20and illustrations of approaches to adaptation.
  • 38:24Here are some examples of publications
  • 38:26that are now quite dated that illustrate
  • 38:30and talk about some of these approaches
  • 38:32for measuring and taking into account context
  • 38:36for examining moderator effects and mediator effects
  • 38:39and mechanisms of effect.
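For the moderator and mediator analyses mentioned here, a common quantitative starting point is an interaction term in a regression model. The sketch below uses simulated data and hypothetical variable names purely to illustrate the mechanics, not any particular study's analysis: the interaction coefficient captures whether a contextual factor changes the size of the intervention effect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),     # 1 = intervention site (hypothetical)
    "leadership": rng.normal(0, 1, n),  # hypothetical contextual moderator
})
# Simulated outcome: the intervention mainly helps where leadership support is high.
df["adherence"] = (50 + 2 * df["treat"] + 3 * df["leadership"]
                   + 6 * df["treat"] * df["leadership"] + rng.normal(0, 5, n))

# 'treat * leadership' expands to both main effects plus their interaction;
# the treat:leadership coefficient is the moderation (effect modification) term.
model = smf.ols("adherence ~ treat * leadership", data=df).fit()
print(model.params)
```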
  • 38:42Here's some examples of implementation studies
  • 38:45that have embraced and studied adaptation
  • 38:49rather than suppressing it or ignoring it.
  • 38:52Theory-based evaluation, realistic evaluation,
  • 38:55again, are relatively new
  • 38:57or underutilized approaches in the qualitative realm
  • 39:01to study mechanisms of effect.
  • 39:04There's still a lot of development work
  • 39:06to be done, in my view, in these methods
  • 39:08to get to the level of transparency
  • 39:10and reproducibility that we need,
  • 39:14but valuable approaches.
  • 39:17And then, if we look outside
  • 39:19the typical conventional toolkit
  • 39:22to some of the other approaches
  • 39:23such as the statistical process control methods
  • 39:26that our improvement science colleagues use,
  • 39:29as well as others,
  • 39:31these represent other approaches.
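The statistical process control methods referred to here are simple enough to sketch. The example below uses hypothetical monthly counts, not real data, to compute the center line and three-sigma limits of a p-chart, the kind of control chart improvement teams use to separate special-cause change from common-cause variation.

```python
import numpy as np

# Hypothetical monthly counts: guideline-adherent cases out of eligible cases.
adherent = np.array([42, 45, 40, 48, 44, 47, 43, 55, 58, 60, 61, 63])
eligible = np.array([80, 82, 78, 85, 80, 83, 79, 84, 82, 85, 83, 86])

p = adherent / eligible                  # monthly proportions
p_bar = adherent.sum() / eligible.sum()  # center line for the p-chart
sigma = np.sqrt(p_bar * (1 - p_bar) / eligible)
ucl = p_bar + 3 * sigma                    # upper control limit per month
lcl = np.clip(p_bar - 3 * sigma, 0, None)  # lower limit, floored at zero

for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
    flag = "  <- outside limits (special cause?)" if (pi > hi or pi < lo) else ""
    print(f"Month {month:2d}: p={pi:.2f}  limits=({lo:.2f}, {hi:.2f}){flag}")
```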
  • 39:33So just to wrap up,
  • 39:35when we study complex health interventions,
  • 39:38again, we need to begin by identifying the core functions
  • 39:41and developing the menu of forms.
  • 39:43Ideally, our research would validate
  • 39:47our list of core functions
  • 39:48or allow us to revise it
  • 39:50so that we understand all the functions
  • 39:51that need to be included
  • 39:53and also provide evidence that guides the local tailoring.
  • 39:59It would be documented
  • 40:01in a set of adaptation or tailoring algorithms.
  • 40:04And the bottom line, again, is a goal of understanding
  • 40:07how complex interventions achieve their effects
  • 40:10and how to modify them
  • 40:11rather than pursuing the simpler question
  • 40:14of whether they are effective.
  • 40:16So I will stop there and open up
  • 40:19for what I hope will be some robust discussion
  • 40:23and comments and questions.
  • 40:30<v ->Thanks, Brian.</v>
  • 40:32That was a really kind of interesting and important talk
  • 40:36at sort of the cutting edge
  • 40:37of where implementation science is today,
  • 40:41and with a lot of information packed in
  • 40:43and concepts and things like that
  • 40:46for us to think about
  • 40:47and potentially absorb into our own work.
  • 40:51So maybe I might start with this first question
  • 40:55which is...
  • 40:57Actually, I'll even integrate something else
  • 41:00that probably is worth both of us mentioning,
  • 41:01which is that Brian is actually the co-founder and director
  • 41:05of the Multilevel Training Institute
  • 41:07that's offered every year in collaboration with NCI.
  • 41:11And Raul Hernandez-Ramirez,
  • 41:13who's one of our CMIPS primary faculty
  • 41:16is a graduate of that institute.
  • 41:17And I'm actually one of the instructors
  • 41:20teaching about the analysis of multilevel interventions.
  • 41:25But the question is,
  • 41:27my thought and I think a lot of us out here think
  • 41:30that the reason,
  • 41:31sort of the opposite of what you said,
  • 41:34that the reason we like multilevel interventions
  • 41:36is because it seemed like the medical model
  • 41:39of isolating one,
  • 41:41what in the past we would've called a component
  • 41:44and maybe right now you might call a function,
  • 41:47and studying it and holding everything else constant,
  • 41:51these sorts of implementation studies
  • 41:53have been disappointing.
  • 41:55And so, the thought was that actually,
  • 41:59first of all, it's totally unrealistic in real life.
  • 42:01You don't just have one thing.
  • 42:03All of these public health interventions are complex,
  • 42:06whether we choose to study them or not.
  • 42:08And so, the idea then evolved
  • 42:11to that it might make sense
  • 42:14to intervene on an entire,
  • 42:16sometimes we might say package of components,
  • 42:18which now you're kind of redefining
  • 42:21as package of forms maybe.
  • 42:24Or maybe it's package of functions
  • 42:26and then the forms are the specific ways
  • 42:29that the functions can be implemented.
  • 42:33If I caught all of that very quickly, I think I did,
  • 42:35and I think we're somewhat familiar
  • 42:36with this idea in our center as well.
  • 42:39So that would strengthen
  • 42:41the ability to see an impactful intervention
  • 42:45and allow us to translate
  • 42:48into practice evidence-based interventions as a whole.
  • 42:53But now you're saying
  • 42:54that because of the adaptations
  • 42:57and the variability and the heterogeneity,
  • 42:59actually these kinds of approaches also
  • 43:02are giving weak results.
  • 43:04So I'm just wondering if you can comment on that.
  • 43:07<v ->Sure.</v>
  • 43:08No, I would concur with everything that you've said
  • 43:11and I think the main point is that,
  • 43:15it's really that the argument that one size doesn't fit all.
  • 43:18But to begin with, I do agree that,
  • 43:20for most of these kinds of problems,
  • 43:22the barriers are multi-component and multi-level,
  • 43:25and we definitely need multi-level,
  • 43:27multi-component complex health interventions.
  • 43:30The simple example goes back to the studies
  • 43:32in the 1970s and 1980s of CME
  • 43:36as a method for improving physician practices.
  • 43:39And the dominant finding from that body of research
  • 43:41was physician knowledge and education changed
  • 43:44and sometimes physician attitudes changed,
  • 43:47but practices didn't change at all
  • 43:49because the practices are held in place
  • 43:51by multiple barriers.
  • 43:52And if you don't provide the equipment
  • 43:54or the staff support or the time
  • 43:56or as I said, work on the patient resistance,
  • 43:59no amount of educating physicians is likely
  • 44:02to lead to the outcomes that we want.
  • 44:05And the causal diagram that I showed
  • 44:07again is an example of that.
  • 44:09If we focus on only one of those causal pathways,
  • 44:12we leave the others untouched.
  • 44:14So the point though is that we absolutely do need
  • 44:19multi-component, often multi-level interventions.
  • 44:23The issue though is the need to adapt and tailor them
  • 44:26and the same mix of components
  • 44:28or the same mix of forms and activities
  • 44:31that is highly effective in one setting
  • 44:33is not likely to be effective elsewhere.
  • 44:35And when we take a complex health intervention
  • 44:38and we try to scale it and spread it,
  • 44:41or we move from efficacy research to effectiveness research,
  • 44:45we are often disappointed in the findings.
  • 44:48And that is because of the erroneous belief
  • 44:52that one size fits all
  • 44:53and that a so-called evidence-based practice
  • 44:56is going to be evidence-based and robust
  • 44:58and effective across multiple settings.
  • 45:00It has to be tailored
  • 45:02and we as researchers have to guide that tailoring.
  • 45:07<v ->Thanks.</v>
  • 45:08I have lots of questions,
  • 45:09but I don't wanna hog the time.
  • 45:11So we have lots of people on here.
  • 45:14Do others have any questions they'd like to ask?
  • 45:17I think you can simply unmute yourself
  • 45:19and ask your question.
  • 45:26<v ->Sure, thank you for the wonderful talk.</v>
  • 45:29I am an investigator working a lot
  • 45:30in low and middle income countries
  • 45:32and I was interested at the beginning of your talk
  • 45:34when you were using the term impact.
  • 45:36I think it's often used as a synonym for effectiveness.
  • 45:40But I think sometimes with public health
  • 45:42or even population health interventions,
  • 45:44we're thinking about numbers of people that can be served.
  • 45:47I think this is particularly relevant
  • 45:48if you think about communicable diseases
  • 45:50because there may be indirect benefits for addressing that.
  • 45:52And so, I guess that's sort of a very general question,
  • 45:56just curious if you've encountered that,
  • 45:57but maybe the more specific question
  • 45:59related to your research would be,
  • 46:03if volume then is kind of really important
  • 46:05about how we deliver interventions,
  • 46:07how do you think about that
  • 46:09with regard to understanding the fidelity?
  • 46:11Are we looking at sort of the contextual factors
  • 46:14related to how many people are served
  • 46:17maybe when we pilot an intervention,
  • 46:19but when we think about taking it to scale,
  • 46:22what are some of the considerations
  • 46:24about kind of understanding the impact of volume
  • 46:26on fidelity and adaptation?
  • 46:29Thank you so much.
  • 46:30<v ->Yeah, so first of all,</v>
  • 46:33I'm a strong fan of the RE-AIM framework
  • 46:37and I think to think
  • 46:38about the different dimensions of impact
  • 46:41and how they relate to one another is critically important.
  • 46:44If we focus only on effectiveness
  • 46:46and ignore the other issues,
  • 46:48the ultimate societal impact
  • 46:50that we are seeking will not be seen.
  • 46:53And I think the heterogeneity
  • 46:56may apply differently across different outcomes
  • 46:59in the kind of approach that we need
  • 47:01in order to engage a high volume
  • 47:04and a high proportion of the target audience
  • 47:07versus the approach that we need to use
  • 47:09to ensure that the intervention is effective
  • 47:12across a large proportion of the target audience
  • 47:16which does have its own heterogeneity in subgroups.
  • 47:21Those may be different.
  • 47:25And this may or may not be an answer,
  • 47:27but at least this is the way that I would think about it,
  • 47:30the vast majority if not all of these studies,
  • 47:32we need to begin with RE-AIM
  • 47:34and explicitly think about all the different dimensions
  • 47:38that contribute to the overall impact and outcomes.
  • 47:42And then, we need to recognize and anticipate
  • 47:45and explicitly address the heterogeneity
  • 47:49across all of those different dimensions
  • 47:52and know that as we again scale up and spread
  • 47:55and adapt and tailor interventions
  • 47:57from one setting to another,
  • 48:01that tailoring and adaptations are likely to be needed
  • 48:05in different ways and different facets of the intervention
  • 48:08in order to ensure that we maximize
  • 48:13outcomes and success
  • 48:14across all the RE-AIM dimensions.
  • 48:17So again, it's just a very different way
  • 48:20of thinking about research and interventions
  • 48:23compared to the typical evidence-based practice
  • 48:26that we develop an intervention,
  • 48:27we can describe it very simply
  • 48:29and we can deploy it anywhere
  • 48:32and we will see the same kinds of results.
  • 48:35And that's not the case,
  • 48:36both because we can't deploy the interventions
  • 48:39as they were designed elsewhere,
  • 48:41and that's especially true of course
  • 48:42when we take US-designed interventions and
  • 48:46try to deploy them in low resource settings
  • 48:48within the US and elsewhere,
  • 48:50but even if we could deploy them
  • 48:52in the same way and implement them,
  • 48:55the effectiveness is likely to vary considerably.
  • 49:01<v ->Thanks so much, that's really interesting.</v>
  • 49:04<v ->Okay, thank you.</v>
  • 49:05That was a good answer.
  • 49:06Luke, do you have a follow-up question?
  • 49:08<v ->Yeah, I mean I would be curious</v>
  • 49:09maybe just to think a little bit
  • 49:12about operationalizing some of these things.
  • 49:14Obviously, you work with one of the premier organizations
  • 49:18about thinking about how to answer these questions.
  • 49:20In terms of Kaiser,
  • 49:23very large health system, many different units,
  • 49:26however you'd wanna define those,
  • 49:27whether those are sites or providers and so on so forth.
  • 49:32I know I'm just curious
  • 49:35about how you think about integrating
  • 49:36quantitative and qualitative data
  • 49:38with respect to certain types of problems of this nature.
  • 49:41Maybe if there's any examples
  • 49:43that you might be able to share from your work in Kaiser.
  • 49:46<v ->So I think that integration is critical.</v>
  • 49:49And this actually relates to the point that Donna raised
  • 49:52about implementation science and improvement science.
  • 49:54So the improvement science folks do accept
  • 49:57and anticipate and address the heterogeneity
  • 50:01and the whole issue of rapid cycle implementation
  • 50:05and improvement
  • 50:07where you do something and you sort of see
  • 50:09what the impacts are and then you refine it.
  • 50:11That's a form of tailoring.
  • 50:13So I think they do recognize the heterogeneity
  • 50:16and that's one way of dealing with it.
  • 50:18But I think that rapid cycle evaluation,
  • 50:20or any kind of evaluation,
  • 50:21requires understanding the mechanisms of effect.
  • 50:25We can only learn so much through mediation analysis
  • 50:29and other quantitative methods.
  • 50:30And if our goal is ultimately
  • 50:32to understand how the world works
  • 50:34and to understand causal pathways
  • 50:36and causal relationships,
  • 50:39we do have to mix the quantitative and qualitative.
  • 50:42And I think all of our projects at Kaiser
  • 50:45that are embedded research projects
  • 50:47that are a synthesis
  • 50:49of quality improvement activity and approaches
  • 50:52where we're trying to improve things in the near term
  • 50:54and implementation science and scientific approaches
  • 50:58where we're trying to generate scientific knowledge,
  • 51:01I think we almost invariably combine
  • 51:03quantitative and qualitative
  • 51:05as a way of, again, trying to understand
  • 51:07how the world works,
  • 51:09try to design the intervention, deploy it,
  • 51:12evaluate it early and often
  • 51:14in order to refine it and tailor it
  • 51:17and ultimately generate
  • 51:18the summative evaluation findings as well.
  • 51:20And I think we all need more guidance
  • 51:23and more examples of how this is done
  • 51:26because there are a lot of moving parts
  • 51:28and a lot of different factors to think of,
  • 51:30not only the RE-AIM multiple dimensions,
  • 51:32but the different kinds of data
  • 51:34and the different ways of understanding
  • 51:37and tracking the mechanisms of effect
  • 51:39and the intermediate or proximal outcomes
  • 51:41in addition to the distal outcomes.
  • 51:44So lots of challenges,
  • 51:45but lots of opportunity for innovation and creativity.
  • 51:52<v ->Anyone else that wants to ask you questions?</v>
  • 51:58While we're waiting to see, I have another question.
  • 52:01So you're probably familiar, Brian,
  • 52:04with Linda Collins' MOST approach
  • 52:07to developing and assessing or testing interventions
  • 52:12and her focus is also with complex multilevel interventions
  • 52:18where there are the three phases.
  • 52:20But in the third phase,
  • 52:22the third phase is kind of a traditional,
  • 52:27fixed.
  • 52:28You call it manualized,
  • 52:29it doesn't necessarily have to just be a manual,
  • 52:31it could be other things,
  • 52:33but fixed set of components at certain levels
  • 52:38and that's kind of tested in a standard RCT-type approach.
  • 52:42And I'm wondering if you think
  • 52:44that the MOST design is useful in certain settings
  • 52:49or do you think that maybe that kind of approach
  • 52:53has kind of seen its better days
  • 52:55because of the fact that it doesn't take into account
  • 52:59sort of the contextual aspects and the need for adaptation.
  • 53:03But I know among other circles,
  • 53:04the MOST design is very popular
  • 53:07and even we've had training in our center in MOST
  • 53:10and I'm currently discussing
  • 53:13a possible grant application with some investigators here
  • 53:16who'd like to use MOST.
  • 53:17So I'm just wondering what your thinking is about that.
  • 53:20<v ->Yeah, so I think that approach is highly valuable,</v>
  • 53:24as are our standard RCTs,
  • 53:26as long as we recognize that they need to be augmented
  • 53:30with additional kinds of data collection and analysis activities
  • 53:34that try to understand the mechanisms of effect.
  • 53:37But certainly and when I say
  • 53:40that RCTs and a focus on impact and outcomes
  • 53:44are not what we should be doing,
  • 53:48that's probably too extreme.
  • 53:49I think we obviously need to measure outcomes
  • 53:52and we need to use traditional quantitative methods,
  • 53:55but we need to augment them
  • 53:57and have an equal, in some cases greater focus,
  • 53:59on the mechanisms of effect.
  • 54:01And certainly, to try to get a handle
  • 54:03on some of the heterogeneity
  • 54:04and some of the factors using MOST
  • 54:06and other approaches that Linda and others
  • 54:09have developed and advocated
  • 54:11I think is quite important and valuable.
  • 54:14I just think that they're incomplete
  • 54:15and we need to make sure
  • 54:16we, again, have the accompanying process evaluation
  • 54:20and mediation analysis and others.
  • 54:23<v ->That makes sense.</v>
  • 54:26Anyone else?
  • 54:28Okay, so we have a question in the chat.
  • 54:33What are some other frameworks
  • 54:35you would consider reviewing
  • 54:37when considering a multi-component and multi-level study?
  • 54:40So presumably this question is other than RE-AIM.
  • 54:45<v ->Sure.</v> <v ->I'm guessing</v>
  • 54:46and if the person who asked it would like to elaborate
  • 54:48or if this is totally clear to you,
  • 54:50Brian. <v ->Yeah,</v>
  • 54:51let me give it a quick shot
  • 54:53and then you can elaborate if I'm missing a point.
  • 54:55But I think that CFIR of course,
  • 54:58which continues to be the go-to high level framework
  • 55:02is a way of identifying
  • 55:03all the different categories of factors.
  • 55:05So multi-component, multi-level interventions
  • 55:08address multiple sources of barriers and factors
  • 55:13that influence the outcomes that we're interested in.
  • 55:16And CFIR is, in my mind, the best organizing framework
  • 55:19that gives us that sort of 50,000 foot level.
  • 55:22Then when we've identified
  • 55:23the different categories of factors,
  • 55:27we may need to bring in accompanying frameworks.
  • 55:30So, behavior change wheel,
  • 55:31theoretical domains framework is quite useful in identifying
  • 55:37some of the physician-level behavioral factors.
  • 55:40So I think it does depend on what CFIR tells us.
  • 55:44If many of the barriers and influences
  • 55:46are regulatory or community,
  • 55:48then we may need to bring in political science frameworks
  • 55:51or other bodies of theory.
  • 55:53But we start with CFIR to get sort of the lay of the land
  • 55:56and then we identify frameworks for subsets of factors.
  • 56:00And to me, RE-AIM, and I think to most of us,
  • 56:02is more of an evaluation framework.
  • 56:04It doesn't really give us the theory,
  • 56:08but it directs our attention
  • 56:09to the different categories of outcomes
  • 56:11that we need to take into account
  • 56:14and measure and attempt to improve.
  • 56:18<v ->Great, that was a very clear answer.</v>
  • 56:21Thank you so much.
  • 56:22And we're at time, so I think we'll just thank our speaker.
  • 56:25Hopefully we'll see him again in person
  • 56:27sometime in the near future
  • 56:29and we look forward to some of our one-on-one meetings today
  • 56:33and in a few of the subsequent days.
  • 56:35So thank you so much, Dr. Mittman.
  • 56:38<v ->Okay, thank you all.</v> <v ->Bye everybody.</v>
  • 56:39<v ->Okay, bye-bye.</v>