Using Rapid Qualitative Research Techniques to Identify Factors Impacting the Pace of Implementation
February 14, 2024
Speaker: Traci Abraham, PhD
Wednesday, November 3, 2021
Information
- ID
- 11310
Transcript
- 00:00<v ->Reductions in criteria.</v>
- 00:05Attempts and ideation,
- 00:06including a one- and two-year follow-up.
- 00:12And here's just an example.
- 00:14<v Donna>Oh, I'm sorry, Traci. This is Donna Spiegelman.</v>
- 00:16I raised my hand- <v ->Oh.</v>
- 00:18<v Donna>Through the screen.</v>
- 00:20But I'm not sure how that's monitored,
- 00:22or if you can see whether people are raising their hands.
- 00:26But I had a question,
- 00:27if you could go back to the previous slide.
- 00:30<v ->Absolutely.</v> <v Donna>Just starting with</v>
- 00:31sending patients who are suicidal,
- 00:33and you mentioned something about screening for them.
- 00:36Could you say a little bit about how that's done?
- 00:38'Cause that can be a very big part of a project like this,
- 00:43especially when it's, like, scaled up.
- 00:46<v ->Absolutely, so in the VA, they have implemented</v>
- 00:49the Columbia Screener in emergency departments.
- 00:53And this is being implemented in the ED, by the way.
- 00:56So they are screened when they present
- 01:00to emergency departments using the Columbia.
- 01:03And that's already been in place,
- 01:06and we are building off of that.
- 01:11<v Donna>So every person who goes to a VA emergency room</v>
- 01:14as part of the standard of care,
- 01:17they're getting this screener?
- 01:20<v ->Absolutely.</v> <v Donna>And then its result</v>
- 01:22is entered into the computer? <v ->Absolutely. Yep.</v>
- 01:28Lots of screening at the VA.
- 01:29They get screened for all sorts of things.
- 01:35<v Donna>Okay. Thank you.</v> <v ->I can't see if anyone</v>
- 01:37raises their hand.
- 01:39Just throw something at me if that happens again.
- 01:42Okay, so these are examples of Caring Contacts
- 01:47that we adapted
- 01:52for veterans, of course
- 01:54with feedback from veterans.
- 01:59There is no Ms. Squirrel, of course.
- 02:01That's just the example.
- 02:02So you can see very non-demanding.
- 02:05We value your health and are honored to serve you,
- 02:07and it should say in the Emergency Department.
- 02:11And then they also get birthday cards,
- 02:13and they get another card on Veterans Day.
- 02:20Of course, it's always helpful
- 02:23to have a framework borrowed from implementation science
- 02:26if you want to successfully implement something.
- 02:29And our framework is the i-PARIHS Framework.
- 02:33So the i-PARIHS has these different dimensions, right?
- 02:36And it hypothesizes that
- 02:40it's this magical combination of context,
- 02:43innovation and recipient characteristics and qualities
- 02:48when they are joined with facilitation
- 02:53that results in successful implementation
- 02:55of a practice in a clinical context.
- 03:01And in keeping with that theoretical approach,
- 03:04we are using facilitation, the implementation strategy
- 03:08of external facilitation actually,
- 03:11to help implement Caring Contacts in the ED.
- 03:15And this is our study design.
- 03:17We bring the sites on in waves.
- 03:20Obviously you don't want to bring all 28 sites on at once.
- 03:25So well, what was intended, what was planned is
- 03:29that they would get six months of facilitation support
- 03:34and we would bring them in four waves.
- 03:37COVID hit.
- 03:39It's taking a bit longer for some sites
- 03:40to implement than we thought it would.
- 03:43And we're actually moving into a fifth wave
- 03:45I think fairly soon.
- 03:48So things have changed as they often do
- 03:51in research in the real world.
- 03:56It's a mixed methods evaluation.
- 03:58I won't cover the quantitative part,
- 04:00because frankly we could spend all day just talking
- 04:03about the entire approach to getting this project
- 04:06off the ground.
- 04:07We'll just talk about the qualitative eval.
- 04:11And the overarching aim is really to identify
- 04:14the contextual recipient and innovation factors
- 04:17that impact implementation, and that's pretty easy.
- 04:22The challenge that was put to me was to understand how barriers
- 04:27and facilitators impact implementation as a process.
- 04:33And that was reiterated to me multiple times,
- 04:37implementation as a process over and over again.
- 04:44And so I had to think of how exactly would I go
- 04:47about understanding how barriers
- 04:51and facilitators impact this process.
- 04:55'Cause of course we collect data at discrete time points,
- 05:00and what you have is a window into one time point.
- 05:03So how do you move that into a process?
- 05:10For the part of the evaluation that I'll be talking about,
- 05:16what we're doing is leading 30 minute dyadic debriefs
- 05:20with a team of external facilitators.
- 05:23They're biweekly until the first Caring Contact is sent.
- 05:29And then once the site moves into sustainment,
- 05:32we switch to monthly debriefs.
- 05:36As of today, we've led 100, as of September,
- 05:39I should say September 10, we had about 169.
- 05:43I think we're close to about 180 at this point.
- 05:48And then we're also leading interviews
- 05:50with stakeholders and veterans,
- 05:52but I won't be talking about that moving forward.
- 05:56These are what the debriefs look like,
- 05:58or at least a snapshot of part of a debrief.
- 06:02It's the typical sort of probing around,
- 06:05what are the barriers and facilitators.
- 06:08And of course as we move forward because they're biweekly,
- 06:12they change, and they also change significantly
- 06:17depending on the time point.
- 06:19So we have a certain debrief that we used to collect data
- 06:22while they're actively implementing.
- 06:26And then we have another one when
- 06:27the implementation plan has been finalized.
- 06:32And then we have another one that we bring in
- 06:35after the first Caring Contacts have been sent
- 06:38to understand what is going on
- 06:42in these clinics while they're trying to implement.
- 06:50To meet this challenge
- 06:52of approaching implementation
- 06:56as more of a process rather than as discrete time points,
- 07:02I have chosen really a two-part approach
- 07:06to analyzing our data.
- 07:09And that's, first, we,
- 07:14the qualitative researchers
- 07:17who are leading the debriefs,
- 07:20very quickly record key implementation events
- 07:26and other important things that are happening
- 07:28during implementation in these brief narrative summaries,
- 07:32which I translate into data visualizations
- 07:36that help us understand where the process
- 07:39of implementation progressed and where it stalled.
- 07:45Secondly, we are templating each debrief
- 07:50using individual templates
- 07:53which essentially organize and reduce our data.
- 07:58Then we are synthesizing individual templates
- 08:02into site matrix displays that further reduce the data
- 08:07and help me make comparisons
- 08:09across time points and across sites.
- 08:13And these I'm translating into case studies.
- 08:17That's how I'm presenting the findings
- 08:19that come from the matrices.
- 08:24I don't wanna go through the analytic approach
- 08:28one by one because I'm worried
- 08:32that we won't get through the whole presentation in time.
- 08:35But what I really want to emphasize to everyone on here
- 08:40is that time was actually the least important reason
- 08:44for me to choose rapid analytic approaches for this project.
- 08:49I knew that we would have enough time to analyze the data.
- 08:53It was really more about the research questions, the goals
- 08:57and the challenges collaborating across sites.
- 09:02Two of my analysts are located,
- 09:06sorry, one of my, no, two of them.
- 09:09One is in Iowa City and one is in California.
- 09:12And sharing analytic programs in the VA
- 09:16can get really problematic.
- 09:19What templating allows us to do is get around those problems
- 09:23because templates are developed in Word documents
- 09:26and that's much easier in the VA.
- 09:30So that was really honestly my primary concern
- 09:33is if this thing stalls because we run into problems
- 09:37with the program and sharing across sites,
- 09:40we are in a lot of trouble.
- 09:42That was the primary consideration
- 09:45rather than the timeframe for this study.
- 09:50And to emphasize that this is a team effort,
- 09:55this is how we manage every step in the analytic process.
- 09:59You can see the debrief on the left.
- 10:03This is for one site, 695,
- 10:07and then each person along the way
- 10:09has their own role, right?
- 10:10It's audio recorded by the person leading the debrief,
- 10:14then transcribed, and then there are these different steps
- 10:18in the analytic process.
- 10:19It's first templated.
- 10:22So each debrief is analyzed independently.
- 10:26And then I begin bringing those data
- 10:30into the matrix to synthesize it.
- 10:33And what often happens is I have questions, I have comments
- 10:37and using the track changes
- 10:39and the comments feature of Word,
- 10:41I can create this dialogue with that primary analyst
- 10:45to help me understand what's going on in that template
- 10:48before I move it into the matrix.
- 10:50And then once we've gotten through that process,
- 10:54then I can indicate that the matrix is complete.
- 10:57But it is a process and it takes a heavy lift,
- 11:00a lot of collaboration, a lot of coordination.
- 11:08Earlier I spoke of these ongoing brief narrative summaries.
- 11:12These are where the researchers, immediately
- 11:15after they've led a debrief,
- 11:19record their observations, and they need these actually
- 11:22not for analysis so much
- 11:24except for the key implementation events,
- 11:27which I bolded in this for you.
- 11:31What they really need it for
- 11:32is to remind the external facilitators
- 11:35of what they talked about during the prior debrief,
- 11:39because they're doing so much work across so many sites
- 11:42that people just can't always remember
- 11:44what they said the time before.
- 11:46So it serves a dual purpose
- 11:49of recording these key implementation events
- 11:52while helping team qual
- 11:55to keep the facilitators on track.
- 12:02And then we have templates.
- 12:04For those of you who haven't heard of template analysis,
- 12:08again, they're just Word documents that you use
- 12:10to summarize and organize content
- 12:14from individual interviews or focus group discussions.
- 12:20They can be theoretically informed
- 12:21as the one I'm going to show you has been.
- 12:24Or they can be goal-oriented, meaning, you know,
- 12:28just let's say you want to adapt
- 12:35a program that is delivered via internet
- 12:40to veterans and you need to collect data
- 12:44that will help you adapt that program.
- 12:46So you might just organize your template, instead
- 12:50of using theoretical domains or categories,
- 12:54around things like look-and-feel changes or recommendations,
- 12:59or language recommendations if they don't
- 13:02like the language that's used.
- 13:05So they focus analysis.
- 13:07You generally start with some deductive domains
- 13:10and categories while permitting discovery.
- 13:14And you'll understand that
- 13:15a little bit better in just a second.
- 13:18Most importantly from my perspective
- 13:21is that when you're an anthropologist working
- 13:23in implementation science, templates can provide
- 13:27this common language between you
- 13:31and the other people on your team
- 13:34that can help you communicate with them
- 13:37and they can help you translate your findings
- 13:39much more efficiently down the road,
- 13:42whether it's in presentations or publications
- 13:45or in improving a process.
- 13:49So this is an example of what I call a master template.
- 13:53And it's just a template that everyone goes into
- 13:56and makes a copy of before they begin analysis.
- 14:02This one has some deductive categories in black,
- 14:07and you can see the inductive categories,
- 14:11domains, here you go, domains from the i-PARIHS,
- 14:15categories and subcategories over here.
- 14:18The ones that are in blue are subcategories
- 14:22that we have developed inductively
- 14:25while analysis is ongoing.
- 14:28So you begin deductive and you build in
- 14:33your inductive categories and subcategories as you proceed.
- 14:37You bring them into the master.
- 14:39And then when it's time for the next episode,
- 14:44if you would, data collection episode
- 14:47to be templated, they have it here.
- 14:50They make a copy of the master and away they go.
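As a rough illustration of the master-template workflow just described (this sketch is not from the talk), here is a minimal Python example that generates a master template skeleton as a Word document using python-docx; the domain and category names are hypothetical stand-ins for the i-PARIHS structure, and the file name is assumed.

```python
# A minimal sketch of a "master template" as a Word document.
# Domain and category names below are illustrative stand-ins only.
from docx import Document

# Deductive structure: i-PARIHS domains, each with starter categories.
# Inductive subcategories get added back into this master as analysis proceeds.
MASTER = {
    "Context": ["Leadership engagement", "Clinic culture"],
    "Innovation": ["Fit with workflow", "Adaptability"],
    "Recipients": ["Motivation", "Skills and knowledge"],
    "Facilitation": ["External facilitator support"],
}

def build_master_template(path="master_template.docx"):
    doc = Document()
    doc.add_heading("Master Template (copy before templating a debrief)", level=1)
    for domain, categories in MASTER.items():
        doc.add_heading(domain, level=2)  # i-PARIHS domain
        for category in categories:
            # Analysts paste reduced excerpts under the relevant category,
            # then delete any categories that did not emerge in that debrief.
            doc.add_paragraph(category, style="List Bullet")
    doc.save(path)

if __name__ == "__main__":
    build_master_template()
```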
- 14:54And this is what a template looks like
- 14:56after it's been completed.
- 14:59You have your domains from the i-PARIHS, again,
- 15:05some more categories and subcategories.
- 15:09For this one, you can see they don't have as many
- 15:11categories and subcategories, 'cause when you don't use them
- 15:16for any particular template, you delete them off.
- 15:21You only keep the ones that emerged
- 15:24during that data collection episode.
- 15:27And then you have your content from your debriefs
- 15:30or your interviews or your focus groups.
- 15:32However it is that you are collecting data.
- 15:36So you go into your transcript
- 15:38and literally just copy and paste, and often as you can,
- 15:43I think you can probably see my pointer.
- 15:45Can you? Yes.
- 15:49You need to get a little bit creative, right?
- 15:51<v Participant>Yes, we can see the pointer.</v>
- 15:52<v ->Great, thank you.</v>
- 15:54We all know what qual interviews are like.
- 15:57If it's a good qualitative interview, you ask a question
- 16:00and people talk and talk and talk.
- 16:04And so you have to reduce those data.
- 16:07If you're putting, you know, oceans of words
- 16:10on your template, your template can't tell you a story,
- 16:14because you'll be drowning in those words.
- 16:18Right, the beauty of a template is
- 16:20that it reduces those data for you.
- 16:23And what you see, look at this,
- 16:25what you see is this beautiful coherent story
- 16:28of one data collection episode of what was happening
- 16:32at that site, at that point in time.
- 16:36The barriers and facilitators that either helped
- 16:39or hindered implementation of this particular practice,
- 16:43in this particular clinic, in this particular point in time.
- 16:49So delete off everything that didn't apply,
- 16:52keep everything that did, reduce it down to its bare essence
- 16:59and it tells you this beautiful coherent story.
- 17:04In the hands of a skilled analyst, of course.
- 17:06It does take time to pick up these skills.
- 17:10And then after this, what do you do, right? Okay, great.
- 17:13So this is what happened at this clinic
- 17:16at this particular point in time.
- 17:18Now what do you do with it?
- 17:21Well, there are a few different things
- 17:22that you can do with it.
- 17:23And what I chose to do was synthesize and further reduce it
- 17:27in what's called a site matrix.
- 17:30And what makes our site matrices for this project
- 17:34a little bit different is that generally matrices tend
- 17:38to be organized by participants and by category,
- 17:43and you compare across participants.
- 17:48The way that I organized these was by time point,
- 17:50you'll see this in just a second,
- 17:52to allow me to get this processual insight
- 17:55into what's happening over time.
- 17:59And matrices give you a very broad overview,
- 18:04at least this particular matrix does,
- 18:08into when factors come into play,
- 18:10which factors are coming into play,
- 18:12and for how long they continue
- 18:15to impact implementation over time.
- 18:19And then on another tab,
- 18:24this is what a matrix looks like.
- 18:27On another tab, that is where the magic happens.
- 18:30That's where you put all your brilliant insights.
- 18:35All the, what I call qualitatively significant factors
- 18:39that impacted implementation.
- 18:43Those go on a separate tab. So here we go.
- 18:45It's in Excel form.
- 18:46<v Participant>So it's just one. Is it one site?</v>
- 18:49<v ->This is just one site, yeah.</v>
- 18:52One site, I organized it by time points across the x-axis.
- 18:57You can see there's been 18, at least for this one,
- 19:01at the time that I took this snapshot,
- 19:0218 different debriefs with facilitators from this site.
- 19:10And here are all those categories
- 19:12that you saw on the template.
- 19:18And then in these fields,
- 19:21I literally, again, just copy and paste
- 19:25the subcategories, the barriers and facilitators,
- 19:28and as much of the excerpts
- 19:31as I think I need or want.
- 19:35And another factor that has to come into play
- 19:38when you're populating matrices
- 19:43is how much data can you manage
- 19:46before you're going to get overwhelmed.
- 19:50I can wrap my head around a relatively large volume
- 19:54of words of qualitative data.
- 19:56So I tend to have matrices
- 19:59that have a lot of words in them.
- 20:02And I think other people not so much.
- 20:05So here we go.
- 20:06Here's an excerpt again, straight from the template.
- 20:15And there's the magical tab.
- 20:18That's actually, I put analysis there,
- 20:20but it's actually the results tab.
- 20:23So everything, what I do is scroll through that matrix,
- 20:26and look over time.
- 20:29I scroll up and I scroll down.
- 20:33And I take a good hard look at what is happening.
- 20:36What are the qualitatively significant
- 20:39barriers and facilitators?
- 20:41So not just what's happening, but what holds things up?
- 20:45That's what I mean by qualitatively significant.
- 20:49There are always all sorts of things going on in the clinic,
- 20:52but what is really holding things up
- 20:55or speeding them along?
- 21:00That's where you put your observations, if you would,
- 21:04of what's happening in the data right on this tab.
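As a hedged sketch of the site matrix just described (not from the talk), the snippet below uses pandas to arrange i-PARIHS-style categories as rows and debrief time points as columns, with a separate "Results" sheet for the qualitatively significant observations; the category names, dates, and cell contents are illustrative only.

```python
# A minimal sketch of a site matrix: categories as rows, debriefs as columns,
# plus a second sheet for observations. All contents are illustrative.
import pandas as pd

# Reduced content pasted from individual templates, keyed by category.
matrix = pd.DataFrame(
    {
        "2021-01 Debrief 1": {
            "Context: leadership": "Champion identified; chief engaged",
            "Innovation: dashboard": "Dashboard training scheduled",
        },
        "2021-02 Debrief 2": {
            "Context: leadership": "Champion reassigned during COVID surge",
            "Innovation: dashboard": "Unable to pull screened veterans",
        },
    }
)

# Observations that would normally live on the separate "results" tab.
results = pd.DataFrame(
    {"Qualitatively significant factor": [
        "Implementation stalled between formative eval and planning meeting",
        "Dashboard problems plus staffing reassignment drove the delay",
    ]}
)

# Write both to one workbook so the matrix and the results tab travel together.
with pd.ExcelWriter("site_matrix.xlsx") as writer:  # requires openpyxl installed
    matrix.to_excel(writer, sheet_name="Matrix")
    results.to_excel(writer, sheet_name="Results", index=False)
```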
- 21:08And then I'm doing even, to make my job even harder,
- 21:12I've decided that I'm going to take data from stakeholders
- 21:16and compare what the facilitators are saying
- 21:19and see what I come up with,
- 21:21because I just, I don't know.
- 21:22I must have a masochistic streak in me or something.
- 21:26I just, nothing is ever good enough.
- 21:28I have to take everything to the next level.
- 21:32So what do you get then? What can you do with this?
- 21:36Well, one thing that you can do is build case studies.
- 21:39You can tell a story about implementation. Okay?
- 21:46I'm not going to go through
- 21:47all three of these case studies.
- 21:50I'll just show you one. We don't have time for all three.
- 21:54But early on in this project,
- 21:56I thought that a good idea would be
- 21:58to characterize implementation at each site.
- 22:01And so I called Case Study One, Rapid Implementation.
- 22:05It launched after only three months.
- 22:08Remember they were on a six month timeline.
- 22:11Case Study Two, I called Delayed,
- 22:14because they completed their formative evaluation on 5/29,
- 22:21but Caring Contacts didn't launch for another six months.
- 22:24So they experienced some delays.
- 22:27And this one I called Interruption, because it was delayed
- 22:31and then later delayed indefinitely at the site.
- 22:36So let's jump into...
- 22:38I'm no longer characterizing them by the way,
- 22:40because I've now discovered how
- 22:42to use these cool data visualizations,
- 22:44which we'll get to in a second.
- 22:47That's so much more informative about the process
- 22:50of implementation than characterizing them in this way.
- 22:55So Interrupted Implementation.
- 22:57What the heck happened at this site in this case study?
- 23:02Well, you know, honestly, this was a site
- 23:04where the facilitators anticipated
- 23:07implementation would be really easy,
- 23:10because they had a lot more facilitators
- 23:12than they had barriers.
- 23:14It was this frontier site, and of course, you expect,
- 23:18you know, these way out in the boonies,
- 23:20very rural sites to run into problems.
- 23:23But they had this incredibly cohesive clinic culture,
- 23:26and they had an influential
- 23:28and really motivated champion on site.
- 23:32And some key players, as in leaders,
- 23:36within the clinic were at the planning meetings.
- 23:39And this was the initial thoughts
- 23:41of the facilitators when we interviewed them.
- 23:44They're used to just sending each other things and tasks.
- 23:47And even though it's spread out, it meaning the clinic,
- 23:50it's in different locations,
- 23:52they really, truly work together.
- 23:54But they are quick, and they are cohesive,
- 23:57and they are really well integrated,
- 23:59I think, given their setting.
- 24:01So they had these incredible facilitators there
- 24:06that everyone thought would really help them.
- 24:09But then unfortunately, they ran into some hitches here.
- 24:13They were having a really hard time
- 24:14using the SPED dashboard, which is how you identify
- 24:17veterans who have been screened for suicidal ideation.
- 24:23They had a really hard time learning how to use it
- 24:26to identify the veterans that they needed
- 24:29to be sending the cards to.
- 24:31And then of course, COVID hit, and people were pulled
- 24:35and reassigned to other places in the clinic,
- 24:37and they needed to be retrained.
- 24:39And at this point, one of the facilitators said,
- 24:41"I'm not sure for how long we will be
- 24:43in the implementation phase,
- 24:45'cause we can't move forward 'til they're able
- 24:48to fix the health factor link to the Columbia."
- 24:51So they really just stalled,
- 24:52despite having every advantage it seemed.
- 24:56They stalled because of this issue with the SPED dashboard,
- 25:00and then of course, COVID.
- 25:03So that's one thing that you can do,
- 25:05is these beautiful case studies that help you tell a story.
- 25:09And we know that our brains, our human brains like stories.
- 25:13So this is a really powerful way to communicate
- 25:17what's happening to the rest of your team
- 25:19and make course corrections.
- 25:24But insofar as really understanding
- 25:27implementation as a process, these data visualizations
- 25:32are really what get you there.
- 25:34It's not these case studies.
- 25:36And what we're visualizing
- 25:38are five key implementation events
- 25:40that are plotted along a line graph.
- 25:42And these are not complicated visualizations.
- 25:48This is what we end up with.
- 25:53Remember those key implementation events
- 25:57are kept in those brief narrative summaries, right?
- 26:00So you just go back to those,
- 26:03look at the dates and plot them.
- 26:07And what we've done is use a color for each phase
- 26:11of implementation and one dot is for each month.
- 26:16And you can much more intuitively grasp
- 26:19for how long implementation was ongoing, right?
- 26:23This site has fewer dots than does this site.
- 26:28What that means is this site took longer
- 26:31to implement and reach sustainment.
- 26:34Not only that, but in using colors, you can see
- 26:38for how long these sites were in each phase.
- 26:43So one dot equals one month.
- 26:45You can see this moved along
- 26:47fairly, fairly quickly to sustainment.
- 26:52And you can see right here, right,
- 26:55this is where they were hung up at this site.
- 26:58This is where the process stalled
- 27:00between this formative eval
- 27:02and the implementation planning meeting.
- 27:05And that's because they ran into these problems
- 27:08with the SPED dashboard
- 27:10and then with staffing difficulties.
- 27:14And then they very quickly,
- 27:15once they got rolling here, they very quickly moved on.
- 27:22So this is what's really allowing us insights
- 27:25into implementation as a process.
- 27:27And I think that I might take this one step up
- 27:31and actually start plotting the barriers
- 27:33and facilitators on here.
- 27:36I think that could be really helpful.
- 27:37The really qualitatively important ones.
- 27:41And then imagine how that's going to enable us
- 27:44to look across sites and see what's happening
- 27:48across these sites and see if we can
- 27:50pick up on any patterns.
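As a hedged sketch of the dot-per-month visualization described above (not from the talk), the snippet below plots one dot per month per site, colored by implementation phase, so a row with fewer dots reached sustainment faster; the phase names, colors, site labels, and month counts are hypothetical.

```python
# A minimal sketch of the timeline visualization: one dot per month,
# colored by implementation phase. All values below are hypothetical.
import matplotlib.pyplot as plt

PHASE_COLORS = {
    "Pre-implementation": "tab:blue",
    "Active implementation": "tab:orange",
    "Sustainment": "tab:green",
}

# Months spent in each phase, per site (illustrative values).
sites = {
    "Site A (rapid)": [("Pre-implementation", 1), ("Active implementation", 2), ("Sustainment", 3)],
    "Site B (interrupted)": [("Pre-implementation", 2), ("Active implementation", 7), ("Sustainment", 2)],
}

fig, ax = plt.subplots(figsize=(8, 2.5))
for row, (site, phases) in enumerate(sites.items()):
    month = 0
    for phase, n_months in phases:
        for _ in range(n_months):
            ax.plot(month, row, "o", color=PHASE_COLORS[phase], markersize=10)
            month += 1  # one dot per month

ax.set_yticks(range(len(sites)))
ax.set_yticklabels(list(sites.keys()))
ax.set_xlabel("Months since site onboarding")
ax.set_title("Implementation phases, one dot per month")
plt.tight_layout()
plt.show()
```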
- 27:56So what are we doing? What are we doing with these data?
- 27:59Of course it's fun just to play with the data, right?
- 28:01I mean, I'm more than happy just to play with data,
- 28:03if that's what you want me to do.
- 28:05But on a project like this, we want to use qualitative data
- 28:09to inform facilitation/implementation.
- 28:15And some of the recommendations that we fed back
- 28:18to the larger team is that they need
- 28:21to initiate contact with the leaders there on site
- 28:24early in the process of implementation.
- 28:27Because in the VA where you have this incredible hierarchy
- 28:31and you have your Caring Contact specialists,
- 28:34who are the ones identifying the veterans,
- 28:38getting the cards printed, signing the cards,
- 28:42and mailing the cards, they are really low down
- 28:45on this hierarchy, and they actually sometimes don't know
- 28:49who to even talk to, to get what they need.
- 28:52And they do not often have direct contact with leaders.
- 28:56So you need to involve leaders very early on
- 28:58in the process, because they're the ones
- 29:00that help you get over these barriers.
- 29:05Stakeholders were saying that they weren't
- 29:08really clear on what the roles were,
- 29:11what was expected of them, how much time it would take
- 29:15to implement and what the costs would be.
- 29:18And so the feedback was to communicate more clearly
- 29:23with stakeholders upfront
- 29:26about these aspects of implementation.
- 29:29Ensure that the site leaders
- 29:31are both influential and engaged.
- 29:35So they have to be at the implementation planning meetings
- 29:38and they have to be talking with the people
- 29:39who are implementing the interventions
- 29:43and empower stakeholders with knowledge.
- 29:45Who do I talk to?
- 29:47Who do I go to if I need to get something printed?
- 29:54Some unexpected insights that speak, really,
- 29:58to implementation in general
- 30:01are, you know, what constitutes a barrier at one site
- 30:06isn't necessarily a barrier at another.
- 30:09And what we sometimes think of as facilitators
- 30:13aren't always facilitators.
- 30:15So leadership involvement can mean many different things.
- 30:18It can mean helping and empowering stakeholders
- 30:21to implement something, or it can mean pressuring them
- 30:24to implement something.
- 30:27And you want leaders involved in the right way.
- 30:29Virtual facilitation, it works for some sites
- 30:31and not for others.
- 30:33Even sites with every advantage experience delays,
- 30:37as I said earlier.
- 30:39Implementation readiness, particularly in a time of COVID,
- 30:43it fluctuates over time.
- 30:45A site that's perfectly ready in one moment
- 30:47might not be a month down the road.
- 30:51I know there's a move in implementation science
- 30:53to try to measure implementation readiness.
- 30:57And this is a real challenge to that movement,
- 31:00because a site that's ready now
- 31:02won't necessarily be ready tomorrow.
- 31:06And what's a barrier or a facilitator,
- 31:10you know, the definition of that changes from site to site.
- 31:13So it's really complicated.
- 31:19And how do we establish rigor in this process?
- 31:24One thing that my colleagues really respect me
- 31:26for is my rigorous approach to qualitative research.
- 31:30And we establish rigor at multiple levels
- 31:33always during the process of analysis.
- 31:36First we, for this study, we independently templated
- 31:39the first three debriefs together
- 31:43and compared our debriefs to make sure our templates,
- 31:47to make sure we were all on the same page
- 31:50and everyone was interpreting the categories the same way.
- 31:55At that point, I felt like we were good to go.
- 31:58Every fourth template still
- 32:01is audited by a secondary analyst.
- 32:06When I'm working in the template or in the matrices,
- 32:10I can see if a template has been consistently,
- 32:15if content has been consistently organized
- 32:19in the template compared
- 32:20to what they were in the past, right?
- 32:22Because I have prior content
- 32:28from prior templates in that matrix.
- 32:31And so if I see a discrepancy in how people are
- 32:35defining those categories, I kick it back to them,
- 32:39and I explain to them,
- 32:40I think this is how we're interpreting this.
- 32:43What do you think?
- 32:45And then I use the Word's comments feature
- 32:48to initiate a dialogue with those primary analysts,
- 32:52not only to ensure that we're being consistent
- 32:55in templating, but also to verify my own insights
- 33:00into the data.
- 33:01Sometimes those come just looking at the templates
- 33:04rather than looking at the matrices.
- 33:06And I'll show you what that looks like in a second.
- 33:10This is a template after I've looked at it (laughing)
- 33:17and before I've moved it into a matrix,
- 33:20and they aren't always quite this hacked into,
- 33:24but you can see where I have
- 33:26further reduced the data on here.
- 33:28I don't need people's names. That's a lot of noise.
- 33:33I think I picked this up and moved it somewhere else.
- 33:35You can see I've done quite a bit in here.
- 33:38And for every edit, for every question,
- 33:41I leave a comment here for the primary analyst
- 33:45who was also the one who led the debrief,
- 33:47just to make sure that I'm understanding
- 33:49things correctly, right?
- 33:50Because I am now three people removed
- 33:52from the sources of data.
- 33:54And so this dialogue is so important,
- 33:58this constant dialogue with my team
- 34:01to help keep me on track.
- 34:03Not just them, but keep me on track
- 34:07that my insights are really valid.
- 34:09And of course, I occasionally kick things back
- 34:11to the larger team and the facilitators to verify with them.
- 34:20You know, I wanna be really forthright
- 34:22about these rapid methods.
- 34:24I know there's a lot of curiosity about them,
- 34:26and there's advantages and challenges to using them.
- 34:31With templates, like I said, they provide this beautiful,
- 34:35coherent story of a single data collection episode.
- 34:39It isn't like coding where you sort of fragment your data
- 34:42and then you have all these codes floating around.
- 34:47It's right there in a Word document.
- 34:50You can use them if you're using, you know,
- 34:53like, a framework from implementation science
- 34:56to structure your template.
- 34:57It creates this transdisciplinary language
- 35:00that an anthropologist can use
- 35:02to talk with other scientists.
- 35:05And your results are translated much more efficiently.
- 35:09They're translated by your template.
- 35:11If you're skilled at template analysis,
- 35:14your template will translate your data for you.
- 35:18The challenge really with individual templates is
- 35:21that they look very simple.
- 35:23People look at a completed template
- 35:26and they think, "Oh, I can do that.
- 35:28That looks easy."
- 35:29But it takes a remarkable amount of work
- 35:32to get it to that point, right?
- 35:35There's reducing the data.
- 35:36What do I cut out and what don't I cut out? That is a skill.
- 35:42There's what I call sinking in and reading deeper.
- 35:44You can't just stay at the surface.
- 35:47Sometimes you have to unpack what people are saying,
- 35:50and you still have to do that with template analysis.
- 35:53You can't get in copy and paste mode and just go.
- 35:56You have to continually sink in and dig deeper.
- 36:00And you know, the same challenge
- 36:01that you get with coding, right?
- 36:03Controlling the codes.
- 36:06You can get to the point where you have,
- 36:07if you let yourself run away with it,
- 36:10far too many subcategories
- 36:13to be really helpful to you at all.
- 36:16So you have to really control the proliferation
- 36:18of categories and subcategories.
- 36:21And of course, maintaining consistency
- 36:23across templates is always a challenge.
- 36:28Matrices, of course, they do reduce your data,
- 36:30and that's helpful when you're dealing
- 36:32with such a large data set.
- 36:34And it helps to ensure that consistency across templates.
- 36:39And you know, as you've seen it permits
- 36:41longitudinal comparisons and assists
- 36:44in the development of these case summaries.
- 36:49But those matrices have a lot on them.
- 36:53And you have to find your way through
- 36:55all those words, words, words, right?
- 36:57I mean, it's still just a lot to deal with.
- 36:59And the matrices don't analyze the data for you.
- 37:03You don't stick them in the matrix
- 37:05and suddenly the magic happens.
- 37:07Your brain does that. It's still qualitative research.
- 37:11You have to make sense out
- 37:12of what the matrix is telling you.
- 37:15So that's challenging.
- 37:19And then finally, the overall approach,
- 37:21I mean, it really is allowing us to see implementation
- 37:24as this dynamic process that shifts over time,
- 37:27and sometimes it moves backwards actually,
- 37:31so that we can use those findings to, you know,
- 37:34inform these course corrections, if you would.
- 37:37It's allowing us to share across sites
- 37:39without really delaying our project.
- 37:43We've experienced no delays whatsoever
- 37:46using Word documents instead of computer software.
- 37:51And it's very rigorous when you put
- 37:53all these techniques together.
- 37:56You know, the challenge is, for me,
- 37:58being three levels removed from the sources of data,
- 38:01that's very challenging.
- 38:03I have to continually kick things back to my team
- 38:07and to not only team qual, but the larger team
- 38:10to ensure that my insights are valid.
- 38:14And it really requires consistent engagement.
- 38:17This is not a collection of techniques
- 38:25that will allow a qualitative team lead
- 38:28to just sit back and show up at weekly meetings
- 38:32and say, "So what's going on?"
- 38:36You have to consistently engage with your data,
- 38:39you have to keep your team motivated,
- 38:41and project management is absolutely vital.
- 38:45Everyone has to be on top of things,
- 38:47because we all build, right, one after another.
- 38:51So if somebody drops the ball, somebody else has to wait
- 38:54for them to pick it back up again.
- 38:58<v Participant>Stop and see if anyone has questions?</v>
- 39:00<v ->I'm done.</v> <v Participant>Awesome.</v>
- 39:04Yeah...
- 39:11That was all right.
- 39:13(participant speaking indistinctly)
- 39:16<v Participant>Ashley, are you available?</v>
- 39:18<v ->Yep, I'm here.</v>
- 39:18Yeah, so if anyone has any questions, feel free.
- 39:21We have just five minutes, but yeah,
- 39:24we'd love to have you ask some questions
- 39:27to Traci if you have them.
- 39:30<v ->I have a question.</v> <v ->I...</v>
- 39:32Oh, go ahead.
- 39:32<v ->This is not (indistinct).</v>
- 39:34You mentioned, Traci, that time wasn't the primary reason
- 39:38for adopting rapid qualitative analysis.
- 39:42But for me, I've thought, like, especially
- 39:45in implementation science, where you need this information
- 39:49at the stage and the formative stage
- 39:51in terms of refining the intervention,
- 39:53and you might need it for course correction,
- 39:56that the rapid aspect
- 39:58of the qualitative analysis is very important.
- 40:01And I've worked in the past with other qualitative analysts
- 40:05in studies where when they use the traditional methods,
- 40:08they're kind of on their own timeframe
- 40:11and may take, like, three or more years
- 40:14to fully process the data and write something up.
- 40:17And by that time, the actual study might even be over.
- 40:23<v ->Absolutely, well, especially if you take, you know,</v>
- 40:26what I call slow-mo longstanding sort of approaches to it.
- 40:32For me, what I really like
- 40:37about these methods is the translation,
- 40:40the ease with which you move from analysis to translating
- 40:45your findings into something that's meaningful.
- 40:48And that's always why I keep going back to rapid methods
- 40:53and because I find them more challenging, honestly.
- 40:57And I really like that challenge.
- 41:05<v Participant>Did you wanna ask your question?</v>
- 41:09<v Fauzia>May I just say something?</v>
- 41:11I want to add a little bit more
- 41:15to this response.
- 41:16And thank you very much for this wonderful presentation.
- 41:21I mean, I'm doing this on a lot of projects,
- 41:24and sometimes what, you know,
- 41:26what you said was very striking that everybody thinks,
- 41:29"Oh, this is simple, I can do it."
- 41:31But what goes into it is, you know,
- 41:35a lot, and that rigor essentially
- 41:39is what makes those insights useful
- 41:42for the implementation process.
- 41:43So what I wanted to add to the use of rapid analysis
- 41:47and Donna's question that it is time bound,
- 41:51I think yes, it's extremely useful,
- 41:55and that's why, like you said,
- 41:56you keep going back to these methods.
- 41:59What I have experienced in my work
- 42:03is that sometimes it's the data size as well.
- 42:07It might not be urgent, but the data size
- 42:10can also dictate whether you want to use the rapid methods
- 42:13or if you want to go in much more detail.
- 42:16So just a small thing.
- 42:19But I think that also is one of the reasons.
- 42:22By the way, my name is Fauzia Malik
- 42:24and I'm a medical anthropologist part of-
- 42:26<v ->Oh, hi. (laughing)</v>
- 42:28<v Fauzia>(laughing) Part of Yale School of Public Health,</v>
- 42:33Health Policy and Management Department.
- 42:35And I absolutely loved the way you presented
- 42:40the use of data and very, very important points
- 42:45that you all brought together to make sense of this,
- 42:48you know, application of rapid analysis
- 42:51in implementation science.
- 42:53Thank you so much for that. <v ->Oh, thank you.</v>
- 42:55<v Fauzia>And thank you, Donna and Ashley,</v>
- 42:57for organizing this.
- 42:58This was a pleasure to hear.
- 43:02<v ->Great. Hopefully this will be the first of many.</v>
- 43:05Ashley, there's a question in the chat if there's time,
- 43:08but I think there might have been
- 43:09some other people speaking up as well.
- 43:15<v Ashley>If you wanna take some, we have one minute,</v>
- 43:18if you wanna take a typed question or-
- 43:20<v ->We can't even hear you, Ashley.</v>
- 43:24<v ->So for the debriefs, they are really, you know,</v>
- 43:29as the name suggests, they're very brief.
- 43:31They're 30 minutes long and they're bi-weekly.
- 43:35If you're talking about the time it takes
- 43:37for a team to do this, and this is something
- 43:40that people ask quite often, it's a heavy lift.
- 43:43We have a team of five of us.
- 43:48I am 30%, the other team members are
- 43:5330, 50 and 60%.
- 43:55It is a lot of time
- 43:59to collect and analyze all these data.
- 44:01It's not just from the debriefs, right?
- 44:03It's also the interviews with the stakeholders.
- 44:06And we have interviews with veterans coming up next week.
- 44:10So it's pricey, that's for sure.
- 44:13Especially if you get an expensive
- 44:16investigator-level anthropologist on your team. (laughing)
- 44:22<v ->So everyone, thank you so much.</v>
- 44:25It was lovely to have so many people log in,
- 44:27and we really look forward to, you know,
- 44:31more sessions in the future.
- 44:32And so I hope everyone will sort of join me
- 44:34in thanking Dr. Abraham for her talk.
- 44:37Yeah, and take care.
- 44:42<v ->Bye. (voices overlapping)</v>
- 44:43<v Participant>Thank you so much.</v>
- 44:54(participants chattering indistinctly)