CTSI Annual Pilot Awards to Improve the Conduct of Research

An Open Proposal Opportunity

Action Research Program


Rationale

The Program in Implementation Science has created a series of courses within the Training in Clinical Research Program that are designed to meet the didactic training needs of fellows and junior faculty, but this curriculum lacks an experiential component.  To be effective, implementation research must accommodate the unique culture and context that define specific health care settings and communities.  An "Action Research" Program, in which researchers partner with health care providers to improve the providers' own practices and, in turn, their working environment, can help meet this need.  Action research's strength lies in its focus on generating solutions to practical problems and in its ability to empower health care providers by engaging them in research and the subsequent development or implementation activities.  We propose an action research program with at least 3 broad goals:

 

1. To design innovative strategies to improve health care delivery or public/community health in real time in San Francisco by capitalizing on the experience and skills of an interdisciplinary team of UCSF implementation scientists. The goal is to improve health and care delivery in targeted settings using strategies that are designed to be patient-centered and to reduce total health care costs.

 

2. To provide a hands-on training and implementation experience for students, residents and fellows by involving them in a real-world project through all stages of development. The goal is to attract and encourage students to pursue careers committed to improving health care delivery, system redesign and community health programs.

 

3. To design an action research program that can become self-sustaining through reimbursements from stakeholder organizations and delivery systems that directly benefit financially from the impact of the action research program.

 

Plan

We request pilot funds for 1 year to develop and implement this program at UCSF.  It will consist broadly of 8 steps, culminating in the initial launch of a delivery system intervention.

Step 1. Select a Partner with a "Hot Spot". Hot Spots are problem areas in clinical operations, quality or health outcomes that are identified by stakeholders (such as payors, administrators, providers or patients) as priority areas for intervention.  In our first cycle, we will focus on UCSF Medical Center ambulatory practices.  To identify potential partners, we will invite practice chiefs and administrative directors of these practices to submit brief, 1-page descriptions of Hot Spots in their practice that they would like the action research program team to help them intervene on.  Key criteria will include the feasibility of measuring and intervening on the Hot Spot; engagement of clinic providers/staff; learner access to relevant data, staff, patients and providers; and the degree to which addressing the Hot Spot will help improve quality, reduce health care costs and enhance the patient experience.

Step 2.  Assemble Action Team.  We will advertise volunteer/training opportunities to students and residents in medicine, pharmacy, nursing and dentistry.  A commitment of 2-4 hours per week for 4-6 months will be required, and participation will be limited to 6-8 students.  We will also identify key content/strategy experts from UCSF faculty and partners.

Step 3. Characterize Hot Spot with Existing Data Sources.  We will use administrative and/or medical record data to examine the frequency, distribution, variability and predictors of the key process or outcome that represents the Hot Spot.
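As an illustration of the kind of analysis this step envisions, below is a minimal sketch in Python/pandas. The file encounters.csv and its columns (encounter_date, clinic, patient_age, hot_spot_event) are hypothetical placeholders, not an actual UCSF data schema.

```python
# A minimal sketch of the Step 3 characterization, assuming a hypothetical
# encounter-level extract with one row per visit and a 0/1 flag marking
# whether the Hot Spot process/outcome occurred. All names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("encounters.csv", parse_dates=["encounter_date"])

# Frequency and distribution: monthly event rate across all encounters.
monthly = (df.groupby(df["encounter_date"].dt.to_period("M"))["hot_spot_event"]
             .agg(rate="mean", n="size"))
print(monthly)

# Variability: how much the rate differs across clinics.
print(df.groupby("clinic")["hot_spot_event"].agg(["mean", "size"]))

# Predictors: a simple logistic model of the event on available covariates.
model = smf.logit("hot_spot_event ~ patient_age + C(clinic)", data=df).fit()
print(model.summary())
```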

Step 4.  Conduct Literature Review of Hot Spot.  This will be performed by a combination of faculty and students.  Medical students will apply some of the principles taught in their Epidemiology and Evidence-Based Medicine (EEBM) classes.

Step 5. Convene Launch Meeting and Design Workshop.  This will be a 1- or 2-day retreat in which the partner clinic (and its staff) are brought together with the faculty and students on the Action Team.  The goal of the meeting will be to create a timetable with specific activities and benchmarks for completing the project in a 4-6 month time frame.

Step 6. Conduct & Analyze Formative Research.  Interviews and observations will be performed to gain a greater understanding of the patient-, provider-, staff- and system-level factors that contribute to the Hot Spot under investigation.  With close guidance from faculty, students will be charged with collecting and analyzing these data.
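One possible way to organize the coded formative data is sketched below; the file coded_segments.csv and its columns (respondent_type, code, excerpt) are hypothetical, standing in for whatever coding scheme the team adopts.

```python
# A minimal sketch for tallying coded interview/observation excerpts,
# assuming each row is a segment a student coder has already tagged.
import pandas as pd

segments = pd.read_csv("coded_segments.csv")

# How often each barrier/facilitator code appears per respondent type:
# a quick view of which factors cluster at the patient, provider, staff
# or system level.
print(pd.crosstab(segments["code"], segments["respondent_type"]))

# Pull the excerpts behind the most frequent code for team review.
top_code = segments["code"].value_counts().idxmax()
for text in segments.loc[segments["code"] == top_code, "excerpt"].head(5):
    print("-", text)
```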

Step 7. Create Alpha-Version of Intervention Approach.  Combine data inputs from the literature review, the quantitative analysis of administrative/EMR data, and the formative research findings to design an intervention approach. 

Step 8. Launch First Iteration (Beta-Version) of Intervention; Collect Process Data

 

Criteria and metrics for success

  1. Submission of Hot Spot proposals from multiple clinics, demonstrating interest in and need for the service.
  2. Requests to participate from multiple students/residents, demonstrating the program's appeal to trainees.
  3. Implementation of an intervention that has a significant impact (>10-20% change from baseline) on the Hot Spot measure; a sketch of this calculation follows the list.
  4. An explicit plan for continued monitoring and refinement of the intervention by the participating clinic.
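The sketch below shows how the change-from-baseline in criterion 3 would be computed; the monthly values are invented for illustration.

```python
# Percent change from baseline on a hypothetical Hot Spot measure.
baseline = [0.42, 0.45, 0.44, 0.43]   # pre-intervention monthly rates
post = [0.38, 0.35, 0.33, 0.34]       # post-launch monthly rates

base_mean = sum(baseline) / len(baseline)   # 0.435
post_mean = sum(post) / len(post)           # 0.350
pct_change = 100 * (post_mean - base_mean) / base_mean

# Prints roughly -19.5%, i.e. within the >10-20% threshold for success.
print(f"baseline {base_mean:.3f} -> post {post_mean:.3f} ({pct_change:+.1f}%)")
```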

 

Total Budget: $70,904

Salary support for principal faculty and a part-time research assistant.

 

Collaborators: Ralph Gonzales (Medicine); Margaret Handley (Epidemiology & Biostatistics); Sara Ackerman (Medicine); Joshua Adler (Chief Medical Officer; UCSF Medical Center)

Comments

Excellent idea. I suggest emphasizing even further the multidisciplinary (or at least multi-training-level) team aspect of the program. Also, you might consider bringing an expert on team-building/group dynamics to the retreat to give the trainees more insight into this area and leave them even better prepared to implement the project.

Thx. We agree, and plan for this to be a critical activity during the retreat, highlighting multiple dimensions of "teams" as you note: different learners, different disciplines.

Re: Select a Partner with a "Hot Spot". Two weeks ago I gave presentations in two different workshops in the Netherlands, and health delivery hot spots were a topic in one. An ongoing project that has achieved some success instantiated alternative agent-based models of the situation and its unique features, including agents representing providers and patients. Validated simulations revealed emergent phenomena consistent with "hot spot" issues. Have you considered such an approach?

This sounds very interesting. Do you have any materials or references you can share?

I'll send a reply by e-mail, Ralph

I would also be interested in these articles. As we have built an agent-based mobile framework, I am curious to see how other health professionals are using agent-based simulations. I would think that for a proposal like this one, the inclusion of industrial and systems engineering professionals (similar to USC's approach) would round out the team.
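Purely as an illustration of the agent-based idea discussed in this thread (and not a reconstruction of either project mentioned above), here is a toy Python simulation in which provider agents with slightly different capacities face similar demand; a persistent backlog, a simple stand-in for a "hot spot", emerges only where demand outpaces capacity.

```python
# Toy agent-based sketch: all parameters and dynamics are invented.
import random

random.seed(1)

class Provider:
    def __init__(self, visits_per_day):
        self.capacity = visits_per_day
        self.queue = 0  # patients waiting for an appointment

    def step(self, arrivals):
        # Add today's appointment requests, then work off capacity.
        self.queue = max(0, self.queue + arrivals - self.capacity)

# Two clinics with identical demand but different effective capacity.
clinics = {"clinic_A": Provider(visits_per_day=21),
           "clinic_B": Provider(visits_per_day=19)}

for day in range(90):
    for clinic in clinics.values():
        clinic.step(arrivals=random.randint(15, 25))  # ~20 requests/day

# A backlog "emerges" only where demand outpaces capacity, the kind of
# pattern a validated simulation could surface as a hot spot.
for name, clinic in clinics.items():
    print(name, "backlog after 90 days:", clinic.queue)
```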

1) Is there any way to extend your timeline? 4-6 months leaves little room for the inevitable delays and changes in plans, or for collecting enough data to interpret on a run chart or by other means, and it will also limit the data collection needed to address the ROI.

2) It would be ideal to really encourage small tests of change early and fast: changes that can be tested in one afternoon with 5 patients on one day, for example, given that the program is only 6 months or so long. In our 12-month program, we found that even with a significant amount of prep (writing aim statements, clarifying measures, insisting on reporting dates for quarterly data submission, and incentives of up to $1000 for timely data submission at the end of a year), the teams had substantial delays in getting started for a multitude of reasons. With that, they often really only had 8 months or sometimes less (due to unanticipated environmental changes such as EHR rollout delays, problems with the quality of demographic or other data, etc.). So, guiding and encouraging them to start testing ASAP would substantially improve the chances of getting some substantive PDSAs rolling (and therefore enabling data collection) within the first month or two. This will take a great deal of practice coaching, which you have planned for. For us, it took far more time than we anticipated, but your experience may be different.

3) The challenge of what is QI (rapid-cycle improvement) versus an intervention. I have a QI perspective and experience and am therefore less well-versed in the IDS language, so my lack of knowledge may be showing here. We found that our practices did not do a single intervention to improve care; rather, they did a multitude of things such as in-reach, out-reach, motivational interviewing, staff training, etc. Many of these occurred simultaneously to make sure that processes worked to improve the measures. Even our division's CRC improvement effort required a multitude of steps. Do you see the "package of steps" as the intervention?

4) The practices may also find it helpful to include balancing measures: measures that may inadvertently get worse as improvements are implemented, for example patient satisfaction, wait times, staff satisfaction, etc.

5) We found it nearly impossible to do literature reviews for all the variations that individuals brought up. The most useful approach for us was to have a toolbox of QI interventions (such as reminders) and, in our case, equity tools (REAL data collection strategies, a 1-question screen for health literacy, etc.) to recommend, and then to focus on adapting these to the setting at hand. Also, even with our 20 sites doing largely preventive and chronic disease care, there were so many variations on a theme that we essentially had to do lit searches in real time and rarely had the leisure of being comprehensive. That said, I agree this would be ideal, and perhaps your staffing plans will enable it in a way that we could not.

6) For the in-person convening, you may find it most helpful to give the teams some background as you've described, but to primarily focus on very pragmatic issues such as confirming measures, planning their first PDSA cycle, and working on team roles (who will do what, how they will work together, how to communicate effectively, etc.). It may even be ideal to do some of the 1:1 practice coaching here to get things jump-started.

Thanks so much, Sunita! I've responded to each comment in numerical order below.

1. Our 4-6 month timeline is to implement the first small test of change. We hope to have established a monitoring/measurement system that will allow the clinical setting to continue to monitor and adapt. But I think you are right that we should continue to play a more active role, and perhaps engage students to participate for a longer period of time.

2. We totally agree, and this will be a key part of our selection process: making sure we pick projects that are amenable to a quick start-up.

3. I see QI and IDS as overlapping a lot. One of the key distinctions, I believe, is that the IDS perspective places a greater emphasis on using formative evaluation and observational research in the intervention design, although adapting the intervention along the way is completely appropriate. Another benefit of an IDS approach is in conducting a more comprehensive, theory-driven process evaluation as a strategy for continuous quality improvement (rather than typical QI PDSA cycles that rely primarily on one or two simple metrics). If we do this right, we hope to have a standardized approach that could be replicated by others.

4. Yes, this is a key part of our evaluation.

5. Funny you mention "toolbox" because we will also begin with a "translational toolbox" that we've developed through our courses. We hope to complete the literature review during the summer, before the formal launch.

6. Thanks... very pragmatic. Maybe we can get some help from the Center for Health Professions? :)
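For what it's worth, here is a minimal sketch of the kind of run chart such a monitoring/measurement system could feed back to the clinic; the monthly values are invented for illustration.

```python
# Run chart sketch with invented monthly values of a Hot Spot measure.
from statistics import median
import matplotlib.pyplot as plt

months = list(range(1, 13))
rates = [0.44, 0.45, 0.43, 0.46, 0.44,               # baseline months 1-5
         0.41, 0.38, 0.36, 0.35, 0.34, 0.35, 0.33]   # after first PDSA cycles

plt.plot(months, rates, marker="o", label="Hot Spot measure")
plt.axhline(median(rates), linestyle="--", label="median")
plt.axvline(5.5, color="gray", label="intervention launch")
plt.xlabel("Month")
plt.ylabel("Event rate")
plt.title("Run chart: Hot Spot measure over time")
plt.legend()
plt.show()
```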

Terrific idea, and one that could lead both to improvement in the care delivery system and to growth in the cohort of health professionals trained in implementation science. I would recommend more explicit education of the Hot Spot owners in microsystems work as one of the deliverables of the project, so that the Hot Spot has the tools to continue to improve its environment on behalf of patients even after the initial intervention is done. When a system understands the power of data-driven improvement, it begins to see an entire set of opportunities to enhance its delivery system. This type of strategy could be facilitated by leaving teams of students in the microsystem for a longer time period.

Thank you, Catherine. We will work these suggestions into our program plan. In addition, perhaps we can talk offline about ways to build a stronger collaboration with the medical school, perhaps through Pathways or the Academy? I think leaving teams of students in place for longer periods could also help address Sunita's concern that more time will likely be needed to get a full run chart going for feedback.
