Proposal Overview:
Assessment of graduate medical education (GME) trainees is a high priority for training programs. Program directors must be able to certify that graduates can practice independently in their specialty or subspecialty. Most Accreditation Council for Graduate Medical Education (ACGME) programs, including the UCSF Department of Medicine (DOM) ACGME programs, use an assessment system designed to report milestone assessments to the ACGME and to the organization responsible for certifying individuals (such as the American Board of Internal Medicine). Programs assign milestone assessments to faculty or other supervising personnel, who complete and return them to the program for review by Clinical Competency Committees (CCCs). Two problems have made it challenging for DOM training programs to collect, analyze and report the clinical competence of trainees: the limited relevance of the milestones to the complexity of physicianship, and weak trainee and faculty engagement with the assessment process. We seek to create a novel, learner-driven GME assessment system, using Entrustable Professional Activities (EPAs) in a newly developed information system, that better assesses trainee performance while empowering trainees to actively manage their own assessment process.
Specific Aims:
1- Design and implement an information system for all Department of Medicine educational programs that will collect and display trainee assessment data with customized views to allow trainees, advisors and competency committees to effectively assess trainee performance.
2- Change the culture of trainee assessment from program-driven to trainee-driven by enabling learners to proactively solicit faculty assessment in areas of need, monitor their progress toward assessment goals, and control the release of designated assessable elements to the clinical competency committee under the guidance of their faculty advisor.
Background and Significance:
Various frameworks have evolved over the years to guide how the competence of trainees is assessed and reported. The transition to a philosophy of assessing trainee outcomes began in 1999, when the ACGME introduced the six core competencies (patient care, medical knowledge, systems-based practice, practice-based learning and improvement, interpersonal and communication skills, and professionalism). The introduction of the Next Accreditation System (NAS) extended these concepts by adding milestones as subsections of the core competencies. The internal medicine community began with 142 educational milestones, which were subsequently reduced to 22 sub-competencies embedded in the original six ACGME competency domains. (2)
In recognition of the limitations of the milestone-based assessment process, an alternative approach to assessment uses EPAs as a framework for assessing competence. EPAs reflect the degree of mastery of the professional tasks that together constitute the work of a profession. (3) Successfully performing an EPA requires abilities across multiple competencies. Using EPAs for assessment recognizes that professional tasks are complex and provides a holistic view of the acquisition of professional skills.
Current UCSF DOM Assessment Strategy:
In response to this rapidly changing landscape, our current assessment strategy uses both milestone- and EPA-based systems to assess our trainees. Our primary modality of assessment links specific milestones to trainee rotations; faculty summatively assess these milestones using the E*value platform. Programs may deploy different tools (global assessments, procedure observations, 360° evaluations, chart-stimulated recall, qualitative comments) to assess their program-specific milestones, but most depend on observational methods to record the initial milestone assessment. CCCs then meet to determine the summative milestone score that is reported to the ACGME and the ABIM.
Currently, a few EPAs are in use in certain programs (Internal Medicine, Geriatrics, and Hematology/Oncology) and are recorded in a Qualtrics survey platform, but those data are hard to collate in a way the CCC can use. CCCs therefore remain largely dependent on the milestone scores obtained from direct faculty observations, a process that has many limitations.
Analysis of DOM Assessment Strategy:
Program CCCs report problems with the milestone assessment processes that have been in effect since AY2013-14 for the internal medicine residency program and AY2014-15 for the fellowship programs. First, compliance with completing the assessments is poor: across the DOM training programs, initial completion of assigned evaluations is only 50%. Considerable administrative time is spent pushing completion to the required 70% level, and program leadership often has to step in to improve faculty compliance. The reasons faculty cite for their difficulty completing assessments are "assessment fatigue" caused by multiple end-of-month evaluations coming due simultaneously, the need to assess as many as 22 milestones, and overall frustration with the E*value platform. Faculty also state that they have learned to ignore or delete automatic email assessment reminders from E*value, further complicating compliance with our evaluation process. In contrast, faculty report that direct requests from trainees for feedback are harder to ignore because of the personal connection of the request.
Second, the quality of the assessments is also in question. CCCs now often base their judgments on too few numeric scores within each milestone category. This has led our CCC to question whether we can fully assess trainees' performance. Similarly, our CCC is unable to effectively identify areas of concern for struggling trainees, preventing appropriate remediation efforts.
To promote EPA-based assessment, the internal medicine residency began exploring an EPA-based assessment system in 2013. In 2013-14 we used a Delphi method to select and implement a group of 8 EPAs to make our assessments more activity-specific. (Table 1) We also identified an additional twenty-two EPAs, each with at least 80% participant agreement, that can guide future expansion of this set. (6)
Table 1. Current EPAs housed on Qualtrics Platform
Acute Care EPA
Clinic Post-Discharge EPA
Code Leadership EPA
Discharge Summary EPA
Four Habits Clinic Survey
Multidisciplinary Rounds Inpatient Feedback
Rounds Observation EPA
Serious Illness Communication
Although these and future EPAs offer a robust menu of assessment parameters, we have been unable to implement them in a meaningful way due to several limitations of our assessment technology and our inability to create a trainee-centered interactive assessment platform. Table 2 outlines our implementation efforts thus far.
Table 2. Timeline of EPA Implementation efforts
Platform Used | System Advantages | Problems with Implementation |
Mahara Portfolio System | Trainees log their own assessment data. Captures a wide range of data. Trainees release data for feedback. | Unwieldy data entry. Faculty unable to track trainee progress. |
MyFolio System | Potential for a single data platform. Easy expansion to new EPAs. | Poor communication between the E*value and MyFolio platforms. |
Qualtrics | HIPAA compliant. Easy to use, edit, and summarize data. Free for UCSF use. | Difficult data management. Unable to summarize results for the CCC. No interconnectivity with the current assessment platform. Not quickly usable on a handheld device. |
Vision for Future Assessment Process:
While documenting the concerns of our faculty and CCC described above, we have also been engaging our trainees to shape our future assessment process. Over the last three years we have conducted eight focus groups with a total of approximately 100 residents and fellows. Several themes emerged from these focus groups that match concerns voiced by resident trainees across the country when supervision or feedback is inadequate. (4-5) These themes are summarized in Table 3.
Table 3. Trainee themes from assessment focus groups.
Theme #1 | Trainee assessment that depends on summative, end-of-rotation feedback (i.e., milestone-based assessment) lacks the specificity trainees need to improve their professional practice. Trainees would prefer more timely, qualitative, in-person, practice-based feedback to help them grow professionally. |
Theme #2 | Current assessment tools produce scores and comments that are difficult for residents to understand or access, making it hard for them to follow their progress in real time. |
Theme #3 | With the advent of the ACGME-mandated Clinical Competency Committees (CCCs), trainees feel they lack transparency into, and control over, their own assessments, leading to mistrust of the assessment system. |
Theme #4 | The current culture in our residency and fellowship programs does not facilitate clear, continuous feedback from faculty to trainees. Trainees feel this acutely and often feel criticized when receiving anything less than superlative feedback from faculty. |
As an initial step toward addressing these themes, we began a two-month pilot project in which trainees actively solicit EPA activity-based feedback from faculty or other professional staff. These evaluations are entirely resident-driven and are collected in Qualtrics. This proof-of-concept pilot lets us examine EPA completion rates, residents' willingness to participate in the process, and their satisfaction with the feedback they receive. The pilot is ongoing and has been well received by trainees and evaluators alike.
Although this pilot is an exciting advance, it fulfills only our goal of facilitating timely trainee feedback. It does not address our goals of creating an accessible trainee information platform where trainees can drive their own assessments and highlight their best work for the CCC, and it takes only a small incremental step toward making our assessment and feedback culture more open and meaningful.
Proposal Goals:
We plan to create an assessment system that addresses our CCC's concern about obtaining adequate assessment data for trainee promotion, trainees' concerns about the transparency, timeliness and quality of their assessments, and faculty frustration with "evaluation burn-out". We will accomplish this by expanding EPAs and creating a user-friendly trainee landing page that consolidates assessment data.
Goal 1- Create a trainee "landing page" that enables residents to review their summative and activity-driven (EPA) assessment data with their advisors and to better understand those data (Figure 1; a hypothetical data-model sketch follows the figure).
Figure 1- Mock trainee landing page.
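To make Goal 1 concrete, the sketch below shows one hypothetical data model for the landing page, consolidating summative milestone scores (such as those exported from E*value) with activity-driven EPA evaluations (such as those collected in Qualtrics) into a single per-trainee view. All class and field names here are illustrative assumptions, not a committed design.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical records; the real field names would come from the E*value
# and Qualtrics exports, which have not yet been finalized.
@dataclass
class MilestoneScore:
    subcompetency: str   # e.g., "PC-1"
    score: float         # summative score reviewed by the CCC
    rotation: str

@dataclass
class EpaEvaluation:
    epa_name: str        # e.g., "Code Leadership EPA"
    evaluator: str       # faculty or other professional staff member
    entrustable: bool    # evaluator judged the performance entrustable
    comments: str

@dataclass
class TraineeDashboard:
    trainee_id: str
    milestones: List[MilestoneScore] = field(default_factory=list)
    epa_evals: List[EpaEvaluation] = field(default_factory=list)

    def epa_progress(self) -> Dict[str, Tuple[int, int]]:
        """Summarize (entrustable, total) evaluation counts per EPA
        for display on the landing page."""
        progress: Dict[str, Tuple[int, int]] = {}
        for ev in self.epa_evals:
            done, total = progress.get(ev.epa_name, (0, 0))
            progress[ev.epa_name] = (done + int(ev.entrustable), total + 1)
        return progress
```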
Goal 2- Give trainees more control over the assessment data presented to the CCC for entrustment. Figure 2 shows the assessment modalities immediately available to the CCC. Assessment data in red are available only to the trainee and their advisor; through joint discussion, they can release the trainee's best work for "entrustment" by the CCC. For each EPA, trainees must release three "entrustable" evaluations completed by three separate faculty members for review by the CCC (see the sketch after Figure 2).
Figure 2- Types of data available to trainees, advisors and CCC
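The Goal 2 release rule is concrete enough to sketch: an EPA becomes eligible for CCC review only once the trainee has released three entrustable evaluations from three separate faculty members. A minimal check, reusing the hypothetical EpaEvaluation records from the Goal 1 sketch:

```python
def ready_for_ccc_release(evals: List[EpaEvaluation]) -> bool:
    """True when at least three entrustable evaluations for this EPA
    come from three distinct faculty members (the Goal 2 release rule)."""
    entrusting_faculty = {e.evaluator for e in evals if e.entrustable}
    return len(entrusting_faculty) >= 3
```

The advisor-guided release itself would remain a human decision made by the trainee and advisor together; a check like this would only gate whether the released set satisfies the three-evaluator threshold.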
Goal 3- Improve the timeliness, quality and specificity of assessment information for trainees, while reducing the evaluation burden on faculty, by creating streamlined EPA assessment forms and a handheld app that facilitates immediate completion. Editing of the assessment forms will be driven by ongoing feedback from trainees and faculty in the current pilot project.
Goal 4- Use the data collected in the new information system to make CCC deliberations more transparent to trainees. With improved assessment data, the CCC will generate automatic entrustment emails for EPAs and procedures when trainees are signed off (one possible notification pathway is sketched below). Similarly, we will be able to mount earlier and more focused development efforts for residents who are not progressing appropriately on their milestones.
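As one way Goal 4's automatic notifications might work, the sketch below emails a trainee when the CCC records an entrustment decision. It uses Python's standard smtplib and email libraries; the SMTP host, sender address, and message wording are placeholders, since the actual notification pathway (for example, through campus email infrastructure) has not been chosen.

```python
import smtplib
from email.message import EmailMessage

def send_entrustment_notice(trainee_email: str, epa_name: str) -> None:
    """Notify a trainee that the CCC has signed off on an EPA.
    The SMTP host and sender address below are hypothetical placeholders."""
    msg = EmailMessage()
    msg["Subject"] = f"CCC entrustment decision: {epa_name}"
    msg["From"] = "gme-assessment@example.ucsf.edu"  # placeholder sender
    msg["To"] = trainee_email
    msg.set_content(
        f"The Clinical Competency Committee has entrusted you for "
        f"'{epa_name}'. See your landing page for details."
    )
    with smtplib.SMTP("smtp.example.ucsf.edu") as server:  # placeholder host
        server.send_message(msg)
```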
Team Members:
Jeff Kohlwes MD, MPH. School of Medicine. Department of Internal Medicine. Clinical Professor of Medicine. Director, PRIME Residency Program, Associate Program Director for Assessment and Feedback Internal Medicine Residency Program.
Patricia Cornett MD. School of Medicine. Department of Internal Medicine. Clinical Professor of Medicine. Associate Chair for Education, Department of Internal Medicine.
Vanessa Thompson MD. School of Medicine. Department of Internal Medicine. Assistant Clinical Professor of Medicine. Assistant Medical Director, General Medicine Clinic, San Francisco General Hospital; Director of Academic Development, Internal Medicine Residency.
Sumant Ranji MD. School of Medicine. Department of Internal Medicine. Associate Program Director for Quality and Safety. Internal Medicine Residency Program.
Irina Kryzhanovskaya MD. School of Medicine. Department of Internal Medicine. Senior Resident in Internal Medicine, Chief Resident-Elect.
Ray Chen MD. School of Medicine. Department of Internal Medicine. Junior Resident in Internal Medicine
Comments
This idea is much needed on a GME level. I could use it as a program director for a fellowship program at UCSF.
Thanks Eric! Our goal is to make this available to all IM trainees- residents and fellows of all subspecialties (as well as key educational faculty). Jeff
Empowering residents, and supporting them with a user-friendly platform, to solicit constructive feedback would be a major step forward for the residency program. The literature on feedback suggests that when individuals actively seek feedback, as opposed to being given feedback without proper readiness to receive it, they are much more likely to find that feedback helpful and to incorporate it into their practice. Giving residents more ownership over soliciting feedback also has the potential to help us transform our culture into one in which learners and faculty alike are open to and constantly seeking opportunities for self-reflection and self-improvement.
Denise- thanks for your comment- I agree that if we can engage the trainees, we will improve not only in-the-moment feedback but, over time, change the culture around feedback into an ongoing give and take that facilitates learning among all our trainees! Jeff
This system would be a tremendous asset for the UC training programs. Despite numerous attempts at improvement and general acknowledgment of the great importance of feedback, current evaluations are often delayed, incomplete, and vague. Allowing learners to initiate the evaluation process will enable them to obtain feedback that is more relevant to their needs. I also believe that the personal and specific nature of this feedback process will better engage faculty, yielding more timely completion and more detailed comments. This is a win-win for trainees and educators!
Beth- thank you so much for your thoughtful comments- I could not agree more- we have to fix our ability to give real-time feedback to our trainees. They want it, and it will improve the care our patients get by enabling the trainees to grow. Part of our plan focuses on the second part of your comment- we need to design more faculty education to improve feedback skills so faculty can make valuable comments when the trainees ask. Thanks again! Jeff
I love the idea (and am not impressed with E*value). After a trial in the DOM, please work to extend this to other services (like ObGyn).
Thanks for your support Leslee- actually that is hopefully the plan. The goal is to create something that would be valuable for all GME programs as the challenges are so similar for all our residency (and fellowship) programs. We hope to create an interactive function that will allow for residents to post about clinical encounters and then get direct feedback from faculty members (or other involved staff). This could be especially valuable for procedural specialties as residents could post additions to their procedure logs and then get very rapid electronic feedback from their advisors! Thanks for reading our proposal!
This will be a valuable trainee tool. What will be important is making the user interface easy, keeping the number of steps minimal, and providing quick access to the platform for both trainees and teachers.
Will teachers also be evaluated on their performance? Although that may not be part of ACGME requirements, such feedback has value for faculty development and promotions. How will this platform interface with E*value, and if it does not, what is the incentive or mandate for it to be used?
I am engaged with evaluating the training and trainee experience for proceduralists, so I will follow the development of your platform with much interest. Best of luck!
Ma- Thanks for your comments! This will interface with E*value, with the goal of being a one-step portal to all assessment modalities for the residents and fellows. Initially the plan is to give each trainee an entry page and then expand it to the faculty who frequently assess them- our hope is that this will streamline the time and effort of assessment while making it much more specific and useful for the trainees.